Political Deepfakes Are Legal In Louisiana

by Wesley Muller, Louisiana Illuminator

June 30, 2024

Gov. Jeff Landry has vetoed a bill that would have made it illegal to deceive voters with false impersonations or depictions of a political candidate through audio or video manipulation techniques called “deepfakes.”

House Bill 154, sponsored by Rep. Mandie Landry, D-New Orleans, was one among 31 bills the governor vetoed from the 2024 regular session. The two Landrys are not related.

In his letter explaining the reasons for his veto, the governor said he believes the legislation could have infringed on the free speech rights of artificial intelligence (AI) companies. 

“While I applaud the efforts to prevent false political attacks, I believe this bill creates serious First Amendment concerns as it relates to emerging technologies,” the governor wrote. “The law is far from settled on this issue, and I believe more information is needed before such regulations are enshrined into law.”

The governor pointed out that the Legislature passed a resolution requesting the Joint Legislative Committee on Technology and Cybersecurity to study and make recommendations for the use and regulation of AI. A similar panel at the federal level is also studying the issue to “explore how Congress can ensure America continues to lead the world in AI innovation while considering guardrails that may be appropriate to safeguard the nation against current and emerging threats,” Gov. Landry added. 

Rapid advances in artificial intelligence have allowed virtually anyone with a few basic computer skills to create videos that depict someone doing or saying something they never actually did or said. While fake but realistic computer-generated videos have existed for decades, creating them generally required teams of people with advanced skills, special technology and expensive equipment often found only in professional film studios. AI has removed most of those barriers.

Although discussion and debate on Mandie Landry’s legislation centered around deepfake videos, the bill’s text never mentioned the terms “deepfake” or “artificial intelligence.” 

Instead, the text was very similar to what is already written in the state’s existing election laws. Current law says: “No person shall cause to be distributed, or transmitted, any oral, visual, digital, or written material containing any statement which he knows or should be reasonably expected to know makes a false statement about a candidate for election in a primary or general election or about a proposition to be submitted to the voters.”

The bill would have added a sentence to that section saying almost the same thing but with words that more specifically apply to a deepfake or other false images: “No person shall cause to be distributed or transmitted any oral, visual, digital, or written material containing any image, audio, or video of a known candidate or of a person who is known to be affiliated with the candidate which he knows or should be reasonably expected to know has been created or intentionally manipulated to create a realistic but false image, audio, or video with the intent to deceive a voter or injure the reputation of a known candidate in an election.”

Political operatives have already used deepfakes, as well as “cheapfakes,” against President Joe Biden ahead of the November election. A cheapfake uses simpler editing tricks, such as clipping footage to hide portions of it. They are often easily debunked, but not before millions of people have seen them on social media.

The Republican National Committee has been distributing a stream of Biden cheapfakes, and other Trump supporters have spread AI-generated images of Black people supporting the former president.

Gov. Landry, a Republican, also vetoed a bill that would have required anyone making a deepfake video to label it as such with a watermark or some similar flag or graphic.

Reached by phone Friday, Mandie Landry said she disagrees with the governor’s belief that her proposal would have violated the First Amendment. 

“This is my view, and I’m not aware of a court or at least a high court that has weighed in: I have a First Amendment right to say X-Y-Z about you; I don’t have a First Amendment right to steal your likeness/voice to do something,” she said.   

Existing law, which makes it illegal to make false statements about a political candidate, appears to be much more constitutionally precarious than her proposal, the representative said.

“As was discussed on the floor, this is not someone making a statement about me,” she said of deepfakes. “It’s using me to make a statement about me. But I do understand that with new technology that there is hesitation. I hope we don’t hesitate too long though because this tech is improving very fast.”

Louisiana Illuminator is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Louisiana Illuminator maintains editorial independence. Contact Editor Greg LaRose for questions: info@lailluminator.com. Follow Louisiana Illuminator on Facebook and X.
