Assemblyman Marc Berman. (Photo: Kevin Sanders for California Globe)

‘Deepfake’ Audio and Visual Political Ads Face Ban With Looming Senate Vote

Opponents warn AB 730 fails to make any provision for speech protected by the First Amendment.

By Evan Symon, September 10, 2019 1:34 pm

Assembly Bill 730, which would ban political ads altered using ‘deepfake’ technology, will soon be decided in the statehouse.

‘Deepfake’ technology uses facial scans and new audio to create a fake video from footage of a real one. When a real person has spoken on screen, the technology can digitally manipulate their mouth and facial features over the original footage, and new audio from an actor is then dubbed in. The end result is a fake video that looks and sounds real. As PBS reported earlier this year, some videos are so convincing that it is hard to tell what is real and what isn’t.

Assemblyman Marc Berman (D – Palo Alto) had watched a BuzzFeed video in which footage of former President Barack Obama was digitally altered to make him appear to say a wide variety of things voiced by actor Jordan Peele. The realism of the video concerned Assemblyman Berman, who chairs the Assembly’s elections committee and has a long history of working to curtail fake news in the Assembly. Most recently, Assemblyman Berman introduced and helped pass AB 3075 in 2018, which created the Office of Elections Cybersecurity in California. The office, among other duties, helps impede fake news and misleading information regarding elections and candidates in California.

With other manipulated videos making the rounds across the internet, involving everyone from President Donald Trump to actor Nicolas Cage to Speaker of the House Nancy Pelosi appearing drunk, Assemblyman Berman introduced AB 730 as a way to counteract another false form of information and news.

“I immediately realized, ‘Wow, this is a technology that plays right into the hands of people who are trying to influence our elections like we saw in 2016,’” said Assemblyman Berman.

In a recent interview, Berman added to his statement, saying, “As more and more bad actors try to influence our elections with misinformation campaigns that sow confusion and doubt throughout the electorate, I think we can all agree with the premise that voters have a right to know when video, audio and images that they are being shown have been manipulated.”

The bill enjoys widespread bipartisan support. In an Assembly vote in May, AB 730 passed 76-0.

However, even with a large amount of support in Sacramento, AB 730 does have limits. The ban on ‘deepfake’ videos would only be in effect within 60 days of an election, and it only covers political candidates. It also allows a fake video or audio ad to run as long as a disclosure is shown on the video saying that it has been manipulated.

Despite these limits, there are still many opposing the bill, most notably the ACLU. The chief concern is that the bill violates the First Amendment’s right to free speech.

“As currently written, AB 730 fails to make any provision for speech protected by the First Amendment,” stated the California News Publishers Association in a press release. “Though the bill creates limited exceptions from liability where a disclosure is provided identifying the image or recording as being manipulated, those exceptions are almost certainly insufficient to ensure that constitutionally protected speech is not punished.”

Only two other states have attempted bans on deepfake videos before California. Virginia successfully banned ‘deepfaked’ pornographic videos earlier this year, while a New York bill addressing ‘deepfaked’ video in film died in committee in Albany in 2018. If AB 730 succeeds, California would be the first state in the country to ban such videos in political advertising.
