Prior to Paul Vallas securing his spot in the Chicago mayoral run-off, his campaign became the victim of a growing form of digital deception known as a “deepfake.”
A video, now removed from Twitter, depicted a voice resembling Vallas’ making false statements regarding police shootings. Current state law is inadequate in providing legal recourse to victims of similar scams, according to proponents of new legislation filed in the Illinois General Assembly.
Senate Bill 1392 and House Bill 2123 would allow the campaign and other victims of digital forgeries to seek legal action against perpetrators who create and share inauthentic media. Both bills advanced out of committee this week as part of a frenzied effort by lawmakers to move proposed legislation out of committees before a Friday deadline.
Deepfakes are commonly made through the use of artificial intelligence, which Northwestern University computer science professor V.S. Subrahmanian said presents positive and negative possibilities.
As he noted in a Monday interview, however, the legislation is not limited to deepfakes but applies to any manipulated video or audio material. That could include such things as photoshopping a person out of or into an image in a way that is harmful to that individual’s reputation.
“It’s not ruling out deepfakes, all legitimate, for fun purposes,” Subrahmanian said. “So, it’s very hard to disagree with this bill.”
The Senate bill from state Sen. Mary Edly-Allen, D-Libertyville, went before the Senate judiciary committee on Tuesday afternoon where she was joined by Northwestern University Pritzker School of Law professor Mark Kugler and virtually by University of Miami law professor Mary Anne Franks.
In his testimony, Kugler noted several other states had already passed similar bills as a constitutional means to prevent the spread of misinformation. Among them is California, where Gov. Gavin Newsom signed a 2019 bill into law banning the use of political deepfakes within 60 days of an election.
Since then, Kugler said the technology to make deepfakes has only advanced in terms of ease and realism and its victims – especially women edited into pornographic scenes – have increased. Political implications, as seen in the recent Vallas instance, are also at play.
“Looking ahead to 2024 and thinking about election interference, I see a possibility of seeing fake news on steroids,” he said.
Following more than 50 minutes of debate, SB 1392 advanced out of committee with the understanding that amendments will be necessary.
The need for change had bipartisan support from Democratic Senate President Don Harmon, Republican state Sens. Chapin Rose and Jil Tracy, and the American Civil Liberties Union of Illinois. Even Rose, R-Mahomet, chortled at the unusual pairing.
Legislators agreed Edly-Allen’s bill was good in intent but said its broad language could present First Amendment challenges. Tracy feared a “mountain of litigation” if the bill became law as it currently stands.
The Quincy senator asked if the bill had any specific exemptions for news media or political campaigns, to which Kugler said no. Tracy said this is where the lawsuits would pile up.
“I can’t believe we can’t come up with better language to fine-tune this so that we don’t see litigation over public officials,” Tracy said. Minority Senate Leader John Curran, R-Downers Grove, agreed with the assessment.
Parody, an altered form of original media created with the purpose of entertaining, has traditionally been recognized as protected speech in cases involving defamation. Several committee members on both sides of the aisle acknowledged parody was not explicitly defined in the bill, thus opening the door for lawsuits.
Franks, who assisted in writing the legislation, said the bill is built in a way to ensure parody is still protected. The key word in the five-page bill is “realistic,” added Kugler, responding to a question posed by Harmon regarding the common use of a marionette in political ads.
“Strings controlling a puppet is not realistic, so I think that is where we can deny a cause of action,” Kugler said.
The focus on political considerations was disappointing to Edly-Allen, who said political ads that use altered media should not be protected under her bill.
“If we don’t have our own image protected, what do we have?” she said in a Wednesday interview.
The ACLU was one of two groups filing opposition witness slips to SB 1392 and HB 2123. Angela Inzano, policy and advocacy strategist for the advocacy group, hoped an agreement would be reached on the amendments promised by Edly-Allen.
“We did offer alternative language that we believe would address these problems,” she said, signaling that they could change their ‘no’ position if the language is updated. “I understand that proposal was not accepted, but we are committed to continue those conversations.”
Finding the perpetrator
SB 1392 and HB 2123 set the threshold for a victim to sue, requiring that the individual suffered “physical, emotional, reputational, or economic harm” as a result of the digital forgery.
Finding who created that digital forgery, especially when done outside of the state or country, and the overall enforcement of the bill remained a question for Subrahmanian. Edly-Allen said federal conversations need to take place when it comes to enforcement.
Right now, she said, getting lawmakers to acknowledge the issue and to understand to whom it applies is still a work in progress.
“If you are doing real things, this doesn’t apply to you,” Edly-Allen said. “This only applies to bad actors.”
Research from the University of Sydney found participants could verbally identify whether an image was real or fake only 37% of the time, while measurements of their brain activity detected the difference 54% of the time.
The advancing quality of deepfakes will only make detection more challenging, the professor said, potentially making it difficult for the prosecution to prove the media has truly been altered and is not an original.
“A five or 10 percent error rate is not good enough for prosecution,” Subrahmanian said. “I mean, that sounds like reasonable doubt.”
The challenge in spotting what is actually a deepfake prompts a need for education in schools and for law enforcement, Edly-Allen said.
Also on Wednesday, HB 2123 from state Rep. Jennifer Gong-Gershowitz, D-Glenview, went before the House judiciary committee. The debate was much shorter and again featured Kugler, Franks and Inzano before passing in a unanimous vote.
Contact Patrick Keck: 312-549-9340, firstname.lastname@example.org, twitter.com/@pkeckreporter.