Bipartisan Senators Introduce Bill to Combat Pornographic ‘Deepfakes’

By Stacy Robinson
June 18, 2024
Sen. Ted Cruz (R-Texas) introduced a new bill to protect and empower victims of deepfake video technology. The bill also aims to assist victims of non-consensual intimate image abuse, commonly known as "revenge pornography."

WASHINGTON—Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.) on June 18 introduced legislation that would require websites to take down artificially generated pornographic images of real people, who are often minors.

“Now, almost instantaneously perpetrators can use an app on their phone to create fake, explicit images depicting real people, commonly referred to as ‘deepfakes’,” Mr. Cruz said at a press conference. “Disturbingly, this is increasingly affecting, and targeted at minors, and particularly young girls.”

The legislation, dubbed the Take It Down Act, would require tech companies to remove artificial intelligence (AI)-generated sexual images from their platforms and to establish procedures for taking down such content within 48 hours.

The bill has bipartisan support and is co-sponsored by 15 lawmakers, including Sen. Cynthia Lummis (R-Wyo.), who also spoke at the press conference. It has also been endorsed by the National Center on Sexual Exploitation.

Yiota Souras, chief legal officer of the National Center for Missing and Exploited Children, said at the press conference that the legislation is “crucially needed.”

Most states have laws against “revenge porn,” or posting nude images of another person online without their consent. In 2022, Congress made it possible to file a civil suit against perpetrators. The new bill would add criminal penalties; the senators’ statement said civil suits can be “time-consuming, expensive, and may force victims to relive trauma.”

Mr. Cruz said a person posting the content could face up to two years in prison, or three years if it depicted a minor. To preserve the chain of evidence, the bill does not require the destruction of the image or video, only its removal.

The senator said he would press Commerce Committee Chairwoman Sen. Maria Cantwell (D-Wash.) to schedule a markup of the bill.

“Young girls are waking up to a text message from their friends, saying there’s an explicit photo of them circulating on social media or an explicit video, and yet the photos aren’t real,” Mr. Cruz said. “They were created using AI, typically taking the face of a real person and using AI technology to graft it seamlessly and imperceptibly to a different image, a sexually explicit image.”

One victim, 14-year-old Elliston Berry, spoke of her fear of entering high school after fake, pornographic images of her were shared on the social media platform Snapchat. Mr. Cruz said her repeated efforts to have the photos removed met a bureaucratic wall until his office reached out to the company on her behalf.

Mr. Cruz said he was able to have the images of Ms. Berry removed within 24 hours of his staff contacting Snapchat. “Now, if you don’t happen to be in a situation where a sitting member of Congress intervenes on your behalf … you get a closed door,” he said.

Mr. Cruz contrasted Ms. Berry’s experience with that of pop star Taylor Swift, who in January was able to have deepfake images of herself removed from X. The platform even blocked searches for Ms. Swift’s name for two days to prevent the images from being seen.

“They behave as if they are not accountable,” Mr. Cruz said of social media companies. He said that if a user had instead uploaded a copyrighted video or song, the content would be removed almost immediately.

“This is a horrific experience for these girls and their families; it can leave lasting scars,” Ms. Lummis said. “We must address this.”

The Epoch Times has reached out to Ms. Cantwell for comment.

From The Epoch Times