AI Deepfake Porn Bill Would Require Platforms to Remove Images

A new bill in Congress aims to make social media platforms accountable for AI-generated deepfake porn images displayed on their sites.

The legislation would require companies to police and remove deepfake pornographic content published on their platforms, and would criminalize the act of publishing or threatening to publish deepfake porn.

Sen. Ted Cruz, R-Texas, is the primary sponsor of the bill, which has been dubbed the Take It Down Act. A bipartisan group of senators supports the measure, and they have teamed up with victims of deepfake porn, including high school students.

The Take It Down Act would require social media companies to develop a process for removing offending images within 48 hours of receiving a valid request from a victim. These sites would also be required to make reasonable efforts to remove any other copies of the images, including those shared in private groups.

The Federal Trade Commission would be tasked with enforcing the new rules.

“By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime,” Cruz tells CNBC.

Lawmakers are scrambling to keep up with rapidly evolving AI technology that is suddenly capable of making anyone a victim of deepfake porn.

The U.K. recently made the creation of sexually explicit deepfake images a criminal offense, with anyone sharing these images facing the possibility of jail time. The EU is also criminalizing sexually explicit deepfakes.

According to a report by Home Security Heroes, deepfake porn has increased 464 percent year-over-year. While there is widespread agreement on the need to address the problem, there is little consensus on how to go about it.

In fact, there are now two competing bills in Congress: Sen. Dick Durbin, D-Ill., has introduced a bill that would criminalize the spread of nonconsensual deepfake porn, a measure drafted in direct response to the sexually explicit AI-generated photos of Taylor Swift that went viral last week.
