More than 1 in 10 students say they know of peers who created deepfake nudes: report

Jon Healey | Los Angeles Times (TNS)

When news broke in February that AI-generated nude pictures of students were circulating at a Beverly Hills middle school, many district officials and parents were horrified.

But others said no one should have been blindsided by the spread of AI-powered “undressing” programs. “The only thing shocking about this story,” one Carlsbad parent said his 14-year-old told him, “is that people are shocked.”

Now, a newly released report by Thorn, a nonprofit that builds technology to fight the spread of child sexual abuse material, shows how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap “undressing” apps and other easy-to-use, AI-powered programs to create deepfake nudes.

But the report also shows that other forms of abuse involving digital imagery remain bigger problems for school-age kids.

To measure middle- and high-school students’ experiences with, and attitudes toward, sexual material online, Thorn surveyed 1,040 9- to 17-year-olds across the country from Nov. 3 to Dec. 1, 2023. Well over half of the respondents were Black, Latino, Asian or Native American; Thorn said the resulting data were weighted to make the sample representative of U.S. school-age children.

According to Thorn, 11% of the students surveyed said they knew of friends or classmates who had used artificial intelligence to generate nudes of other students; an additional 10% declined to say. Some 80% said they did not know anyone who’d done that.

In other words, at least 1 in 10 students, and as many as 1 in 5, knew of classmates who had used AI to create deepfake nudes of people without their consent.

Stefan Turkheimer, vice president of public policy for the Rape, Abuse & Incest National Network, the country’s largest anti-sexual-violence organization, said that Thorn’s results are consistent with the anecdotal evidence from RAINN’s online hotline. A lot more children have been reaching out to the hotline about being victims of deepfake nudes, as well as the nonconsensual sharing of real images, he said.

Compared with a year ago or even six months ago, he said, “the numbers are certainly up, and up significantly.”

Technology is amplifying both kinds of abuse, Turkheimer said. Not only is picture quality improving, he said, but “video distribution has really expanded.”
