Instagram Recommends Sexual Videos to Users as Young as 13, Study Finds

In a new study conducted by The Wall Street Journal and an academic researcher, Instagram’s algorithm was found to have repeatedly recommended sexual videos to young teenage users — some as young as 13 years old.

The study, which The Wall Street Journal says took place over several months ending in June, found that accounts identified as 13-year-olds were shown such videos at a high rate with little to no prompting.

The Instagram accounts opened the Reels feature and were immediately shown “traditional comedy, cars or stunts, as well as footage of people sustaining injuries.” The freshly minted accounts scrolled past these Reels and soon received recommendations for videos that were sexual in nature. The test accounts did not like or save these sexual videos but, unlike with the initially suggested content, watched them in their entirety.

This alone seemed to be enough to increase the number of recommendations for sexual content, which then escalated to even more risqué videos.

“After a few short sessions, Instagram largely stopped recommending the comedy and stunt videos and fed the test accounts a steady stream of videos in which women pantomimed sex acts, graphically described their anatomy or caressed themselves to music with provocative lyrics,” The Wall Street Journal reported. “In one clip that Instagram recommended to a test account identified as 13 years old, an adult performer promised to send a picture of her ‘chest bags’ via direct message to anyone who commented on her video. Another flashed her genitalia at the camera.”

This, the newspaper adds, lines up with Meta’s own internal reports and analyses. The Wall Street Journal pointed to an analogous test conducted by Meta staff in 2021 that found similar results. Another analysis from the following year revealed that “Instagram shows more pornography, gore, and hate speech to young users than to adults,” according to the publication.

These findings, likely along with political and public pressure, led to Meta’s January announcement of a stricter algorithm for minor users. However, The Wall Street Journal’s test, which includes results from the months following that pronouncement, calls the effectiveness of those changes into question.

Meta, however, disputed The Wall Street Journal’s claims, saying the publication’s test is not reflective of the actual experience on Instagram.

“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” spokesperson Andy Stone told The Wall Street Journal.

Meanwhile, the publication also looked at competitors TikTok and Snapchat, reporting less explicit content on both platforms in similar tests.

“All three platforms also say that there are differences in what content will be recommended to teens,” Laura Edelson, a computer science professor at Northeastern University who worked on the test with The Wall Street Journal, said, according to the publication. “But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.”

“Despite their systems’ similar mechanics, neither TikTok nor Snapchat recommended the sex-heavy video feeds to freshly created teen accounts that Meta did, tests by the Journal and Edelson found,” The Wall Street Journal reported. “On TikTok, new test accounts with adult ages that watched racy videos to completion began receiving more of that content. But new teen test accounts that behaved identically virtually never saw such material—even when a test minor account actively searched for, followed and liked videos of adult sex-content creators.”

A spokesperson for the ByteDance-owned social media platform attributed this to “stricter content standards for underage users and a higher tolerance for false positives when restricting recommendations.”

PetaPixel reached out to Meta for comment but did not receive a response ahead of publication.


Image credits: Header photo licensed via Depositphotos.
