According to an investigation published by The Wall Street Journal on Monday, the algorithm behind Instagram’s Reels service recommended “disturbing doses of suggestive content” involving children to test accounts created by the news outlet.
The news outlet conducted this investigation to determine what kind of content the platform would suggest to accounts that mainly followed young gymnasts, cheerleaders, and other teenage and preteen influencers.
The WSJ decided to conduct this test after finding that many followers of such accounts were adult men, some of whom also expressed interest in sexual content involving both children and adults.
During the tests, the news outlet says, Instagram’s algorithm served a large amount of suggestive content, including “indecent videos of children as well as explicitly sexual videos with adults.” Between these videos, the platform also showed advertising for some of the biggest brands in the USA, according to the WSJ.
As an example, the news outlet cites a video stream in which Instagram showed an advertisement for a dating app, a video of someone caressing a life-size latex doll, and a video of a young girl lifting her shirt to expose her stomach. Another stream contained a commercial advertisement, followed by a video of a man with his arm around a ten-year-old girl on a bed.
According to WSJ, the Canadian Centre for Child Protection conducted similar tests on Instagram and reported similar results.
Meta Platforms, owner of Instagram and Facebook, responded to the WSJ investigation by stating that the news outlet’s tests created an “artificial experience” that is not representative of what the vast majority of its billions of users see.
Still, a number of companies have chosen to pull their advertising from Meta’s platforms. Justine Sacco, a spokesperson for Match, stated, “We have no interest in paying Meta to market our brand to predators or place our advertisements near this content.”
Last month, Instagram was also sued by the attorneys general of 41 U.S. states, who allege that the platform and its parent company contribute to the ongoing mental health crisis among young people by deliberately enticing them into compulsive use of social media.