Instagram ‘connects pedophiles’ – WSJ

Child porn peddlers use codes like “cheese pizza” and exploit the features of Meta’s platform, researchers say

Buyers and sellers of underage-sex content have developed a thriving network thanks to the discovery algorithms of Instagram, the photo-sharing platform owned by Silicon Valley giant Meta, the Wall Street Journal reported on Wednesday.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the Journal reported, based on investigations it conducted jointly with Stanford University and the University of Massachusetts Amherst.

Sexualized accounts on the platform are “brazen” about their interests, but don’t post illegal materials openly, choosing to offer “menus” of content instead, according to the researchers. They also use certain emojis as code: a map stands for “minor-attracted person” (MAP) – a euphemism for pedophile – and cheese pizza is shorthand for “child pornography,” with which it shares initials, said Brian Levine, director of the UMass Rescue Lab at Amherst. Many users describe themselves as “lovers of the little things in life,” he said.

Even a passing contact with a pedophile account can trigger Instagram’s algorithm to begin recommending it to other users. One example cited by the Journal involves Sarah Adams, a Canadian mother who campaigns against child exploitation. In February, one of her followers messaged her about an account promoting “incest toddlers,” and Adams viewed its profile briefly in order to report it as inappropriate. Over the next few days, Adams said, she heard from horrified parents who had visited her profile and were then recommended the pedophile account.

“Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts,” David Thiel, chief technologist at the Stanford Internet Observatory who previously worked at Meta, told the Journal.

Researchers said that the platform also allows searches for terms that Meta acknowledges are associated with illegal material. Users who run them get a pop-up notifying them that the content may feature “child sexual abuse” and offering two options: “Get resources” and “See results anyway.”

The National Center for Missing & Exploited Children, a nonprofit that works with US law enforcement, received 31.9 million reports of child pornography in 2022. Meta-owned platforms accounted for 85% of the reports, with some five million coming from Instagram alone.

Stanford researchers found some child-sex accounts on Twitter as well, but fewer than a third as many as they found on Instagram, which has a far larger user base, estimated at 1.3 billion. Twitter also did not recommend such accounts to the same degree, and took them down far faster.

“Child exploitation is a horrific crime,” a Meta spokesperson told the Journal when the company was contacted about the findings. Mark Zuckerberg’s company acknowledged it has had “problems within its enforcement operations” and said it had set up an internal task force to address the situation.

Meta actively tries to remove child pornography, banning 490,000 accounts in January alone and taking down 27 pedophile networks over the past two years. The company also said it had cracked down on sexualized hashtags and adjusted its recommendation system since receiving the Journal’s inquiry.
