New report suggests Instagram’s algorithms are promoting pedophile networks

According to a joint investigation conducted by The Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst, Instagram’s algorithms are becoming a center of activity for pedophile networks. The report suggests these networks are commissioning and selling child sexual abuse content on Instagram.

The report is, without question, a jarring wake-up call for parents and caregivers who share images of their children on Instagram or allow their children to access Instagram and other social media apps on their own.

Instagram’s recommendation systems connect pedophiles and guide them to content sellers, per the report. Researchers found accounts blatantly advertising child sexual abuse images for purchase or commission, using explicit hashtags catering to pedophiles. These accounts offer “menus” of content to users, including videos and imagery of self-harm and bestiality.

“At the right price, children are available for in-person ‘meet ups,’” the WSJ reported.

When the WSJ and academic researchers set up a test account to view content shared by these networks, the Instagram algorithm immediately recommended similar accounts to follow. Per the WSJ: “Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.”

It’s horrifying, to say the least.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities,” the WSJ reports. “Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

The investigation faults Meta’s dependence on automated detection tools as the reason this activity can happen at all.

Meta did take down 490,000 accounts that violated its child safety policies in January alone, and it has removed 27 pedophile networks over the last two years. The company, which also owns Facebook and WhatsApp, said it has also blocked thousands of hashtags associated with the sexualization of children and restricted those terms from user searches, per The Verge.

Unfortunately, the recent investigation found that Instagram’s moderation practices frequently ignored or rejected user reports of child sexual abuse material.

In response to the new findings, Meta says it’s creating an internal task force to address the issues specifically raised by the investigation.

“Child exploitation is a horrific crime,” the company said. “We’re continuously investigating ways to actively defend against this behavior.”

This investigation is just the latest reason for parents and caregivers to exercise caution when it comes to children and social media. Just last month, the U.S. Surgeon General issued an advisory about the “profound risks” social media poses to children.

“For too many children, social media use is compromising their sleep and valuable in-person time with family and friends,” Dr. Vivek Murthy, the U.S. Surgeon General, told Motherly in May. “We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis—one that we must urgently address.”

The American Psychological Association has an updated set of guidelines to help families supervise and limit kids’ access to social media.

With such a lack of oversight on Instagram and other social media networks, the burden of protecting our kids from online harm mostly falls to parents and caregivers. It’s yet another reminder that when it comes to technology, it’s our responsibility to stay current and knowledgeable while monitoring our kids’ usage.

You can read the WSJ report in its entirety here.