Instagram’s Algorithm Promotes and Connects ‘Vast’ Network of Pedophiles

According to an investigation published today by The Wall Street Journal in collaboration with researchers at Stanford University and the University of Massachusetts Amherst, Instagram’s recommendation algorithms have enabled a “vast” network of pedophiles seeking child sex abuse content and activity.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” The Wall Street Journal reports.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

‘Large-Scale Communities Promoting Criminal Sex Abuse’
According to the report, academics from Stanford’s Internet Observatory and the UMass Rescue Lab were able to quickly find “large-scale communities promoting criminal sex abuse.”

Researchers were able to find illegal underage sexual content on Instagram using explicit hashtags like #pedowhore, #preteensex, and #pedobait.

The Stanford Internet Observatory used hashtags blatantly associated with underage sex to find 405 sellers of what researchers labeled “self-generated” child-sex material, meaning accounts purportedly run by children themselves, some claiming to be as young as 12.

After creating test accounts and viewing a single one of these profiles, researchers were immediately inundated with “suggested for you” recommendations of possible Child Sexual Abuse Material (CSAM) sellers and buyers on Instagram, along with accounts linking to off-platform content sites.

The seller accounts offered “menus” of content for users to buy or commission, including videos and imagery of self-harm and bestiality.

Following just a handful of these recommendations in the Instagram app caused the test accounts to be overrun with content that sexualizes children and with suggestions of more accounts sharing this illegal material.

Meta Is ‘Investigating’ the Issues
Meta says it has set up an internal task force to investigate the issues raised in The Wall Street Journal’s report.

“Child exploitation is a horrific crime,” Meta tells the publication. “We’re continuously investigating ways to actively defend against this behavior.”

In the past two years, Meta says it has taken down 27 pedophile networks and is planning more removals.

Since the Journal’s report, the company says it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse.

Meta also says it is working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content.

In April, PetaPixel reported on a separate investigation that revealed Meta was struggling to stop child sex trafficking on Facebook and Instagram.
