Why is Meta Under Fire For Facilitating Child Abuse and Sex Trafficking?


In numerous investigations and legal filings, Meta, the parent company of Instagram and Facebook, faces harsh criticism for knowing that minors on its platforms are unsafe and targeted for sex trafficking and abuse, yet refusing to take concrete action.

March 16, 2024

Meta, the parent company of Facebook, Instagram and Messenger, is facing renewed criticism for its role in facilitating and failing to prevent child sexual abuse on its platforms. The company has repeatedly come under fire for failing to act to protect children across its social media networks.

Facebook and Messenger, two of the company’s most popular platforms, are used by child sex offenders as a marketplace for exchanging information, content and payment. Meta Pay, formerly known as Facebook Pay, functions as a peer-to-peer payments service that allows users to transfer money to their social network connections. Meta Pay is facing criticism for failing to detect or report payments associated with child sexual exploitation.

Two former Meta content moderators told the Guardian that they saw suspicious transactions take place over Meta Pay that they believed were related to child sexual exploitation and trafficking, but they were unable to contact Meta Pay’s compliance team to report these payments.

The Guardian has also reported that Meta Pay’s moderators are not trained to detect money flows that could be related to child sex trafficking or abuse. Compliance analysts are not trained “in the language, codewords and slang that traffickers often use.”

What’s so bad about parent-run child influencer accounts?

Two new investigations from the Wall Street Journal and New York Times have also brought attention to the exploitation of children by parent-run influencer accounts on social media platforms like Facebook and Instagram, facilitated by Meta's content monetization tools and subscription models.

The Wall Street Journal reported that Meta safety staff alerted the company to adult account owners profiting from exploitative content featuring their own children, sold through Instagram's paid subscription tools. This content often featured young children in suggestive attire, with promises of videos of them stretching or dancing, and parent-run accounts encouraged inappropriate interactions with paying followers behind the service’s paywalled ‘subscriptions’ feature.

While safety staff recommended banning accounts dedicated to child models or implementing stricter monitoring, Meta opted for an automated system to detect and ban suspected predators. However, employees noted the system's unreliability and susceptibility to evasion.

The New York Times report, titled “A Marketplace of Girl Influencers Managed by Moms and Stalked by Men,” revealed the lucrative business of mom-run Instagram accounts, confirming the sale of exclusive photos and chats with children, with suggestive posts garnering more likes. Male subscribers were found to engage in concerning behavior, including bullying and blackmail.

Instagram’s subscription feature allows users to offer exclusive content that is paywalled from non-paying followers. With child influencer accounts, the exclusive subscription only content features young girls, some only 7 years old, in lace bikinis and leotards. The buyers of this content are overwhelmingly adult men, who are not exactly shy about expressing their overt sexual interest in the children. In their avarice, many parents continue to encourage this behavior as long as the content keeps bringing in money, with some accounts having reportedly earned hundreds of thousands of dollars.

Meta's introduction of end-to-end encryption further complicates efforts to prevent illicit transactions, drawing criticism from child safety experts and law enforcement. Encryption hinders detection of behaviors associated with child exploitation, raising concerns about facilitating criminal activities.

Despite Meta's spokesperson Andy Stone's assurances of preventing suspicious behavior and limiting access to subscription content, the platform's moderation policies have been ineffective. Banned accounts often return, explicit searches and usernames evade detection, and Meta content spreads to offsite forums for predators.

Has there been legal action against Meta?

In December 2023, Raúl Torrez, New Mexico's attorney general, filed lawsuits against Meta, alleging its platforms enable child trafficking and the distribution of CSAM (child sexual abuse material). Documents from the legal filings reveal an alarming scale of online sexual harassment faced by children daily. Torrez aims to hold Meta accountable and advocate for regulatory changes prioritizing user safety, particularly for children. The suit alleges that “Facebook and Instagram are breeding grounds for predators targeting children for human trafficking, grooming and solicitation.”

According to the complaint in New Mexico’s filing, Meta “knew that adults soliciting minors was a problem on the platform, and was willing to treat it as an urgent problem when it had to.”

"Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children," Torrez said. He added, "Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta's executives continue to prioritize engagement and ad revenue over the safety of the most vulnerable members of our society."

“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation. While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive,” he added.

Despite Meta's assertions that it takes aggressive measures against child exploitation, the company continues to be criticized for not doing enough. What is clear is that the company cannot be trusted to prioritize the safety of children and the prevention of child sexual abuse over its own profit and ad revenue.
