
Children targeted with sexually explicit photos on Facebook and Instagram, lawsuit claims

Facebook application is seen on an Apple iPhone screen. Worawee Meepian / Alamy Stock Photo


An Apple executive in 2020 alerted Meta that their 12-year-old child had been “solicited” on Facebook, according to a newly unredacted version of the complaint in a lawsuit against Meta brought by New Mexico’s Attorney General in December.

The lawsuit accuses Meta of creating a “breeding ground” for child predators.

The 2020 incident — which is detailed in the newly unredacted complaint citing internal documents — is part of a years-long history of people inside and outside of the social media giant run by Mark Zuckerberg raising concerns about young users’ exposure to sexual and inappropriate content, the attorney general alleges.


“This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” the internal Meta document said, according to the complaint. The document went on to ask whether there was a timeline for when “we’ll stop adults from messaging minors on IG Direct.”

Meta’s apps still appear on the Apple App Store, and it’s not clear whether Apple ever raised the issue with Meta directly. Apple did not immediately respond to a request for comment.

The anecdote is one of several in the newly unredacted complaint, which alleges that Meta employees have repeatedly raised alarms that the company was not doing enough to protect young users from exploitation.

Meta pushed back against that assertion in a statement Thursday.

“We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We’ve spent a decade working on these issues,” Meta spokesperson Liza Crenshaw said in a statement. “The complaint mischaracterizes our work using selective quotes and cherry-picked documents.”

Crenshaw added that Meta uses “sophisticated technology” and partners with child safety experts, the National Center for Missing and Exploited Children and law enforcement to help protect young people and “root out predators.”

The complaint alleges that Meta has long known both that it struggles to detect when young and underage users misrepresent their ages on its platforms and that its apps expose young users to sexual content and inappropriate messages from adults.


For example, in a May 2018 presentation to Meta’s Audit Committee, the company said that “a disproportionate number of our younger users register with an inaccurate age,” the complaint states.

Two years later, the company allegedly formed an “Underage Enforcement War Room” to address a growing backlog of accounts suspected of belonging to children who had registered as older users.

A separate lawsuit against Meta brought by 33 states last year also accused the company of refusing to shut down the majority of accounts belonging to children under the age of 13 and collecting their personal information without their parents’ consent.

A 2019 internal document cited in Thursday’s updated filing describes the types of “human exploitation” that take place on Meta’s platforms, according to the complaint.

“The company determined that recruiting and exploiting (or advertising) victims for profit were the most common,” the complaint states. “Meta noted that it had observed traffickers using romance to build trust and rapport with potential victims … and using Messenger ‘to coordinate trafficking activities.’”

And a 2021 internal presentation estimated that 100,000 children received sexual harassment daily, including “pictures of adult genitalia,” the complaint states.

For its part, Meta said in a blog post earlier this month that it has launched technology to proactively detect and disable accounts displaying suspicious behaviors, and that it formed a Child Safety Task Force to improve its policies and practices around youth safety.

Meta points to its 30 safety and well-being tools to support teens and families, including the ability to set screen-time limits and the option to remove “like” counts from posts.

Meta on Thursday introduced a new “nighttime nudge” feature that will encourage teen users to get off the app if they’ve spent more than 10 minutes scrolling late at night.

But New Mexico Attorney General Raúl Torrez has said the company needs to do more to protect children and teens.

“Parents deserve to know the full truth about the risks children face when they use Meta’s platforms,” Torrez said in a statement. “For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation.”

CNN