Elon Musk weighed in on newly unsealed court filings alleging that sex trafficking was “widely tolerated” on Meta’s platforms. On November 25, 2025, he reposted a claim from the X account Not the Bee, pointing to what he described as glaring inconsistencies in Meta’s enforcement policies.

In his repost, Musk argued that Meta punished minor infractions more harshly than severe violations. “On Instagram or Facebook, misgendering a trans person resulted in an immediate ban, but trafficking child pr*stitution allowed 17 strikes!” he wrote.

The Not the Bee post that Musk shared linked to an article discussing a lawsuit involving four major social media companies. The article stated that newly unsealed court documents, released on Friday, November 21, alleged that Meta failed to adequately address criminal exploitation on its platforms. According to the article, Instagram’s former head of safety and well-being, Vaishnavi Jayakumar, testified that the platform allowed offenders an unusually high number of violations before issuing a ban.

Former Meta employee testifies company allowed “17 strikes” for sex-trafficking violations

A former senior safety official at Meta testified in federal court that Instagram once allowed accounts involved in human sex trafficking to accumulate as many as 17 violations before facing suspension. Her testimony, filed on November 21 in the Northern District of California, offered an unusually detailed look inside the company’s internal enforcement systems during a period when concerns over child safety were already intensifying.

Jayakumar said she was alarmed by several gaps in the platform’s safety protocols. She explained that as of March 2020, Instagram still lacked a dedicated tool for reporting child s*xual abuse material (CSAM), despite the scale and urgency of the issue.
Jayakumar said she pushed repeatedly for such a feature but was told it would require significant engineering work. “It was very surprising to me,” Jayakumar said of the absence of a CSAM-reporting feature, as per USA Today. She added that she had tried “multiple times” to get the tool built but was repeatedly told it would require too much work.

Jayakumar also described an internal moderation approach she referred to as a “17x” strike policy for pr*stitution and s*xual solicitation violations. She explained that this policy effectively allowed users connected to sex-trafficking behavior to break the relevant rules repeatedly, with no suspension until they reached the seventeenth violation. “That means that you could incur 16 violations for pr*stitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” she said, as per USA Today. She characterized this as a “very, very high strike threshold.”

According to the plaintiffs, internal documents corroborated her account. They also argued that the company had never warned parents, schools, or the public. Their filings claimed Meta permitted these violations to accumulate quietly, without transparency and without timely intervention.

Meta rejected the characterization. A company spokesperson told USA Today on November 22 that the company had since transitioned to a strict immediate-removal system for severe violations. The spokesperson explained that the original strike system, launched in 2019, had been repeatedly tightened over time, eventually evolving into what the company called a “one strike” standard.

Jayakumar’s testimony came amid high-profile multidistrict litigation in which school districts, parents, and state attorneys general accused major social platforms including Meta, ByteDance (TikTok), and Snap Inc.
(Snapchat) of contributing to what they described as an unprecedented mental-health crisis among American teenagers. They argued that Meta, in particular, built its products using strategies reminiscent of those long associated with the tobacco industry, prioritizing engagement from young users because they generated the highest long-term advertising value.

Previn Warren, co-lead attorney for the plaintiffs, told Time that the company knowingly designed its platforms to be addictive to minors. “Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” Warren told Time. He added that this mirrored the logic of an industry selling harmful products to children, saying that, like tobacco companies, Meta continued despite the risks because “more usage meant more profits.”

Meta spokesperson Stephanie Otway has pushed back strongly on these allegations. She argued that they misrepresented the company’s efforts and relied on “cherry-picked quotes” and “misinformed opinions.”
“Trafficking child pr*stitution allowed 17 strikes!”: Elon Musk reacts to tweet claiming sex trafficking was “widely tolerated” on Meta’s sites