A recent court revelation has shed light on a troubling policy within Meta Platforms Inc. that allegedly contributed to the persistence of sex trafficking content on Instagram. Former executive Vaishnavi Jayakumar testified that the platform operated under a “17-strike” policy, allowing accounts to accrue violations repeatedly, with suspension coming only on the 17th strike. This information emerged during a federal court session as part of a multidistrict lawsuit, which accuses Meta, along with Alphabet Inc., ByteDance Ltd., and Snap Inc., of harming youth through their platforms.
Jayakumar, who served as Instagram’s head of child safety and well-being until 2022, testified that she was shocked to learn of the policy in 2020. “I was horrified,” she said in her deposition. The policy stands in stark contrast to the stricter one- or three-strike rules applied to less severe violations, such as hate speech or bullying. Reports indicate that this leniency allowed accounts flagged for sex trafficking to operate with impunity, calling into question Meta’s commitment to user safety.
Details of the 17-Strike Policy
The controversial policy was reportedly a product of Meta’s automated moderation systems, which assign strikes based on the confidence level of each detected violation. Jayakumar clarified that low-confidence flags required multiple instances, as many as 17, before any action could be taken. This threshold has been criticized for enabling prolific offenders to evade moderation efforts.
Meta’s practices drew further scrutiny when internal research allegedly indicated causal links between its platforms and mental health crises among teenagers. Plaintiffs in the lawsuit, representing children harmed by social media, claim Meta prioritized user engagement over safety. Unsealed court documents state that sex trafficking was “difficult to report and widely tolerated” on Meta platforms, echoing Jayakumar’s concerns.
Regulatory bodies are closely monitoring these developments. The Federal Trade Commission and Congress may intensify investigations into Meta’s content moderation algorithms, which plaintiffs argue are designed to minimize enforcement actions to enhance engagement.
Background on Jayakumar’s Tenure
Jayakumar joined Instagram in 2020, and her tenure coincided with the internal scrutiny prompted by the Wall Street Journal’s “Facebook Files” series. Her efforts to reform moderation practices were reportedly met with resistance from leadership. After leaving Meta in 2022, she joined Snap, another defendant in the ongoing lawsuit.
Her deposition, taken in October 2024, provided critical insights into how traffickers exploited Instagram’s moderation policies. She noted that traffickers utilized coded emojis and indirect language to bypass filters, taking advantage of the high strike threshold.
Meta has yet to directly address the claims regarding the 17-strike policy but has reiterated its commitment to safety. In previous statements, the company has highlighted its progress in removing child exploitation imagery, though it has not provided specific details regarding strike policies.
The multidistrict litigation consolidated in Oakland federal court alleges that the addictive designs of social media platforms have contributed to serious issues such as suicides, eating disorders, and sexual exploitation among minors. Meta is also facing additional legal challenges, including a 2023 class action that accuses CEO Mark Zuckerberg of neglecting trafficking concerns.
As investigations continue, industry insiders are voicing concerns about a “moderation winter,” suggesting that existing AI tools are failing to address the nuanced threats posed by coded trafficking posts.
The plaintiffs are advocating for a zero-tolerance policy, improved AI for detecting coded messages, and mandatory reporting mechanisms. Despite Meta’s claims of 99% proactive removal of child exploitation content, the existence of so high a strike threshold undercuts those assurances.
As the legal proceedings unfold, Jayakumar’s testimony could influence the court’s assessment of Meta’s liability and alleged inaction. Meta’s upcoming enforcement report will likely face intense scrutiny as stakeholders await further developments in this high-stakes case.
