NetHappenings News Email List
https://cyberplayground.org
©1998–2026 Educational CyberPlayGround®
©2026 https://k12playground.com
©2026 https://RichAsHell.com
©1993–2026 https://edu-cyberpg.com
ESSAY
Nav Toor @heynavtoor
https://x.com/heynavtoor/status/2051488892663349479
If you have a daughter, a sister, a niece, or a younger cousin on Instagram, you should read this once.
In November 2023, a federal court unsealed a lawsuit filed by 33 state attorneys general against Meta. The unsealed pages don’t read like a tech complaint. They read like a confession.
Here is what Meta’s own employees, in Meta’s own words, knew was happening to kids on Instagram.
By 2015, roughly 4 million users under the age of 13 were already on Instagram. The legal minimum age is 13. Meta knew. By 2018, around 40% of 9-to-12-year-olds were using Instagram daily.
Between 2019 and 2023, Meta received over 1.1 million reports of under-13 accounts on Instagram. They disabled a fraction of them. The rest stayed live.
Why?
An internal 2024 document put it plainly: “acquiring new teen users is mission critical to the success of Instagram.” A 2017 memo from Adam Mosseri, the head of Instagram, set the goal even earlier: make “teen time spent” the top company priority of the year.
Teens were not the user. Teens were the product line.
In a single day in 2022, Meta’s own systems recommended 1.4 million potentially inappropriate adults to teen accounts. Internal data showed inappropriate interactions on Instagram were 38 times higher than on Facebook Messenger. Meta had an internal acronym for it: “IIC.” Inappropriate interactions with children.
Meta engineers calculated that turning teen accounts private by default would prevent roughly 5.4 million unwanted adult-to-teen interactions every single day. They knew this for years. They didn’t ship private-by-default for teens until 2024.
Now the part that should end careers.
According to testimony from Vaishnavi Jayakumar, a former Meta safety executive, Instagram’s internal policy required an account to rack up 17 separate strikes for sex trafficking before it would be suspended. Seventeen.
A child predator could be reported sixteen times and keep their account.
When Meta’s own researchers proposed safety changes, they were overruled at the top. Internal emails show Mark Zuckerberg personally rejecting proposals from his own well-being team. One of his own executives, Margaret Gould Stewart, wrote back to him on the record: “I respect your call on this and I’ll support it, but want to just say for the record that I don’t think it’s the right call given the risks.”
She was talking about risks to children. He overruled her.
On beauty filters, the ones that morph teen girls’ faces into something they can never look like in real life, Zuckerberg’s defense in 2020 was that there was “no data” showing harm. Meanwhile his own internal survey found that 8% of teens aged 13 to 15 had seen self-harm content on Instagram in the past week. His own 2018 internal study found 58% of Facebook users showed signs of “problematic use.” Publicly, Meta admitted to 3.1%.
The employees were not confused about what they were building.
One internal message: “Oh my gosh yall IG is a drug. We’re basically pushers.”
Another: “Zuck has been talking about that for a while. Targeting 11 year olds feels like tobacco companies.”
A researcher writing about engagement: “Because our product exploits weaknesses in the human psychology to promote product engagement and time spent.”
An engineer on what the algorithm needed to optimize for: “sneaking a look at your phone under your desk in the middle of Chemistry.”
A product manager, on the record: “It’s a social comparison app, fucking get used to it.”
In March 2026, a New Mexico jury awarded $375 million in a case tied to child safety failures on Meta’s platforms. It is one verdict. There are dozens more cases still pending.
Here is the part nobody is telling parents.
The settings exist. Meta just doesn’t turn them on by default for accounts they suspect belong to kids, because the kids don’t have IDs and the parents aren’t watching.
Five minutes tonight:
1. On her phone, open Instagram. Go to Settings → Account privacy. Set the account to Private.
2. Go to Settings → Messages and story replies. Turn off message requests from anyone she doesn’t follow.
3. Go to Settings → Suggested content. Set the "Sensitive content" control to "Less," and move every other slider to "Less."
4. Go to Settings → Time. Set a daily limit. 45 minutes is enough.
5. Go to Settings → Tags and mentions. Set to “People you follow” only.
6. Turn off Reels autoplay if you can’t delete Reels entirely.
If she’s under 16, you have the legal right to do this with her, not to her. Sit next to her. Show her the sex trafficking strike policy. Show her the “IG is a drug” quote from the people who built it. She will roll her eyes. She will also remember.
The company that wrote “we’re basically pushers” about itself is not going to protect her.
You are.
Send this to one parent who needs to see it tonight.