🎙️
AIPodify

The All-In Podcast

“No good comes from kids on social media … and the industry wants them addicted”

March 31, 2026

Episode Summary

AI-generated · Mar 2026

AI-generated summary — may contain inaccuracies. Not a substitute for the full episode or professional advice.

This episode of The All-In Podcast delivers a stark warning about the pervasive dangers of social media platforms for minors, asserting unequivocally that "No good comes out of kids under 16 using social media." The discussion is framed by recent legal setbacks for Meta, which faced two significant verdicts: first, being found liable for facilitating child predators on Facebook and Instagram, and then, a California jury finding Meta and YouTube negligent for designing addictive platforms that contributed to a young user's mental health harm.

The hosts highlight a strong correlation between social media usage, even for just "two or three hours a day," and a significant increase in mental health issues. Specifically, they point to a massive correlation with "depression and anxiety and eating disorders specifically in young girls." This isn't accidental, the argument posits; the industry has an inherent motivation to create addictive products. They claim the industry "has not policed itself, nor would they police themselves because they want to get people addicted now. So, they're addicted when they're adults."

Recognizing this global problem, several countries are already taking legislative action. Australia and Malaysia have enforced 16-year-old age restrictions for social media use, with Spain, Germany, and the UK reportedly close behind. The proposed solution involves the tech giants themselves: if Android and Apple were to "show leadership" by implementing default age verification and age-gating for minors when a phone is purchased, the problem could be largely solved.

Listeners will walk away with a critical understanding of the direct link between social media platform design and a growing youth mental health crisis, backed by recent legal precedents. The episode underscores the industry's disincentive for self-regulation and advocates for systemic, device-level age verification as a vital intervention, emphasizing that individual parental controls are insufficient against platforms designed for addiction.

👤 Who Should Listen

  • Parents concerned about the impact of social media on their children's well-being.
  • Policymakers and legislators exploring regulations for online platforms and youth protection.
  • Mental health professionals dealing with adolescent depression, anxiety, and eating disorders.
  • Tech industry executives and developers involved in platform design and user experience.
  • Advocates for children's rights and digital safety.
  • Anyone interested in the legal and ethical implications of social media addiction.

🔑 Key Takeaways

  1. Meta recently faced two legal verdicts, including liability for child predators and negligence for designing addictive platforms that harmed a young user's mental health.
  2. Social media use for just "two or three hours a day" is massively correlated with depression, anxiety, and eating disorders, particularly in young girls.
  3. The social media industry is inherently motivated to get people addicted, including minors, and will not police itself effectively.
  4. Countries like Australia and Malaysia have already enforced a 16-year-old age limit for social media use, with Spain, Germany, and the UK preparing to follow suit.
  5. The most effective solution proposed is for Android and Apple to implement default age verification and age-gating on devices for users under a certain age.
  6. The problem of child social media addiction is presented as solvable through leadership from device manufacturers rather than relying on platforms or parents.

💡 Key Concepts Explained

Platform Negligence & Liability

This concept describes how social media companies can be held legally responsible for the harm their platforms cause. The episode highlights recent verdicts against Meta, finding them liable for child predators and negligent for designing addictive platforms that damaged a user's mental health, shifting accountability to the platform creators.

Social Media Addiction & Youth Mental Health Crisis

The discussion underscores the direct, massive correlation between regular social media use by minors and a surge in mental health issues, including depression, anxiety, and eating disorders, particularly among young girls. It posits that this addiction is a deliberate outcome of platform design, not an accidental side effect.

Age-Gating & Device-Level Verification

This refers to the proposed solution of restricting social media access for minors (e.g., under 16) through mandatory age verification. The episode suggests that this should be enforced by default at the operating system level by companies like Android and Apple, rather than relying on platform self-policing or parental controls alone.

⚡ Actionable Takeaways

  • Advocate for device-level age verification and age-gating for social media access for children under 16, pushing tech companies like Apple and Google to implement these defaults.
  • Educate yourself and your community on the specific correlations between social media use and mental health issues, such as depression, anxiety, and eating disorders in young girls.
  • Support legislative efforts in your region to enforce age restrictions on social media platforms, mirroring policies in countries like Australia and Malaysia.
  • Critically evaluate the design of social media platforms for addictive features, especially concerning their impact on youth.
  • Consider the implications of recent legal verdicts against Meta regarding platform liability for child safety and user mental health when discussing digital policy.

⏱ Timeline Breakdown

00:00 — Introduction to Meta's 'rough week' with two legal verdicts against them.
00:08 — Meta found liable for child predators and negligent for addictive platforms harming mental health.
00:15 — Claim that 'no good comes out of kids under 16 using social media'.
00:20 — Correlation between social media use (2-3 hours/day) and depression, anxiety, and eating disorders, especially in young girls.
00:30 — Argument that the industry won't self-police because they want to get people addicted now for future engagement.
00:40 — Discussion of countries like Australia and Malaysia enforcing 16-year-old age limits, with Spain, Germany, and the UK following.
00:55 — Proposal for Android and Apple to show leadership by implementing default age verification and age-gating on phones.
01:00 — How device-level age verification could solve the problem by requiring parental opt-in for access.

💬 Notable Quotes

"No good comes out of kids under 16 using social media."
"kids who use this two or three hours a day. It's massively correlated with depression and anxiety and eating disorders specifically in young girls."
"the industry has not policed itself, nor would they police themselves because they want to get people addicted now. So, they're addicted when they're adults."
"If Android and Apple showed leadership with Facebook and said, 'We're going to by default when you buy a phone, you're going to have to ageverify kids under a certain age. We could solve this problem.'"

Listen to Full Episode
