Is Facebook Addicting Our Kids?

7E8f...dTVY
24 Feb 2026

Hey everyone! Welcome to another Learn With Hatty. I am what some people would call an old fart. I am not super old, but I am not young either. To give you an idea of how old I am, I remember the good old days of AOL and Myspace. Social media has changed dramatically since then, and it has come to the world’s attention that it might be targeting our children and creating all sorts of problems. In this article we dive deep into the ongoing case against Facebook and whether its algorithms are addicting our children to social media. I made a video about social media, kids, and addiction if you want to check it out below.


If you’ve heard that Facebook is “in court for making kids addicted to social media” and thought, “Okay, but what exactly are we talking about here?”, you’re in the right place. This isn’t just drama on the timeline. It’s a massive, multi‑state legal fight, backed by years of research and a very concerned U.S. Surgeon General.

What’s actually happening in court?


In late 2023, a bipartisan coalition of 41 U.S. states sued Meta, the parent company of Facebook and Instagram, accusing it of deliberately designing its platforms to be addictive and harmful to children and teenagers. A core federal lawsuit led by states like California and Colorado is joined by parallel suits filed in individual state courts, all painting a similar picture: Meta allegedly engineered its apps to hook young users, hid what it knew about the harms, and violated children’s privacy in the process.

A detailed overview in the BMJ explains that the states’ complaint accuses Meta of using “dopamine‑manipulating” algorithms, “Likes,” social comparison, audiovisual alerts, and infinite scroll to exploit the vulnerabilities of young users’ developing brains and keep them online as long as possible. The New Jersey Attorney General’s office, part of the multistate coalition, adds that these practices have “harmed and continue to harm the physical and mental health of children and teens,” tying the case directly to the broader youth mental health crisis flagged by the Surgeon General.

NPR’s coverage puts it bluntly. More than 40 states say Meta designed “dopamine‑manipulating” features such as recommendation algorithms, unlimited scroll, and likes to addict young people and fuel a youth mental health crisis. They’re asking courts not only for financial penalties, but also for orders forcing Meta to redesign Facebook and Instagram to be safer for kids.

Why this is happening now: the youth mental health alarm


These lawsuits didn’t come out of nowhere. In May 2023, U.S. Surgeon General Dr. Vivek Murthy issued an official advisory warning that social media may pose a “profound risk of harm” to children and adolescents. The advisory notes that up to 95% of youth ages 13–17 use social media, with more than one‑third saying they use it “almost constantly,” and raises serious concerns about links to poor mental health.

The advisory doesn’t say “all social media is evil,” but it does highlight “ample indicators” that social media can pose significant risks, especially when use is heavy, emotionally intense, or tied to cyberbullying, social comparison, or exposure to self‑harm and eating‑disorder content. It also calls for a multi‑pronged response from policymakers, tech companies, parents, schools, and young people themselves.

Is social media really “addictive,” or just hard to put down?


Let’s talk about the A‑word. Scientists are cautious about slapping the label “addiction” on everything, but a growing body of research shows that “problematic” or “addictive‑like” social media use in youth is associated with higher levels of depression, anxiety, and poorer overall well‑being.

A 2024 narrative review in Cyberpsychology, Behavior, and Social Networking concluded that problematic social media use is consistently linked to depressive and anxiety symptoms in children and adolescents, with girls often showing stronger associations. The authors point to mechanisms like social comparison, fear of missing out, and disrupted sleep as key drivers of harm. A 2025 review of the evidence on youth and social media similarly reported that most studies find a positive association between social media use and depression/anxiety, and that a subset of studies show a “dose–response” relationship: more time online, more symptoms.

Researchers emphasize that not all use is harmful and that social media can provide community and support, but they are increasingly concerned about a pattern known as “problematic social media use”, where young people feel compelled to check constantly, become distressed when they can’t, and sacrifice sleep, school, or offline relationships to stay online.

One 2025 study of adolescents receiving treatment for depression, anxiety, and suicidal thoughts found that those with problematic social media habits (including distress when unable to access social media) tended to show more severe symptoms and poorer well‑being. 

Under the hood, a lot of the concern comes down to design choices that look suspiciously like digital slot machines. Articles discussing the “science of mindless scrolling” explain how infinite scroll removes natural stopping points, variable rewards make each refresh unpredictable, and notifications consistently pull users back into the app, all mechanisms known to stimulate dopamine‑driven reward pathways. A 2025 paper specifically on “Social Media Algorithms and Teen Addiction” digs into how recommendation systems and engagement‑optimizing design can shape compulsive behaviors, further blurring the line between “fun app” and “engineered habit loop.”
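To make the “digital slot machine” comparison concrete, here is a toy Python sketch of a variable-ratio reward schedule, the reinforcement pattern behaviorists associate with compulsive checking. Everything here (the function, the numbers) is my own illustrative invention, not anything from Meta’s actual systems.

```python
import random

def simulate_feed(refreshes, reward_prob, seed=0):
    """Refresh a hypothetical feed `refreshes` times; each pull pays off
    with probability `reward_prob`, like a variable-ratio (slot machine)
    reward schedule. Returns (rewarding posts seen, longest dry streak)."""
    rng = random.Random(seed)  # seeded so the sketch is reproducible
    rewards = 0
    streak = 0
    longest_dry_streak = 0
    for _ in range(refreshes):
        if rng.random() < reward_prob:   # an interesting post appears
            rewards += 1
            streak = 0
        else:                            # nothing good -- pull again?
            streak += 1
            longest_dry_streak = max(longest_dry_streak, streak)
    return rewards, longest_dry_streak

# The average payout is fixed, but each individual refresh is unpredictable,
# and an infinite feed never signals "you're done" -- so there is no natural
# stopping point, only the next pull.
hits, dry = simulate_feed(refreshes=100, reward_prob=0.3)
```

The point of the sketch is the unpredictability: a user mid-scroll can’t tell a dry streak from the end of the good content, which is exactly the property that variable-reward research links to “one more refresh” behavior.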

What exactly are the lawsuits accusing Meta of?


Back to the courtroom: the states’ lawsuits and related cases make several core allegations, which are echoed across official complaints and news coverage.

First, they argue that Meta engineered addictive features aimed specifically at children and teens. The BMJ article on the multistate lawsuit describes design elements such as dopamine‑manipulating recommendation algorithms, “Likes” and social comparison tools, audiovisual and haptic alerts, beauty filters that promote body dysmorphia, and infinite scroll feeds that discourage kids from self‑regulating and logging off. NPR reports that the states describe these as “dopamine‑manipulating” features that have “poisoned an entire generation’s mental health.”

Second, the lawsuits claim Meta concealed or downplayed what it knew about harms to young users. NPR notes that the suits came years after The Wall Street Journal exposed internal Meta research indicating that Instagram could worsen body‑image issues and mental health problems among some teen girls. The BMJ piece likewise reports that Meta is accused of publishing “profoundly misleading” reports that showed low rates of harmful experiences, despite internal evidence of more serious risks.

Third, several complaints allege that Meta violated children’s privacy laws, particularly the Children’s Online Privacy Protection Act (COPPA), by knowingly collecting data from children under 13 without proper parental consent. The New York Attorney General’s press release, for example, states that Meta’s practices violated COPPA and state consumer protection laws by monetizing young users’ personal data without complying with consent requirements. Coverage from PBS NewsHour and a summary on Psychiatrist.com underline that the states are using deceptive‑practices and consumer‑protection laws in the same way they were once used against tobacco and opioid companies.

Finally, the states argue that Meta violated state consumer protection laws by misrepresenting the safety of its products and failing to adequately warn families about known risks. Many officials explicitly compare this effort to earlier lawsuits against cigarette manufacturers and opioid makers, framing it as a similar fight over a powerful industry allegedly profiting from harm to a vulnerable population.

Meta, for its part, says it is “disappointed” that attorneys general chose to sue rather than “work productively” with tech companies to set industry‑wide standards, and points to safety tools, parental controls, and teen‑focused design changes it has introduced, a point repeated in both the BMJ coverage and NPR’s reporting.

The ongoing hearings and legal questions


Beyond the big multistate case, courts across the country are now wrestling with how to handle these kinds of claims. One notable example is in Massachusetts, where the state’s highest court has heard arguments in a lawsuit alleging that Meta designed Facebook and Instagram to be addictive to children through features like endless scrolling and incessant notifications. PBS NewsHour’s coverage explains that Massachusetts is focusing on Meta’s design “tools,” while Meta argues that holding it liable in this way would interfere with its First Amendment and Section 230 protections.

On a separate track, hundreds of individual lawsuits by teens, young adults, and parents have been bundled into a federal multidistrict litigation (MDL) called Social Media Adolescent Addiction/Personal Injury Products Liability Litigation in the Northern District of California. These cases allege that prolonged, compulsive use of Facebook, Instagram, and other platforms contributed to depression, anxiety, eating disorders, self‑harm, and suicide attempts in young people. Law firm updates, like King Law’s “Instagram Mental Health Lawsuit — February 2026 Update”, TorHoerman’s Facebook Mental Health Lawsuit overview, and the Social Media Addiction Lawsuit updates, note that a federal judge has allowed key claims (such as failure to warn, defective design, and deceptive practices) to move forward.

A key legal tension here is Section 230 of the Communications Decency Act, which usually shields platforms from liability for user‑generated content. The states and plaintiffs are trying to sidestep that shield by focusing on product design (algorithms, interface features) and deceptive business practices rather than blaming Meta for the content individual users post, a strategy NPR highlights in its lawsuit coverage. Courts are now being asked to decide where protected speech ends and potentially dangerous “product design” begins, and those rulings will shape what tech companies are allowed to do with their engagement‑driven features for years to come.

What the science says about harms to kids and teens


Behind the lawsuits and hearings is a pile of research, some of it cautious, some of it alarmed, about how social media affects young people’s mental health.

A 2020 scoping review of social media use and depression in adolescents found consistent associations between heavy or problematic use and depressive symptoms, with particular concerns around sleep disruption, cyberbullying, and social comparison. That review, “Social media use and depression in adolescents: a scoping review”, emphasizes that it’s not just the number of hours online that matters, but also when and how teens are engaging.

The Surgeon General’s advisory and commentary pieces summarizing it stress that while social media can offer community and support for some youth, there are clear risk patterns: more time spent on platforms, especially late at night and in emotionally intense ways, is linked to poorer mental health outcomes. The New York Times article “Surgeon General Warns That Social Media May Harm Children and Adolescents” does a good job of laying out those concerns for non‑experts, from disrupted sleep to constant comparison. Yale Medicine’s guide for parents underscores the same themes, noting connections between heavy social media use and increased anxiety, depression, and body‑image concerns, and offering practical strategies for families on setting boundaries and talking about what kids see online.

A 2025 review on youth and social media, “Effects of Social Media Use on Youth and Adolescent Mental Health”, highlights three particularly important mechanisms: loss of sleep, intensified social comparison, and exposure to harmful content (like pro‑self‑harm or pro‑eating‑disorder communities). When those factors are present, the link between social media use and poor mental health outcomes is often stronger.

Taken together, the picture isn’t “every kid who touches Instagram is doomed,” but it is “the way these platforms are currently designed can push vulnerable young people into unhealthy patterns, and that risk is big enough that public health officials and courts are stepping in.”

Meta’s response and what might change


Meta has pushed back on the lawsuits, saying it has introduced “over 30 tools and features” to support teens and families, such as parental supervision tools, nudges to take breaks, more restrictive default settings for younger users, and policies to limit harmful content in teen feeds. NPR reported on a 2024 update where Meta tightened restrictions on content related to self‑harm, suicide, and eating disorders for teen accounts in a piece titled “Meta restricts content for teens on Facebook, Instagram”.

Meta also argues that social media is not the sole cause of the youth mental health crisis, points to mixed research findings, and says it supports the idea of industry‑wide standards developed with policymakers rather than piecemeal litigation. At the same time, Education Week has described this moment as a “legal reckoning” for social media companies; its article “Social Media Companies Face Legal Reckoning Over Mental Health Harms to Children” notes that schools and districts are increasingly dealing with the fallout of online behavior in the classroom.

Depending on how the suits and MDL play out, we could see court‑ordered changes such as limits on infinite scroll for minors, stronger notification controls, more aggressive age verification, or even warning labels similar to those used on cigarettes. PBS has already described one proceeding as a “landmark trial accusing tech giants of harming children with addictive social media,” hinting that the outcomes could reshape how these platforms work for young users.

What this means for families, educators, and creators


Even if you’re not about to file a lawsuit yourself, this legal and scientific storm has real implications for how we live online.

For families, the research and advisories suggest focusing less on pure “screen time panic” and more on patterns. Is social media cutting into sleep? Is a teen constantly comparing themselves to filtered images? Do they feel distressed when they can’t check their apps? Resources like the Surgeon General’s advisory and the Yale Medicine guide are good starting points for family conversations and practical steps.

For educators and youth workers, understanding how features like infinite scroll and algorithmic recommendations work can help you teach digital literacy. Instead of telling students “just get off your phone,” you can show them how the apps are designed to keep them there, and how to spot when that design is overriding their own choices. Articles that unpack the “science of mindless scrolling” or explain how social media keeps you clicking can be great teaching tools.

For creators and influencers, there’s a chance to be part of the solution. You can still create engaging content without leaning into FOMO, unrealistic body standards, or “never log off” culture. The research suggests that the quality and emotional tone of content, and how invested followers are, matter a lot in determining whether a follower’s social media habit feels healthy or harmful, as shown in reviews like the Cyberpsychology narrative review and the 2025 mental health synthesis.

The courts are essentially asking Meta, “What responsibility do you have for the design choices that shape kids’ behavior and well‑being?” The rest of us can ask a parallel question: “What responsibility do we have as parents, teachers, creators, and platforms to make online spaces where young people can participate without sacrificing their mental health?”

Thanks for reading, everyone! I hope you enjoyed this article, and remember: I’m just one man trying to piece together the news as I hear it. Please do your own research, stay curious, and keep learning!
