Social Media Addiction Lawsuit

This page is about social media addiction lawsuits and who is eligible to bring a claim. Our lawyers also provide the latest social media class action lawsuit news.

The problem that led to these social media lawsuits is that millions of people – too many of them children – are addicted to social media platforms such as Facebook, Instagram, Snapchat, and others. For these vulnerable users, social media addiction can be very harmful and lead to eating disorders, depression, and, in some cases, suicide.

Now, these companies are facing a wave of new social media lawsuits alleging that they knowingly designed the algorithms of their platforms to lure young people into harmful addictions.

Our national mass tort lawyers are currently seeking social media addiction lawsuits from individuals who became addicted to social media before age 21 and suffered a severe physical injury as a direct result of that addiction. Physical injuries resulting from social media addiction could include suicide, self-harm, or an eating disorder. This page will provide news and updates on the social media addiction litigation, as well as our estimates on the potential settlement value of these cases.

If your child has suffered severe harm as a result of addiction to social media, call today at 800-553-8082 or contact our lawyers online.

News and Updates for Social Media Lawsuits

Below are the latest news and updates in the social media MDL and related lawsuits.

December 17, 2024: New Lawsuit

In a new lawsuit filed yesterday in the MDL, the family of two adolescents alleges that social media platforms, including Meta (Facebook and Instagram), Snap Inc., TikTok, and Google (YouTube), are responsible for causing severe mental health injuries. The plaintiffs, minors from Hooks, Texas, claim that addictive design features and inadequate warnings about the dangers of these platforms contributed to compulsive use, depression, anxiety, self-harm (including cutting), and suicidal ideation.

The lawsuit adopts the allegations of the Second Amended Master Complaint, asserting claims of strict liability (design defect and failure to warn), negligence, violations of consumer protection laws, and other causes of action. The plaintiffs seek damages for the significant harm suffered due to the defendants’ alleged misconduct in designing, marketing, and maintaining their platforms.

December 16, 2024: October Trial Date

The October 2025 trial date is off, unfortunately.

December 14, 2024: Next CMC

The next case management conference is set for January 17 before MDL Judge Yvonne Gonzalez Rogers.

December 13, 2024: Meta Pre-Trial Discovery

Magistrate Kang rejected Meta Platforms’ claims that additional discovery demands from personal-injury plaintiffs are overly burdensome, noting that Meta’s production is minimal compared to the millions of documents it has demanded from plaintiff states.
During a contentious hearing, Judge Kang ordered Meta to include documents from the company’s Chief Product Officer, though he denied broader requests, emphasizing fairness and compromise. While Meta argued that its discovery process is largely complete, the judge pointed out that Meta has imposed significant burdens on smaller state agencies with limited resources, calling on all parties to “work harder” to meet deadlines. State attorneys argued that Meta’s requests are overly broad and yield irrelevant data, while Meta defended its approach, citing the states’ pursuit of substantial damages.

Judge Kang refused Meta’s suggestion to sanction the states for alleged delays, warning that a formal motion may be “fully unjustified.” With bellwether trials postponed from October 2025, the next case management conference is set for January 17 before U.S. District Judge Yvonne Gonzalez Rogers.

December 12, 2024: Erased Usage Data Problem

One big issue in specific cases will be the plaintiff’s social media usage pattern.  The magistrate judge ruled yesterday that an electronic discovery vendor hired by the plaintiff can be deposed regarding a declaration he provided. The vendor, a digital forensic expert, analyzed a reset iPhone central to the case to assess how much data was lost and detailed his findings in a declaration submitted by the plaintiff to counter claims of evidence spoliation.

The phone is significant because it was used to access social media platforms and allegedly contained data about the plaintiff’s usage patterns, interactions, and digital behavior—evidence critical to claims of social media-related harm. The plaintiff performed a factory reset on the device, erasing its data, which led to defendants alleging spoliation of evidence. The plaintiff argued that the reset did not result in significant data loss and relied on the vendor’s forensic analysis to support this position. The vendor’s findings included an evaluation of recoverable data and an assessment of what was permanently lost.

The court determined that by submitting the vendor’s declaration, the plaintiff effectively made him a testifying expert for the disclosed matters, waiving protections for non-testifying consultants under Rule 26(b)(4)(D). The court ruled the deposition must proceed but limited it to the topics discussed in the declaration, excluding privileged communications or unrelated matters. Plaintiff has the option to avoid the deposition by withdrawing the declaration within ten days and agreeing not to rely on it further.

December 2, 2024: Nearly 200 New Cases in MDL

The social media addiction MDL experienced its highest growth to date, with 195 new cases added in November—representing a 31% monthly increase. The MDL now has 815 pending cases. Our lawyers remain very bullish on this litigation in cases where the facts are strong.

November 19, 2024: Suicide Lawsuit

The family of a 15-year-old boy from Charlotte, North Carolina, alleges that prolonged use of Instagram contributed to his death by suicide nearly three years ago.
The lawsuit, brought by the child’s father as the administrator of his estate, claims that Instagram’s design and algorithms led to the boy’s addiction and exacerbated his mental health conditions, including anxiety, depression, and compulsive social media use.

November 13, 2024: NY Teen Files New MDL Case

A 17-year-old girl from Fishkill, New York became one of the most recent plaintiffs to join the social media addiction class action MDL. According to her complaint, which was filed directly in the MDL using the Short Form Complaint, the teen became heavily addicted to Instagram, TikTok, Snapchat and YouTube (Facebook was not named). As a result of her addiction to these platforms, the teen claims that she developed anorexia, depression, anxiety, and engaged in self-harm.

November 5, 2024: New Lawsuit Filed in the MDL

In a new lawsuit filed yesterday, the family of a Philadelphia minor identified as Z.C. has brought claims against major social media companies Meta Platforms, Instagram, TikTok, Snap Inc., and YouTube as part of the social media MDL.
The family alleges that Z.C.’s use of these platforms led to severe psychological and physical harm, including addiction, depression, anxiety, and self-harm, due to the companies’ alleged failure to adequately warn users and to implement safer design standards. The lawsuit asserts causes of action for strict liability for defective design and failure to warn, general negligence, violations of consumer protection laws, fraudulent misrepresentation, and wrongful death and survival claims.

November 1, 2024: 27 Cases Added to MDL

We saw a total of 27 new cases added to the social media addiction class action MDL last month. That is more than the 10 cases that were added in September, but still a relatively low volume. This will likely be a relatively low-volume mass tort, which is probably a good thing for victims in terms of settlement amounts and how long it takes to advance the litigation. There are now 620 pending cases in the MDL.

October 21, 2024: Getting Cases Ready for Trial

The key to getting to a settlement is getting bellwether cases to trial. The bellwether plaintiffs have been in active pretrial discovery. Disputes with nearly all of them have been resolved, except for certain requests that the parties have decided to kick down the road a while longer. There are also a few remaining unresolved RFPs specific to MDL claims. Despite these exceptions, the California state court and the MDL are lining up nicely. There is no good reason to delay the path to the first trials.

October 19, 2024: TikTok Slow Production

Plaintiffs have expressed concern over the slow pace of TikTok’s document production, which totals only 167,000 documents, significantly fewer than other defendants like Snap (243,000), Google (382,000), and Meta (1,400,000). Plaintiffs have also flagged metadata issues, including missing information on documents, unidentified custodians and authors, and inaccurate creation and modification dates. Plaintiffs’ lawyers complain that these deficiencies are prejudicing their ability to prepare for depositions. These documents are key in any litigation, particularly this one.

TikTok acknowledges the concerns about the pace of production and says it had already begun investigating and addressing these issues before plaintiffs’ attorneys raised them. The company has increased resources to expedite production, proactively engaged with plaintiffs, and established weekly meetings to resolve concerns.

October 15, 2024:  Proving Causation in Social Media Lawsuits

In the social media addiction lawsuits, plaintiffs must overcome two key legal challenges: general causation and specific causation. General causation requires proving that the design of social media platforms—especially their algorithms—can cause or significantly contribute to harmful mental health outcomes. In this litigation, those harms include addiction, depression, suicidal ideation, and suicide attempts or deaths. Plaintiffs must demonstrate that these platforms’ structures and content recommendation engines are not neutral tools but inherently risky systems that increase the likelihood of psychological harm.

Proving that these algorithms are a negligent moral hazard—designed to exploit vulnerabilities for profit—should be relatively straightforward. If one of these cases goes to trial, many jurors will take this as a given before they hear a word of testimony. These systems are much like a road without guardrails along a cliffside: inherently dangerous, making harm not just possible but predictable. With the right internal documents and expert testimony, plaintiffs can show that social media companies knew their platforms could foster addiction, depression, and mental health crises—but chose not to act.

However, specific causation presents a more difficult hurdle. Plaintiffs must prove that, in a specific case, the platform’s algorithms were a substantial factor in causing the user’s mental decline, suicide attempt, or tragic death.

This is where the legal battle becomes a bit more complicated. Suicidal intent, mental health struggles, and other crises are inherently multifactorial, involving mental illness, personal trauma, family dynamics, and external stressors. Isolating the precise role of social media within this web of factors will be challenging in many cases. Defense attorneys will undoubtedly argue that other circumstances were more significant. If your case makes it to discovery—a rare outcome—defense lawyers will closely scrutinize every aspect of the victim’s life to identify alternative explanations for what happened.

That said, the legal standard in most jurisdictions requires only that social media interactions were a substantial contributing cause—not the sole cause—of the harm. As a result, many cases will offer strong grounds to argue that social media was not just background noise, but the dangerous push that ultimately tipped the victim over the edge.

September 28, 2024: School and Individual Victims in Same Class Action

The Social Media Addiction MDL is unusual in that it combines two distinct types of plaintiffs in a single multidistrict litigation: governmental entities and personal injury victims.
The governmental plaintiffs – states, municipalities, or school districts – are pursuing claims related to the public health impact of social media platforms, arguing that these platforms have contributed to widespread mental health issues, increased healthcare costs, and strains on social services. On the other hand, personal injury plaintiffs are individuals or families alleging direct harm, such as addiction, mental health disorders, or even suicide, as a result of social media use.

The governmental cases can consume a disproportionate share of the court’s resources and focus. (We see this in the AFFF firefighting foam litigation, which is why it is on my mind.) The governmental plaintiffs, with their expansive claims and large-scale discovery demands, can inadvertently “suck the air out of the room,” shifting attention away from the personal injury cases that deal with the direct, individualized harm suffered by plaintiffs. This dynamic can lead to frustration among personal injury victims and their attorneys, who may feel that their cases are being overshadowed or delayed by the broader governmental claims.

There is also the fear that the governmental plaintiffs will complicate settlement negotiations, as their interests may not align with those of individual victims. While government entities might seek broader injunctive relief or policy changes, individual plaintiffs just want compensation for specific injuries and damages. This divergence in goals can create an imbalance, making it challenging for the court and the parties to adequately address the needs of both groups within the same MDL structure.

September 23, 2024: The Path to a Social Media Addiction Trial

A clear path to realistic trial dates has emerged for the social media addiction lawsuits, and it is a huge step toward a social media addiction settlement. We now have a tentative trial date – all mass tort trial dates are tentative – for what looks to be January 2026. Here is the schedule:

  • Substantial Completion of Document Production: November 5, 2024
  • Close of Fact Discovery: April 4, 2025
  • Joint Status Report on Protocol for Production of Expert-Related Documents: April 21, 2025
  • Non-Case-Specific and Causation Experts (Plaintiffs’ Opening Reports): May 16, 2025
  • Case-Specific Experts (Plaintiffs’ Opening Reports): May 19, 2025
  • Identification of Bellwether Trial Pools: May 23, 2025 at 12:00 p.m.
  • Hearing re Identification of Bellwether Trial Pools: June 13, 2025 at 9:00 a.m.
  • Non-Case-Specific and Causation Experts (Defendants’ Responsive Reports): July 9, 2025
  • Case-Specific Experts (Defendants’ Responsive Reports): July 11, 2025
  • Non-Case-Specific and Causation Experts (Plaintiffs’ Rebuttal Reports): July 30, 2025
  • Case-Specific Experts (Plaintiffs’ Rebuttal Reports): August 1, 2025
  • Close of Expert Discovery: August 27, 2025
  • Exchange Preliminary Witness Lists: September 10, 2025
  • Exchange Preliminary Proposed Jury Instructions: September 22, 2025
  • Deadline to Meet and Confer on Additional Discovery Needs: September 22, 2025
  • Submit Joint Status Report on Additional Discovery Needs: September 24, 2025
  • Dispositive and Rule 702 (Daubert) Motions (Opening Briefs): September 24, 2025
  • Submit Joint Letter Brief on Remaining Discovery Disputes: September 29, 2025
  • Dispositive and Rule 702 (Daubert) Motions (Opposition Briefs): October 27, 2025
  • File Proposed/Disputed Jury Instructions: October 27, 2025
  • Dispositive and Rule 702 (Daubert) Motions (Reply Briefs): November 25, 2025

This would seem to be the path to a trial in January or February 2026. Is that a long way away? Frustratingly for victims, it is. But in the mass tort world, it is very reasonable.

My two cents: You need the pressure of a trial date to ever get to a global social media settlement.  These companies would never offer compensation to victims without a trial lurking.  The reality is it is unlikely that these companies would ever risk a trial.  They are going to fight like crazy… and then settle.

September 20, 2024: Snapchat Sexual Assault Lawsuit

Snap Inc., the parent company of Snapchat, has settled a Connecticut state court case accusing it of enabling sexual predators to lure victims through the use of Bitmojis—cartoon-like avatars that represent users on the platform. The case involved a girl who was raped by two men she met via Snapchat.

September 18, 2024: Instagram Changes, Probably in Response to This Litigation

Instagram  announced a move to make teen accounts private by default as part of an effort to address rising concerns about social media’s impact on children, particularly the growing issue of social media addiction and the platform’s role in exposing young users to harmful content. This change comes, our lawyers think, from the wave of lawsuits accusing Meta and others of contributing to a youth mental health crisis by deliberately designing addictive features that encourage excessive use among teens, often leading to severe mental health issues.

Starting this week in the U.S., U.K., Canada, and Australia, new Instagram users under 18 will automatically have their accounts set to private, and existing teen accounts will be transitioned to more restrictive settings over the next two months. These settings will limit who can contact teens and restrict access to sensitive content.

Will this help? Yes. Does it go far enough to combat the root issue: an Instagram design built to hook young users and keep them scrolling for hours? No. These changes, such as private accounts and time notifications, will not fully address the long-term damage caused by social media addiction. It is good to have restrictions on messaging and a “sleep mode” to reduce notifications at night. But the platform still allows teens to bypass warnings and continue using the app beyond recommended time limits.

We would love to be wrong. So let’s see how it plays out. But our social media addiction lawyers predict these changes will do little to alter Instagram’s underlying business model, which relies heavily on keeping users, particularly vulnerable teens, engaged for as long as possible.

September 11, 2024: Warning Label on Social Media Platforms

A bipartisan coalition of 42 attorneys general has called on Congress to introduce warning labels on social media platforms to address the mental health risks posed to young people. The initiative supports U.S. Surgeon General Dr. Vivek Murthy’s earlier recommendation for warning labels, aimed at raising awareness about the addictive algorithms and harmful effects of social media on youth. New York Attorney General Letitia James emphasized the importance of this step in combating social media addiction and its impact on mental health.

The attorneys general who signed the letter come from various states and territories, including California, Texas, Florida, New York, Pennsylvania, Illinois, Ohio, Georgia, North Carolina, Michigan, New Jersey, Virginia, Washington, Arizona, Massachusetts, Tennessee, Indiana, Missouri, Maryland, Wisconsin, Colorado, Minnesota, South Carolina, Alabama, Louisiana, Kentucky, Oregon, Oklahoma, Connecticut, Iowa, Mississippi, Arkansas, Nevada, Kansas, New Mexico, Nebraska, Idaho, West Virginia, Maine, New Hampshire, Montana, Delaware, and the District of Columbia.

In 2024, just about everything is a political issue. But there is bipartisan consensus on the need to do more to protect children.

September 5, 2024: Joint Request Regarding Extension Of Time To Negotiate Case-Specific Search Terms For Electronic Discovery

Both the plaintiffs and defendants have jointly requested an extension of time to finalize negotiations over case-specific search terms for electronic discovery. These terms will be applied to data sources from bellwether personal injury plaintiffs. The parties have already agreed on general search terms but need more time to negotiate a few remaining case-specific terms.

The current deadline for these negotiations was August 23, 2024. The parties now request an extension to August 30, 2024, with any necessary letter briefing to be submitted by September 6, 2024. This is the second request for an extension, and the parties assert that this additional time will not affect any other court deadlines. The court is asked to approve this stipulated extension.

July 15, 2024: Roblox Corp. Named As Additional Defendant In New Social Media Addiction Lawsuit

A new social media addiction lawsuit filed directly in the MDL recently is one of the very first to name Roblox Corp. as an additional defendant. Roblox Corp. owns the popular online gaming platform called Roblox.

It is not a traditional social media platform, but rather a gaming platform with social media elements. According to the lawsuit, the 13-year-old plaintiff became addicted to Roblox and Snapchat and this addiction eventually caused her to become the victim of child sexual abuse and depression.

There is no further explanation in the Short Form Complaint about exactly what happened, but presumably, the child met a sexual predator on the gaming platform.

July 12, 2024: Judge Denies TikTok’s Request For Comprehensive Forensic Images Of All Personal Devices Used By Key Plaintiffs

In a big win for victims, the magistrate judge in the social media MDL denied a request by TikTok for comprehensive forensic images of all personal devices used by key plaintiffs.

TikTok’s lawyers sought forensic copies of all devices the plaintiffs used to access social media, including cellphones, iPads, laptops, and desktops, to analyze whether other apps or features contributed to their alleged addictions. The company’s attorneys argued to the court that understanding the device usage patterns was essential for TikTok’s defense.

The judge disagreed. His ruling emphasized that TikTok had not demonstrated a compelling need for such extensive data and highlighted the privacy issues involved.

July 11, 2024: New Wrongful Death Case Filed In Social Media Addiction MDL

Earlier this week, a new wrongful death case was filed in the social media addiction class action MDL. The complaint was filed on behalf of a 17-year-old girl from Missouri. The lawsuit claims that the girl became addicted to Snapchat and TikTok when she was only about 10 or 11 years old and that this addiction led to severe depression, which eventually led to self-harm and, later, her death by suicide.

July 1, 2024: Social Media Addiction Lawsuit Continues To Grow

The number of social media adolescent addiction lawsuits rose from 475 to 499, a 5% increase.

But these claims keep coming, including a new social media addiction suit today. In that case, a 19-year-old California woman alleges that Instagram and Snapchat led her into addiction/compulsive use, depression, anxiety, and self-harm, including suicidality and a suicide attempt, when she was a teenager.

June 23, 2024: Cultural Climate Leans In Favor Of Plaintiffs

Yesterday, the New York Times published a front-page article titled “How Mark Zuckerberg’s Meta Failed Children on Safety, States Say,” focusing on the social media addiction lawsuits.

The cultural climate and prevailing opinions about social media companies are clearly in the plaintiffs’ favor in this litigation.  Juries will be primed to believe these companies – and Meta’s Facebook and Instagram in particular – are not doing what they can to protect children.

June 22, 2024: Meta Denies Liability For Illegal Content On Their Social Media Platforms

The MDL judge made little effort this week to disguise her skepticism about Meta’s attempt to dismiss child pornography possession allegations. The judge pushed back on Meta’s claim of ignorance regarding illegal content on Facebook, Instagram, and its other platforms, characterizing it as disingenuous.

Meta’s lawyer argued – as they always do – that Section 230 of the Communications Decency Act shields the company from liability for third-party content. That has been a successful song in many lawsuits against social media companies.  But we do not think it fits this litigation and so far the judge appears to agree.

June 17, 2024: U.S. Surgeon General Now Calling For Warning Label On Various Social Media Platforms

The U.S. Surgeon General is now calling for social media platforms like Facebook, Instagram, and TikTok to be required to carry a warning label. The warning labels, which would be similar to those on cigarettes, would caution about the high risk of teen addiction and the potential negative health consequences associated with social media use. This is a major public relations win for the plaintiffs in this litigation.

June 12, 2024: New Study Highlights Significant Impact of Internet Addiction On Brain Functionality

A new study highlights the increasing concern over internet addiction among adolescents, showing significant impacts on their brain functionality.

The study, which reviewed twelve functional magnetic resonance imaging studies, demonstrates that internet addiction affects various neural networks in adolescents’ brains, leading to both increases and decreases in functional connectivity. Specifically, the research found alterations in the default mode network, executive control network, salience network, and reward pathways. These neural changes are linked to behavioral issues such as impaired cognitive control, heightened impulsivity, and problematic internet usage.

This study and its brethren are relevant to the social media addiction lawsuits. Plaintiffs’ social media lawyers will leverage these findings to argue that prolonged and unregulated internet use directly contributes to neurodevelopmental disorders. The evidence of altered functional connectivity in critical brain networks provides a scientific basis for the core claims about the adverse effects social media has on young users. This is why social media companies should not be working overtime to get kids addicted to their platforms.

May 28, 2024: MDL Judge Orders Defendants To Provide Specific Fact Sheets And Account Data For Bellwether Personal Injury Plaintiffs

The MDL judge has issued an order regarding the bellwether personal injury actions in the Social Media Adolescent Addiction/Personal Injury Products Liability Litigation. The order requires defendants, including Meta, Snap, TikTok, and YouTube, to provide specific fact sheets and account data snapshots for selected bellwether personal injury plaintiffs.  These fact sheets get at the plaintiff’s usage of the platform.

May 25, 2024: Status Conference Addresses Issues Concerning Plaintiff Fact Sheets, Voluntary Dismissals, And Lexecon Objections

At a status conference last week, the court addressed issues with plaintiff fact sheets, voluntary dismissals, and Lexecon objections, agreeing to try cases in their home districts and ordering immediate discovery for replacement bellwether cases once selected. A protocol for short-form complaint amendments was discussed, and the court granted Meta’s motion to submit supplemental authority for its motion to dismiss. The court also addressed voluntary dismissals in specific cases, granting some and requesting proper filing in the member-case docket for others.

May 15, 2024: New Social Media Addiction Lawsuit Filed Directly In MDL

A new social media addiction lawsuit was filed directly in the MDL recently. The plaintiff is a 19-year-old from Charleston, South Carolina, who claims that from 2018 to 2024, she became heavily addicted to Instagram, Snapchat, and TikTok. The Complaint asserts that this addiction caused her to suffer from depression, anxiety, and other mental health issues. The Complaint also alleges that her addiction was directly responsible for a serious auto accident that left the plaintiff with major injuries.

April 24, 2024: Judge Expresses Frustration Over Confidentiality Disputes

U.S. Magistrate Judge Peter H. Kang expressed frustration at a hearing in San Francisco, urging lawyers to resolve disputes over confidentiality designations without court intervention. During the hearing, there was contention over Meta’s habit of marking a substantial portion of documents as confidential. Defense lawyers have a proclivity to make everything confidential.

The hearing also touched on other issues like litigation hold requests, source code, and document production speed, with Judge Kang consistently emphasizing the need for the parties to confer and settle most disputes independently. Additionally, discussions arose about whether to seal the identities of potential corporate witnesses, with differing views on the necessity and fairness of such actions.

April 16, 2024: Zuckerberg Cleared Of Personal Liability

Mark Zuckerberg has been cleared of personal liability in approximately two dozen lawsuits that claimed Meta Platforms Inc. and other social media companies caused addiction in children.

MDL Judge Yvonne Gonzalez Rogers ruled that Zuckerberg does not have a special duty to disclose safety information about Facebook and Instagram’s risks to children, absent a direct relationship with the users.

Zuckerberg haters will be disappointed. But this means nothing in the overall litigation. We never thought it was a good idea to name him personally.

April 1, 2024: New Lawsuit Filed In Social Media Addiction MDL Over Suicide Of 15 Year-Old Boy

In a new lawsuit filed in the social media addiction MDL, the family of a 15-year-old boy claims that he committed suicide as a direct result of being sexually blackmailed on social media.

According to the lawsuit, online scammers from South Africa coerced the teen into sending them explicit pictures. Once they had the pictures, they blackmailed him for $3,500 by threatening to release the pictures to everyone he knew. The incident eventually drove the teen to take his own life. The lawsuit was filed against Facebook (Meta), Instagram, and other social media companies.

The complaint asserts claims for strict liability, negligence, violation of unfair trade practices/consumer protection laws, fraudulent concealment and misrepresentation, negligent concealment and misrepresentation, violations of federal statutes related to sexual exploitation of minors, wrongful death, survival action, and loss of consortium and society.

March 26, 2024: California Judge Postpones Ruling In Anticipation Of Upcoming Supreme Court Decision 

On Friday, a federal judge in California announced her decision to postpone ruling on whether the claims brought by states in multidistrict litigation concerning the purportedly addictive design of social media platforms should be presented to a jury.

This delay is pending the U.S. Supreme Court’s forthcoming decision in SEC v. Jarkesy. Plaintiffs’ counsel contended that this case before the high court could have implications for the Seventh Amendment rights of tech companies.

March 21, 2024: California Judge Introduces Incentives To Expedite Depositions And Document Production 

A California magistrate judge introduced incentives to expedite depositions and document production in a large-scale lawsuit concerning the alleged addictive design of social media, countering defense arguments that these measures were unfairly biased.

During a hearing, Judge Peter H. Kang emphasized the need for efficiency in handling the discovery process, particularly highlighting the importance of timely identification and deposition of witnesses.

He set up a new deposition protocol offering additional deposition time as an incentive for early identification of witnesses, aiming to prevent last-minute filing of deposition notices.

Defense objections regarding the fairness of the incentive structure and the feasibility of producing documents within 60 days were dismissed by Judge Kang, who underscored his personal experience with document production to highlight its manageability. The judge also addressed concerns about protecting minors from lengthy depositions, suggesting the parties could negotiate time swaps to mitigate this issue.

February 21, 2024: Judge Dismisses Lawsuit Against Snap Inc. Based On Section 230 Law

A Connecticut judge dismissed a lawsuit against Snap Inc., the company behind Snapchat, involving a young girl identified as C.O. who was harmed by two sex offenders she met through the app. The judge’s decision was based on a law called Section 230, which protects companies like Snapchat from being held responsible for what users do or say on their platform. The girl’s parents sued Snapchat, saying it connected their daughter to the offenders and didn’t do enough to keep her safe. However, the judge said that under current laws, Snapchat can’t be blamed for the actions of its users.

The judge did not seem to love the result but explained that, without changes to the law, the court must follow the existing rules, leading to the dismissal of the case against Snapchat. This decision highlights the protection internet companies have under Section 230 from being held liable for user actions on their platforms.

February 13, 2024: New Lawsuit Filed On Behalf Of 16 Year-Old Who Attempted Suicide

A new lawsuit was filed last week on behalf of a 16-year-old who attempted suicide. The lawsuit, filed with her parents, alleges that her use of Instagram, Snapchat, and TikTok was a contributing factor.

My first draft of this update had the city where this family lives and more specific details. But we all have to be vigilant to help victims maintain their privacy.

February 12, 2024: First Bellwether Trial Scheduled For Late 2025

Judge Yvonne Gonzalez Rogers has scheduled the first bellwether trial in multidistrict litigation against Facebook and other social media platforms for late 2025. The judge outlined a detailed schedule leading up to the trial, with jury selection beginning on October 14, 2025.

The specific case to be tried first has not been chosen yet and will not be for some time. That decision will follow a three-step process: creating “bellwether discovery pools” for personal injury cases brought by parents and those filed by school districts, narrowing down the pool, and determining the order of the cases to be heard based on arguments in June 2025.

February 7, 2024: Defendants Request Judge Dismiss Claims Made By Schools And Local Governments Based On Section 230

Snap, Meta, and other social media companies have requested a California federal judge to dismiss claims made by schools and local governments that argue these platforms’ addictive features have caused harm to students, leading to increased costs for the institutions.

The companies argue that these claims are barred by the First Amendment and the Communications Decency Act, specifically citing Section 230, which they say protects them from being held liable for user experiences on their platforms. They also argue against the imposition of age verification and parental control policies, stating that previous court decisions support their stance against such requirements.

The problem with this argument is that these lawsuits allege these companies should be held accountable for their own actions, not those of third-party users. At their core, these lawsuits are an attack on the harmful product design of platforms like Snapchat, Facebook, TikTok, and Instagram rather than regulating the content or speech on these platforms, which is what Section 230 is all about.

Again, these cases focus on school lawsuits, not the personal injury and wrongful death lawsuits our lawyers are focused on. In any event, the bet is that Judge Gonzalez Rogers will shoot this motion down quickly.

January 30, 2024: Proposed Deadlines For Social Media Addiction MDL

Yesterday, we discussed how plaintiffs’ social media addiction attorneys have a very different view than the defendants on how fast this litigation should proceed. This chart underscores that difference:

Proposed deadlines (dates assume entry of the plan on February 15, 2024):

  • Close of Fact Discovery: Plaintiffs propose November 15, 2024 (provided no phasing); Defendants propose February 19, 2026
  • Close of Expert Discovery: Plaintiffs propose March 1, 2025 (provided no phasing); Defendants propose July 9, 2026 (20 weeks later)

January 29, 2024: Significant Difference In Views Regarding Path To Trial

The plaintiffs and defendants have very different views of how the social media addiction class action lawsuit should proceed to trial.

The plaintiffs want to negotiate a bellwether order, a procedure for selecting specific trials to represent the larger group of cases. This order will include details about which plaintiffs must provide more detailed information (beyond the basic Plaintiff Fact Sheet) during discovery. The plaintiffs’ social media addiction lawyers want a trial. The plaintiffs are prepared to submit a proposed bellwether order, along with a joint letter, if there are differing views or competing orders. This submission is scheduled to be completed by March 1, 2024.

The defendants acknowledge that they have started discussions on selecting bellwether trials for the personal injury and wrongful death victims. But they believe it’s too soon to set a deadline for submitting the bellwether order for the MDL Personal Injury Plaintiffs. They want to wait until after they’ve received the Plaintiff Fact Sheet data, which they see as a necessary step before making any selections.

Judge Gonzalez Rogers will decide the path. But there is no question she wants to move the cases forward quickly. In an MDL class lawsuit like the social media cases, “quick” is a relative term. The judge wants a trial in 2025. This is quick to mass tort lawyers who are familiar with the game. But it is not so quick for victims who want their settlement compensation or jury payout sooner rather than later.

November 2, 2023: Several Firms And Attorneys Seek Leadership Positions In Federal MDL

Several firms and attorneys are seeking leadership positions in federal multidistrict litigation (MDL) in California, aiming to represent school districts and local governments in cases against tech companies alleged to have harmed young people’s mental health. These legal professionals have submitted applications highlighting their experience and past roles in similar MDLs involving public health issues.

They argue for dedicated representation to address the unique legal challenges and remedies sought by educational and governmental entities, distinct from personal injury claims. The applicants stress the importance of a structured organizational framework to prosecute these cases efficiently, with some already representing major school districts and local governments poised to file or consider lawsuits within this MDL.

January 2, 2023: JPML Rules All Pending Cases Involving Allegations Of Teen Addiction To Social Media Will Be Consolidated Into Class Action MDL 

The JPML ruled last week in the social media harm lawsuits that all pending cases involving allegations of teen addiction to social media platforms will be consolidated into a class action MDL in the Northern District of California. About 80 of these cases are currently pending in federal courts. The lawsuits allege that social media platforms are harmfully addictive for teens, causing them to harm themselves. The new class action MDL will include social media addiction cases against all the various defendants, even though 70% of the cases are Instagram and Facebook addiction lawsuits against Meta.

October 13, 2022: Judicial Panel On Multidistrict Litigation Agrees To Centralize All Social Media Addiction Cases Into New MDL

With the number of social media addiction lawsuits being filed across the country rising, the Judicial Panel on Multidistrict Litigation agreed yesterday to centralize all cases into a new “class action” MDL. The new MDL (In re: Social Media Adolescent Addiction/Personal Injury Prod. Liab. Lit. – MDL No. 3047) has been assigned to the Northern District of California, where most defendants are headquartered.

The JPML Initial Transfer Order consolidates 28 cases pending in 17 different judicial districts. Centralizing a large group of related cases into an MDL is usually the first step toward a global settlement, and this is frequently done in mass tort product liability cases.

The social media addiction lawsuits involve new and unique legal claims that will likely sink or swim collectively based on whether the plaintiffs can present sufficient scientific evidence to avoid preemptive dismissal of the cases. The MDL class action will help that cause by allowing the plaintiffs to pool resources and present the best possible expert opinions.

Social Media Platforms Target Teens

Over 90% of all teens in the U.S. use social media platforms such as Facebook and Instagram. Studies estimate that the average teen spends around three hours daily engaging with social media platforms. Instagram is among the most popular social media platforms for young people and teenagers. Instagram recently reported that it has over 57 million users under 18.

Social media companies such as Meta Platforms (the parent company of Facebook and Instagram) have intentionally designed their products to maximize users’ screen time. They do this by employing complex algorithms designed to exploit human psychology. Meta and other social media companies constantly update and modify their products to promote excessive consumption.

Social media platforms have created user interfaces that intentionally display content that is often irresistible to young users. The “feed” features on most social media platforms continuously show young users an endless stream of content that the algorithm curates based on a data profile of the user’s interests.

Teens are Especially Vulnerable to the Dangers of Social Media Addiction

Scientific research shows that the human brain is still developing during adolescence. Teenage brains are not fully developed in regions related to risk evaluation, emotional control, and impulse control. The algorithms utilized by major social media platforms intentionally exploit the lack of fully developed impulse and emotional control in the brains of adolescent users.

When teens get “likes” on social media, their brains release dopamine, which causes a brief sense of euphoria. However, as soon as the dopamine is released, that euphoria is countered by dejection: minor users’ brains adapt by reducing, or “downregulating,” the number of dopamine receptors that are stimulated.

With normal forms of positive stimulation, the brain returns to neutral after a brief period. However, social media algorithms are designed to exploit users’ natural tendency to counteract this dip by returning to the source of pleasure for another dose of euphoria.

Eventually, as this pattern continues over months and years, the neurological baseline for triggering the teen users’ dopamine responses increases. Teens then continue to use Facebook and Instagram, not for enjoyment, but to feel normal. When teens attempt to stop using social media products, they experience the universal symptoms of withdrawal from any addictive substance, including anxiety, irritability, insomnia, and craving.

Addictive use of social media by minors is psychologically and neurologically analogous to internet gaming disorder. Gaming addiction is recognized as a mental disease by the World Health Organization and other public health agencies.

Social Media Companies Profit From Teen Advertising

Social media companies profit enormously from teen advertising, with revenues directly tied to how much time users spend on their platforms. The formula is simple. More users spending more time means more advertising dollars. As teens are a key demographic, companies like Meta (formerly Facebook), TikTok, and Snapchat have consistently developed features and algorithms specifically aimed at increasing engagement, often to the detriment of their users’ mental health.

To keep teens hooked, these platforms employ sophisticated technology and algorithms designed to maximize engagement. Notifications, personalized content feeds, and tailored advertising are all engineered to drive users back onto the app and keep them scrolling.

This approach creates a highly addictive experience, making it challenging for vulnerable teen users to disconnect and avoid constant online pressure and comparison. Studies increasingly link these design choices to serious impacts on teens’ self-esteem, sleep, and mental health, with prolonged use shown to exacerbate anxiety, depression, and body image issues.

A former Facebook employee, Frances Haugen, testified before Congress, giving us a much better understanding of how it all works. She said that executives were aware of the potential harm to teens but prioritized profits over safety. This is the theme of every single social media lawsuit in this litigation.

Haugen presented internal research showing that the platform not only failed to protect young users but actively designed features that worsened their mental health. This testimony underscored the need for accountability, as it highlighted the industry’s continued pattern of exploiting teens for revenue, all while downplaying or outright ignoring the risks of social media addiction and psychological harm.

How Social Media Addiction Can Harm Young People

A rapidly growing body of scientific research (including internal research by the social media companies themselves) has shown that addiction to social media can result in severe emotional harm and potentially physical harm for teens. This research, coupled with what we know about what these companies did to foster addiction, drives the social media lawsuits.

In 2018, a study published by the National Center for Biotechnology Information found a clear correlation between time spent on social media platforms and mental health issues, depression, and suicidal ideation among adolescents. The study also found that excessive use of social media correlated with an increase in self-harm behavior.

In 2021, the results of a long-term study by BYU on the impact of social media on teens found that teenage girls who used social media for 2-3 hours a day had a clinically higher risk for suicide. Research performed by the social media companies themselves confirmed the harm caused by these platforms. According to an article in the Wall Street Journal, Facebook performed internal research that found significant mental health issues related to Instagram use by teenage girls, including suicidal thoughts and eating disorders.

Lawsuits Against Social Media Companies for Harm Caused by Teen Addiction

Over the last year, a growing number of product liability lawsuits have been filed against Meta and other social media companies, seeking to hold them liable for injuries resulting from teen addictions to social media platforms. These lawsuits are being brought by teens and/or their parents. Nearly all of the lawsuits filed so far have been brought against Meta for addictions to its Instagram or Facebook platforms.

The social media harm lawsuits allege that platforms such as Facebook and Instagram were designed in a way that made them addictive and, therefore, unreasonably dangerous to adolescent users. The lawsuits also claim that the social media companies negligently failed to warn minor users and their parents about the risks of addiction and resulting harm.

The plaintiffs in these social media addiction lawsuits are seeking damages for severe physical and emotional injuries allegedly suffered as a result of teens becoming addicted to Instagram and Facebook. Injuries alleged in the lawsuits include self-mutilation, self-harm, severe eating disorders, and death from suicide.

Social Media Grooming Lawsuits

Social media grooming lawsuits against platforms like Facebook, Instagram, Snapchat, TikTok, and others involve legal actions taken by victims or their families against these companies. There are also lawsuits against Roblox, which is not a regular player in social media lawsuits. The core of these claims, which usually involve sex abuse or exploitation, is that social media platforms failed to protect minors from being groomed by predators.

The basis of these lawsuits often centers around several key legal arguments that plaintiffs’ social media lawyers believe are very strong.

First, plaintiffs argue that social media platforms were negligent in failing to implement adequate safeguards to protect minors from online grooming. This includes insufficient moderation of content, lack of effective reporting mechanisms, and inadequate measures to verify the age of users. By not taking these precautions because it would decrease their already excessive profits, the platforms are accused of allowing predators to exploit their services to contact and groom children.

Additionally, some lawsuits frame the issue as a product liability claim, asserting that the design of the platform is inherently dangerous or defective because it enables predators to contact and groom children. This perspective treats the platform itself as a product that should be reasonably safe for its intended use, which includes safeguarding vulnerable users from harm.

Furthermore, platforms may be accused of breaching their duty to provide a safe environment for their users. This duty is especially important for minors, who are particularly vulnerable to online predators. By failing to ensure a safe online space, the platforms woefully fail in their responsibility to their users and to the general public.

Blocking Parents from Monitoring Their Children

A key argument in many of the lawsuits against social media companies is their failure to provide adequate tools for parents to monitor and control their children’s use of platforms.  Plaintiffs’ social media addiction lawyers argue that social media companies have not only designed their products to be addictive but have also neglected to implement safeguards that would allow parents to intervene effectively.

While social media platforms offer features like screen time limits and content filters, these tools are often insufficient and difficult to navigate. As a result, parents are left with limited ability to protect their children from the harmful effects of social media addiction, including exposure to harmful content, online grooming, and cyberbullying. The failure to provide meaningful parental controls is seen as a form of negligence by these companies, especially given their knowledge of the risks posed to young users.

In the lawsuits, plaintiffs argue that social media companies are not only aware of these deficiencies but have deliberately chosen not to address them, prioritizing profit and user engagement over the safety of teen users. This lack of accountability, combined with the documented harm caused to minors, forms a central argument in the claims against platforms like Meta and Snap.

Class Action MDL Sought for Instagram Addiction Lawsuits

In August 2022, one of the plaintiffs bringing an Instagram addiction harm lawsuit against Meta filed a motion with the Judicial Panel on Multidistrict Litigation (JPML) requesting that all other Instagram addiction harm lawsuits in federal courts be consolidated into a new class action MDL. The motion identified 28 similar lawsuits pending in various federal district courts alleging harm resulting from teens becoming addicted to Instagram and Facebook.

These cases have been consolidated in a California social media class action lawsuit that houses all federal claims.

Social Media Companies May Claim Legal Immunity

Under 47 U.S.C. §230, website platforms enjoy immunity from liability for content posted by third parties. These companies have hidden behind the §230 skirt for years to avoid responsibility.
This law was originally designed to protect online platforms from being held responsible for the vast amount of user-generated content they host, allowing them to moderate and manage content without the fear of facing legal consequences.
In recent years, §230 has become a hotbed of political debate, with critics from across the spectrum arguing that social media companies have either not done enough to remove “harmful” material or “disinformation,” or that they have overstepped in censoring certain viewpoints. These concerns have led to heightened scrutiny, and the scope of immunity under §230 is currently the subject of an appeal before the U.S. Supreme Court.

In the context of social media addiction lawsuits, however, this broad immunity may not apply. While §230 shields platforms like Facebook from liability for harm caused by user-generated content, it does not protect them from claims involving the design of their own technology, such as their algorithms. That is the key to these lawsuits and why our social media addiction lawyers believe we will survive the onslaught of efforts to dismiss them.

These algorithms, which are designed to maximize user engagement and retention, are at the heart of the social media addiction lawsuits. Plaintiffs allege that it is not the content posted by other users, but the addictive nature of the platforms themselves—shaped by the companies’ own algorithms—that is causing harm. As such, the liability being pursued in these cases targets the companies’ own actions and designs, rather than the actions of third-party users, making §230 inapplicable.

Will Social Media Addiction Lawsuit Be Successful?

The lawsuits seeking to hold social media companies liable for harm caused by addiction to their platforms are clearly unique and, in many ways, unprecedented. None of these cases have gone to trial or settled. These cases involve somewhat novel tort claims, and the plaintiffs will face a tough uphill battle on many fronts.

One of the most significant pitfalls for these social media addiction lawsuits will be proving causation. First, the plaintiffs will need to scientifically prove that there was a physical addiction to the platform. Studies on this have been conducted, but it’s not certain whether these would meet the level of academic scrutiny necessary to be admissible in court.

Even if the plaintiffs can prove that there was an actual addiction to a social media platform, they will also need to prove that the addiction was “more likely than not” the cause of the physical harm suffered by the teen. In a suicide case, this might be very difficult because social media addiction will likely be just one of several contributing causes.

Social Media Addiction Lawsuit Settlement Amounts

As social media addiction lawsuits against companies like Meta (Facebook and Instagram) and Snap Inc. (Snapchat) move forward, predicting settlement amounts is still speculative, as none of these cases have reached trial or settlement yet.

Still, if the suits proceed as well as most plaintiffs’ lawyers believe they will, our lawyers can speculate about potential settlement amounts these social media companies will ultimately pay based on the nature of the allegations and historical precedents from other major liability cases. Keep in mind, these are just projections and they assume that the plaintiffs will successfully overcome key legal challenges, including proving causation and harm.

High-Value Settlements: Teen Suicide Cases

The highest potential settlements are expected in cases involving teen suicides linked to social media addiction. Wrongful death claims, particularly those involving teenagers, typically result in higher settlement values due to the significant emotional and financial losses. If these cases go as well as most plaintiffs’ lawyers hope, the average suicide settlement amounts could range from $900,000 to $3 million per case.  As for trial, it would not be surprising to see compensation verdicts in the tens of millions even before we get to punitive damages.

Factors driving these higher settlement values include:

  1. Emotional Impact: These are incredibly tragic cases. The profound emotional toll on families who lose a child, especially under circumstances involving suicide, could lead to substantial awards.
  2. Corporate Responsibility: If plaintiffs successfully demonstrate that companies like Meta were aware of the dangers posed to teens and chose not to act, punitive damages could significantly increase the risk these companies face at trial, and that risk will drive up settlement amounts.
  3. Public Pressure: The growing societal concern about the mental health impacts of social media on youth may push companies to settle quickly to avoid public backlash. The politics and the optics in this litigation definitely lean the plaintiffs’ way.

Moderate Settlements: Severe Injuries and Mental Health Disorders

For cases involving severe but non-fatal injuries, such as self-harm, eating disorders, or long-term mental health issues, settlements are likely to be lower than wrongful death cases but could still be substantial. If these cases proceed favorably for plaintiffs, settlements could be in the range of $300,000 to $900,000, depending on the severity and duration of the harm.

  • Eating Disorders: Lawsuits involving severe eating disorders (such as anorexia or bulimia) that are linked to social media use may result in settlement payouts between $300,000 and $900,000 if plaintiffs can show ongoing physical and psychological harm.
  • Self-Harm: Claims involving self-mutilation or severe self-harm could see similar settlement ranges, particularly if the cases involve long-term emotional trauma and visible scars.
  • Mental Health Disorders: Cases of severe anxiety, depression, or related mental health issues would likely fall within a slightly lower range, around $150,000 to $450,000, unless plaintiffs can prove the harm was directly tied to the addictive nature of social media.

Low-Value Settlements: Mild to Moderate Injuries

For cases involving milder injuries, such as temporary mental health issues or emotional distress, settlements will likely be lower. However, if the plaintiffs’ lawyers succeed in establishing a connection between social media use and these issues, settlements may range from $30,000 to $150,000, especially if there is no ongoing treatment required.

What Are the Major Injuries in a Social Media Lawsuit?

Let’s look at the types of injuries that the social media lawsuits involve:

  • Addiction/Compulsive Use: Social media platforms are designed to be engaging and can lead to compulsive use or addiction. This is often due to the platforms’ use of algorithms that encourage prolonged engagement, potentially leading to neglect of other activities and responsibilities.
  • Eating Disorders:
    • Anorexia: Exposure to content promoting unrealistic body standards can contribute to anorexia, where individuals develop an obsessive fear of gaining weight and severely limit their food intake.
    • Bulimia: Similarly, exposure to certain content on social media can contribute to bulimia, characterized by periods of overeating followed by purging due to body image issues.
    • Binge Eating: Social media can also influence binge eating behaviors, where individuals consume large amounts of food quickly, often triggered by stress or emotional content online.
    • Other Eating Disorders: Various other eating disorders can be influenced by the content and interactions on social media, including exposure to diet culture and body shaming.
  • Depression: Constant comparison with others, cyberbullying, and exposure to harmful or distressing content on social media can contribute to feelings of sadness, hopelessness, and depression.
  • Anxiety: Overuse of social media can lead to heightened anxiety stemming from social comparison, fear of missing out (FOMO), and exposure to anxiety-inducing content.
  • Self-Harm:
    • Suicidality: Exposure to content related to suicide or being part of online communities that discuss self-harm can increase suicidal thoughts, particularly in vulnerable individuals. This and sex abuse are the worst injuries our social media attorneys are seeing in these lawsuits, and surviving family members may well be eligible to bring a claim. We discuss this more just below.
    • Attempted Suicide: There can be an influence on suicide attempts due to the glorification or normalization of such actions in certain social media circles.
    • Death by Suicide: Tragically, social media can play a role in some individuals’ decisions to take their own lives, especially when they are exposed to suicidal content or cyberbullying.
    • Other Self-Harm: This includes various forms of self-injurious behavior, which might be influenced by content seen on social media or as a coping mechanism for the distress experienced due to online interactions.
  • Child Sex Abuse: The presence of predators on social media and the sharing of explicit content can lead to cases of child sexual abuse.
  • CSAM Violations (Child Sexual Abuse Material): The distribution and access to illegal child sexual abuse material can be facilitated through specific platforms, contributing to this serious issue.
  • Other Physical Injuries: This can include a range of physical injuries that may be indirectly caused by social media use, such as accidents occurring while distracted by social media (we add this for completeness – our lawyers are not handling distraction claims).

Social Media Suicide Lawsuits

Social media can potentially contribute to suicidal behavior in several ways. Here are some ways in which social media might play a role:

  1. Cyberbullying and Harassment: Persistent online bullying or harassment can lead to feelings of isolation, depression, and hopelessness, particularly among adolescents and young adults, which can contribute to suicidal thoughts and behaviors. What do social media companies do to stop this? Not much.
  2. Exposure to Suicidal Content: Exposure to posts, images, or videos discussing or glorifying suicide can influence vulnerable individuals, particularly if they are already struggling with mental health issues.
  3. Social Comparison and Low Self-esteem: Constant exposure to idealized representations of others’ lives can lead to negative self-comparison, decreased self-esteem, and feelings of inadequacy, which can contribute to depressive symptoms and suicidal thoughts.
  4. Isolation and Lack of Real Connection: Excessive time spent on social media can lead to decreased face-to-face interactions and a sense of social isolation, which is a known risk factor for depression and suicide.
  5. Online Echo Chambers: Being part of online communities that promote harmful behaviors, including self-harm or suicidal ideation, can normalize these behaviors and make them seem like viable options to vulnerable individuals.
  6. Sleep Disruption: Overuse of social media, especially before bedtime, can disrupt sleep patterns. Poor sleep is linked to a variety of mental health issues, including an increased risk of depression and suicidal thoughts.
  7. Triggering Content: Users may come across triggering content related to trauma, self-harm, or other sensitive topics that can exacerbate underlying mental health issues.

Who Are the Social Media Defendants?

Here are the key social media defendants:

Meta Entities

  1. Meta Platforms, Inc., formerly known as Facebook, Inc.
  2. Instagram, LLC
  3. Facebook Payments, Inc.
  4. Siculus, Inc.
  5. Facebook Operations, LLC

TikTok Entities

  1. ByteDance Ltd.
  2. ByteDance Inc.
  3. TikTok Ltd.
  4. TikTok LLC
  5. TikTok Inc.

Snap Entity

  1. Snap Inc.

Google Entities

  1. Google LLC
  2. YouTube, LLC

Lawsuits against Facebook/Meta

Meta, the company responsible for operating and designing Facebook and Instagram, two of the world’s most popular social media platforms, is probably the number one defendant in this litigation. In 2022, the staggering user statistics revealed that Instagram had two billion active monthly users worldwide, while Facebook boasted almost three billion monthly active users. While these numbers showcase Meta’s enormous reach, they also underscore the extensive damage suffered by plaintiffs and other adolescent users of these platforms.

Meta Knew What It Was Doing

Meta’s own internal documents reveal that it was well aware of the negative consequences Facebook and Instagram were having on children. It acknowledged that children under 13 were using Instagram despite being prohibited from doing so. Furthermore, Meta recognized that its platforms – Facebook and, especially, Instagram – were addictive for kids, with teenagers feeling pressured to engage with them constantly. The company admitted that its existing tools did not limit users’ screen time.

Addictive use of Instagram was acknowledged to lead to significant problems. Still, the company is accused of what so many big lawsuits are about – it put growth and profits over the well-being of children. Despite being warned about issues related to problematic use, bullying, harassment, and the impact on young users, Meta failed to take substantial action. Instead, it maintained the status quo, citing concerns that addressing these problems might negatively affect the company’s growth.

Pretending to Try to Solve the Problem

Meta allegedly responded to the problem with mere public relations gestures, and internal documents suggest these efforts were not taken seriously even within the company. It provided tools such as “time spent” features to parents and kids, even though it knew that the data presented by these tools was inaccurate.

Additionally, Meta is accused of engaging in a campaign to discredit research about the addictive nature of its products, calling it “completely made up.” In contrast, its internal research highlighted the unique dangers posed to young users.

Despite this knowledge, Meta’s failure to protect child users of Instagram and Facebook has drawn significant attention. Instead of addressing the problems caused by its platforms, the company cut funding for its mental health team and deprioritized addiction-related issues. These actions have raised concerns about the company’s commitment to user safety.

How They Did It

Meta’s platforms deploy recommendation algorithms fueled by extensive data collection. These algorithms are accused of promoting usage patterns and frequencies harmful to adolescents. Features that exploit children’s need for validation and encourage harmful loops of repetitive usage are among the key concerns raised.

Furthermore, the plaintiffs argue that despite having the capability, Meta failed to implement effective mechanisms to limit children’s use of these products. The lack of adequate parental controls and facilitation of unsupervised use are also significant issues.

These platforms are meticulously crafted to increase user engagement. Every detail, from icon colors to notification timings, is strategically designed. This extensive engineered approach aims to maximize the length and frequency of user sessions.

The Algorithm Is a Problem

One of the core issues in social media addiction lawsuits is the way algorithms on platforms like Facebook and Instagram manipulate users’ behavior, particularly young people. Remember when Facebook’s news feed was chronological, showing posts in the order they were shared? In 2009, Facebook switched to an engagement-based ranking algorithm, and Instagram followed suit in 2016, replacing its chronological feed with a similar engagement-driven model.

The goal of these algorithmic changes was to increase user engagement—keeping people on the platform longer by showing them content that would provoke the most reactions, whether through likes, shares, or comments. This worked for Meta—its profits soared. But it is now a central point of contention in lawsuits focusing on social media addiction.
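For readers who want a concrete picture of what that shift means, below is a deliberately simplified sketch in Python. The post fields, scoring weights, and function names are our own hypothetical illustration of the general idea, not Meta’s actual code or a claim about how its systems are implemented.

```python
from datetime import datetime

# Hypothetical, simplified post records -- illustrative only, not real platform data.
posts = [
    {"id": 1, "posted": datetime(2024, 1, 3), "likes": 12,   "shares": 1,   "comments": 2},
    {"id": 2, "posted": datetime(2024, 1, 1), "likes": 4800, "shares": 950, "comments": 1200},
    {"id": 3, "posted": datetime(2024, 1, 2), "likes": 40,   "shares": 3,   "comments": 9},
]

def chronological_feed(posts):
    # Pre-2009-style ordering: newest posts first, regardless of reactions.
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

def engagement_ranked_feed(posts):
    # Engagement-based ordering: posts that provoke the most reactions rise to the top.
    # These weights are arbitrary placeholders chosen only to show the idea.
    def score(p):
        return p["likes"] + 5 * p["comments"] + 10 * p["shares"]
    return sorted(posts, key=score, reverse=True)

print([p["id"] for p in chronological_feed(posts)])      # -> [1, 3, 2] (newest first)
print([p["id"] for p in engagement_ranked_feed(posts)])  # -> [2, 3, 1] (most reactions first)
```

The only point the sketch makes is the one plaintiffs make: once ordering depends on reactions rather than recency, the most provocative posts tend to rise to the top.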

In 2018, Meta introduced the “meaningful social interaction” (MSI) algorithm, which it claimed would promote more meaningful connections by prioritizing content from friends and family. Meta argued that this change would reduce passive consumption, such as mindlessly scrolling through content without engaging. However, plaintiffs’ social media addiction lawyers have aggressively challenged this claim, arguing that the MSI algorithm does far more harm than good.

Plaintiffs’ lawyers argue that instead of fostering meaningful connections, the MSI algorithm amplifies negative interactions. Internal research from Meta and independent studies have revealed that the algorithm tends to prioritize emotionally charged content, often amplifying posts that provoke anger, fear, or sadness. This is a bad idea for adults and an awful way to manipulate children. Our lawsuits assert that Meta, in its quest to drive user engagement, knowingly exposed vulnerable young users to harmful content, including cyberbullying, unrealistic body image portrayals, and self-harm.

On Instagram, in particular, the algorithm’s effect has been devastating, especially for teenagers. Instagram’s visual nature, combined with the platform’s algorithm, often pushes content that features unrealistic body standards or sensationalized trends.

Plaintiffs argue that this content leads to damaging comparisons, fostering feelings of inadequacy, depression, and anxiety—particularly in young girls. On one level, that is just life, right? Plaintiffs’ lawyers get that. But it is the manipulation of children through Instagram’s Explore feature, which curates content based on user interactions, that intentionally reinforces these harmful behaviors.

What you get from these algorithms is a dangerous feedback loop. Users who engage with certain types of content—whether related to fitness, body image, or mental health—are served more of the same content, often in more extreme forms. This algorithmic echo chamber traps vulnerable users in cycles of negative thought patterns, exacerbating issues like eating disorders, depression, and social isolation.
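To illustrate that feedback loop in the simplest possible terms, here is another hypothetical sketch – again, our own simplification, not any platform’s actual recommender. Each time a user engages with a topic, the system boosts that topic’s weight, so later recommendations skew further and further toward it.

```python
import random

# Hypothetical topic weights for a single user -- illustrative only, not a real recommender.
weights = {"sports": 1.0, "music": 1.0, "extreme_dieting": 1.0}

def recommend(weights):
    # Pick the next recommendation in proportion to each topic's current weight.
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def record_engagement(weights, topic):
    # Engagement feeds back into the weights, so the same topic gets shown more often.
    weights[topic] *= 1.5

# A vulnerable user engages with dieting content a handful of times...
for _ in range(5):
    record_engagement(weights, "extreme_dieting")

# ...and the feed now overwhelmingly serves more of the same.
sample = [recommend(weights) for _ in range(1000)]
print(round(sample.count("extreme_dieting") / len(sample), 2))  # roughly 0.79
```

In this toy version, a few clicks are enough to push roughly four out of five recommendations toward the same topic, which is the basic dynamic plaintiffs describe, only without the scale and sophistication of a real platform.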

Meta argues that its algorithms are designed to improve user experience, but the social media addiction lawsuits present a different reality. They allege that, by prioritizing engagement metrics like likes, shares, and comments, these algorithms actively harm users, particularly teenagers, by pushing emotionally charged and sometimes dangerous content. Rather than offering balanced or supportive content, the platforms focus on material that keeps users hooked—resulting in addiction-like behaviors and worsening mental health.

Contact Us About a Social Media Addiction Lawsuit

If you or your child has suffered severe physical or emotional harm as a result of addiction to social media, contact our lawyers today at 800-553-8082 or get a free online consultation.
