by WorldTribune Staff, March 26, 2026
A landmark decision against Meta/Facebook and Google this week has set the stage for thousands of lawsuits against the Big Tech titans over addiction to their apps.
The longstanding legal liability shield, which allowed tech companies to flourish, was finally breached when a Los Angeles jury on March 25 found Google and Meta liable for $6 million in damages.
The case pitted a 20-year-old plaintiff identified as “Kaley G.M.” or “K.G.M.” against the two tech giants over whether the companies could be held accountable for psychological harms she suffered as a result of addiction to their apps, YouTube and Instagram.

In another landmark case, a jury in New Mexico on Tuesday found that Meta failed to protect kids from sexual predators and misled users about its platforms’ safety, violating state law.
The tech giant was ordered to pay $375 million in civil penalties.
The decisions dealt a blow to the gigantic companies that have historically been shielded by Section 230 of the Communications Decency Act.
The plaintiff testified that she began using YouTube at age 6 and Instagram at age 9, and soon developed anxiety, depression, body dysmorphia, and other disorders. At times, she said she spent up to 16 hours per day on the apps and kept returning to them even after what she described as incessant bullying.
According to 2024 Pew Research Center data, 9 out of 10 U.S. teens use YouTube; 73 percent say they use it daily. Worldwide, YouTube is the most-watched app, followed closely by Instagram, with both estimated to have between 2 billion and 3 billion users.
Snapchat and TikTok were also defendants in the original lawsuit, but both settled before the trial began on Feb. 9.
Of the $3 million in compensatory damages, Meta was ordered to pay 70 percent and Google 30 percent. The jury awarded another $3 million in punitive damages: $2.1 million from Meta and $900,000 from Google.
Responding to the verdict, a Meta spokesperson said, “We respectfully disagree with the verdict and are evaluating our legal options.” Google did not have an immediate comment.
Speaking to reporters after the trial in the Los Angeles Superior Court, two jurors described the process of reviewing all the evidence, “in every direction,” and carefully considering the law before finally arriving at a consensus.
“We looked at everything Kaley went through, and what was the process these platforms had in place that was going to prevent these harms,” a juror who asked to be identified by her first name, Victoria, said.
“We wanted them to feel it. We wanted them to realize this is not acceptable,” she said of the companies, adding that some jurors were hesitant to award more substantial punitive damages because the award would be paid out to the plaintiff in a lump sum rather than placed in a trust.
Judge Carolyn Kuhl instructed jurors to consider each company’s net worth to be equal to its 2025 total stockholders equity: $217 billion for Meta and $415 billion for Alphabet, Google’s parent company.
The case narrowly focused on design and operation—features such as “infinite scroll,” beauty filters, and the companies’ proprietary algorithms—rather than third-party content hosted on the platforms, which is broadly protected by the First Amendment and Section 230 of the 1996 Communications Decency Act.
Pointing to internal documents unsealed during the trial, Meta whistleblowers testified that the company was well aware of the harms to young users but chose to ignore them.
Plaintiff’s attorney Mark Lanier argued that the tech giants preyed upon vulnerable teens in pursuit of money and power.
“They never go after the strongest, never go after the boldest,” Lanier told the jury. “They find the one that’s weaker, that’s more vulnerable, and that’s the one they get.”
Company executives, including Meta CEO Mark Zuckerberg, denied designing addictive apps or targeting minors, and questioned whether social media addiction exists.
“To me, the North Star is making sure we’re delivering value and people are having a positive experience, and if that happens, I think people will naturally spend more time on the platform,” Zuckerberg told the jury.
Defendants’ attorneys argued that their platforms were a creative outlet for an already troubled child who came of age during the heightened stress of the COVID-19 pandemic. They said the companies had taken an overcautious approach, introducing protective features even when evidence of risks was inconclusive.
K.G.M.’s troubles, they argued, started long before she began using social media.
“Their case depends on asking you to find that if you focus only on Instagram, somehow her life would be meaningfully different,” Paul Schmidt, an attorney for Meta, told the jury in his closing argument. “The evidence did not show that. It showed just the opposite.”
Google attorney Luis Li argued that YouTube is not really social media but rather a streaming app akin to Netflix, which people mostly watch on their televisions.
Grieving parents—who lost children to suicide or fatal “viral challenges” they say were caused by the defendants’ app features—were a constant presence throughout the trial, The Epoch Times reported.
“All the parents, we’re so happy, we feel our kids are smiling down on us. And we feel we have a little bit of power back sitting in that room listening to the jury, and the system worked, and it rendered a just verdict,” Victoria Hinks told The Epoch Times.
Victoria and her husband Paul Hinks lost their 16-year-old daughter, Alexandra, to suicide in 2024 after a brief struggle with mental illness they say was driven by social media.
“I’m glad the jury agreed with the plaintiff that this was an unsafe product, they made it addictive to children, they knew what they were doing, and every parent in there tried their best to keep their kids safe,” Paul Hinks said. “The product is broken. You can’t blame the parents. … It’s not our job to keep their customers safe on their product.”
The high-stakes trial unfolded during a critical moment, when governments around the world are ramping up online safety laws or imposing blanket social media bans for younger users, and parents are calling for regulatory reform.
The New Mexico verdict came after six weeks of testimony from witnesses that included ex-Meta executives, teachers and online safety experts.
New Mexico prosecutors argued that Meta hid the extent of safety issues that kids faced on Facebook and Instagram and failed to enforce its claimed minimum age limit of 13—even as its algorithms allegedly made it easier for creeps to target kids for online harassment and even sex trafficking.
“The safety issues that you’ve heard about in this case weren’t mistakes,” state attorney Linda Singer told the jury on Monday.
“They were a product of a corporate philosophy that chose growth and engagement over children’s safety. And young people in this state and around the country have borne the cost,” she added.