By Clay Calvert
When a California jury last month awarded $6 million in a social media addiction trial to a 20-year-old plaintiff—identified by her initials, K. G. M.—one word dominated media outlets’ description of the verdict: “landmark.” It was the first time a jury penalized a social media company—here, both Meta and Google—for ostensibly injuring a minor this way. While articles from the Los Angeles Times to the New York Post emphasized how this outcome could be a harbinger for a deluge of future rulings against tech companies, the reality is more complex.
Not all the hundreds of other personal injury cases filed by or on behalf of minors against social media companies will necessarily go the same way. In fact, reasons exist for Meta and Google to be cautiously optimistic that things may turn out differently on the long road of litigation that lies ahead in P. F. (K. G. M.) v. Meta Platforms—the companies are appealing the decision—and the other lawsuits. To grasp why the other cases may not easily fall like dominoes in the plaintiffs’ direction, it is first essential to understand how the “landmark” designation for this case is still apropos and why the outcome upsets free speech advocates.
K. G. M.’s case marked the first time a US jury found social media companies liable for supposedly addicting a minor—through defective “design features,” not via content consumed—to their platforms and causing her numerous mental health harms. The platform design features at issue included structural elements and functions such as engagement-maximizing algorithms, push notifications, infinitely scrolling content on Instagram, and video “autoplay” on YouTube. K. G. M. blamed these features for her emotional distress, depression, anxiety, self-harm, and body dysmorphia.
The 12-person jury voted 10 to 2 in K. G. M.’s favor, holding both defendants liable under two negligence theories—negligently designing or operating their platforms and negligently failing to adequately warn users that their platforms would likely “be dangerous when used by a minor in a reasonably foreseeable manner.” Those theories treat platforms as if they were material products such as cars and coffee makers, not as speech services that convey First Amendment–protected content and allow minors to learn information, engage with others, and express themselves.
The design-defect treatment of platforms is highly problematic because, as attorney Ari Cohn of the Foundation for Individual Rights and Expression explained, “The minute we start treating speech as if it were just another physical product is the minute we hand the government the power to decide what we can read, watch, and say.” In that vein, I’ve contended that holding platforms culpable for alleged behavioral addictions is clearly distinct from holding tobacco companies liable for substance addictions. When litigation targets social media companies over addiction, expressive constitutional rights come into play. Suing tobacco companies, however, does not raise First Amendment concerns, because cigarettes don’t convey expression.
Additionally, and unlike cigarettes, social media platforms offer minors multiple benefits, including allowing them to make friends and social connections, letting them realize and express their identities and beliefs, and providing an interactive forum for learning important information and participating in society. As Jacob Mchangama and Jeff Kosseff wrote about the flawed tobacco-litigation analogy, “The scientific consensus on smoking’s harms is unanimous and no one claims smoking has benefits. Neither is true for social media.” The benefits to minors from social media, however, weren’t at issue in K. G. M.’s case; only the averred downsides were.
In delivering its verdict for the plaintiff, the jury concluded that the defendants’ purported negligence in designing or operating their platforms and their failure to warn minors of supposed dangers were “substantial factor[s] in causing harm to K. G. M.” (Emphasis added.) The phrase “substantial factor” has legal meaning. It describes “a factor that a reasonable person would consider to have contributed to the harm” and that is “more than a remote or trivial factor.” It doesn’t, however, need “to be the only cause of the harm.” Importantly—and in pro-plaintiff fashion—the jury was instructed that “a defendant cannot avoid responsibility just because some other person, condition, or event was also a substantial factor in causing KGM’s harm.”
The jury awarded K. G. M. $3 million in compensatory damages and $3 million in punitive damages. It pinned 70 percent of the responsibility on Meta and the rest on Google, which had argued, among other things, that it’s not a social media platform but rather a video delivery service. The punitive damages award meant that the jury decided “K. G. M. [had] proved by clear and convincing evidence that [the defendants] acted with malice, oppression, or fraud.”
Read the full essay here.