{"id":15066,"date":"2026-04-08T13:31:31","date_gmt":"2026-04-08T18:31:31","guid":{"rendered":"https:\/\/sites.law.duq.edu\/juris\/?p=15066"},"modified":"2026-04-08T13:31:32","modified_gmt":"2026-04-08T18:31:32","slug":"when-likes-turned-to-liability","status":"publish","type":"post","link":"https:\/\/sites.law.duq.edu\/juris\/2026\/04\/08\/when-likes-turned-to-liability\/","title":{"rendered":"When Likes Turned to Liability"},"content":{"rendered":"\n<p>By Kaushik Srinath, Staff Writer<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><a href=\"https:\/\/sites.law.duq.edu\/juris\/wp-content\/uploads\/2026\/04\/SK-feature-.jpg\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"512\" src=\"https:\/\/sites.law.duq.edu\/juris\/wp-content\/uploads\/2026\/04\/SK-feature-.jpg\" alt=\"\" class=\"wp-image-15067\" srcset=\"https:\/\/sites.law.duq.edu\/juris\/wp-content\/uploads\/2026\/04\/SK-feature-.jpg 768w, https:\/\/sites.law.duq.edu\/juris\/wp-content\/uploads\/2026\/04\/SK-feature--300x200.jpg 300w, https:\/\/sites.law.duq.edu\/juris\/wp-content\/uploads\/2026\/04\/SK-feature--580x387.jpg 580w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/a><figcaption class=\"wp-element-caption\"><em>Photo Courtesy of Pixabay.com<\/em><\/figcaption><\/figure>\n<\/div>\n\n\n<p>Attempts to hold social media companies accountable for user harm have been largely unsuccessful due to the protections of Section 230 of the Communications Decency Act.<a href=\"#_ftn1\" id=\"_ftnref1\">[1]<\/a> Enacted in 1996, Section 230 provides that online platforms are not treated as the \u201cpublisher or speaker\u201d of content created by third parties.<a href=\"#_ftn2\" id=\"_ftnref2\">[2]<\/a> Courts have interpreted this provision broadly, allowing companies such as Meta and Google to avoid liability for harmful user-generated content. 
As a result, plaintiffs alleging harms such as cyberbullying, eating disorders, or self-harm linked to social media have consistently faced dismissal.<a href=\"#_ftn3\" id=\"_ftnref3\">[3]<\/a> However, a recent case, <em>K.G.M. v. Meta Platforms, Inc.<\/em>, represents a major shift in litigation strategy.<a href=\"#_ftn4\" id=\"_ftnref4\">[4]<\/a> By focusing on platform design rather than user content, plaintiffs were able to bypass Section 230 and achieve a more favorable outcome, signaling a potential turning point in tech liability.<a href=\"#_ftn5\" id=\"_ftnref5\">[5]<\/a><\/p>\n\n\n\n<p>Historically, most lawsuits against social media companies have centered on harmful content posted by users.<a href=\"#_ftn6\" id=\"_ftnref6\">[6]<\/a> Plaintiffs often argued that platforms failed to remove dangerous material or negligently allowed harmful communities to flourish.<a href=\"#_ftn7\" id=\"_ftnref7\">[7]<\/a> These claims almost always failed: courts treated platforms as publishers of third-party speech and repeatedly dismissed such cases, emphasizing that imposing liability would undermine Congress\u2019s goal of promoting free expression and innovation online. 
Even when plaintiffs presented compelling evidence of harm, courts found that the causal link between the platform and the injury depended on user-generated content, triggering Section 230 immunity.<a href=\"#_ftn8\" id=\"_ftnref8\">[8]<\/a><\/p>\n\n\n\n<p>This legal barrier proved difficult to overcome because the prevailing litigation approach framed the issue too narrowly.<a href=\"#_ftn9\" id=\"_ftnref9\">[9]<\/a> By focusing on content moderation, plaintiffs implicitly accepted the premise that platforms were passive intermediaries.<a href=\"#_ftn10\" id=\"_ftnref10\">[10]<\/a> Under this framework, companies were hosting speech, not creating or contributing to harm.<a href=\"#_ftn11\" id=\"_ftnref11\">[11]<\/a> As a result, even egregious cases involving vulnerable users failed to survive early procedural stages.<a href=\"#_ftn12\" id=\"_ftnref12\">[12]<\/a> Section 230 effectively foreclosed an entire category of claims, leaving plaintiffs without meaningful recourse.<a href=\"#_ftn13\" id=\"_ftnref13\">[13]<\/a><\/p>\n\n\n\n<p>In <em>K.G.M. v. 
Meta Platforms, Inc.<\/em>, the plaintiffs adopted a fundamentally different approach.<a href=\"#_ftn14\" id=\"_ftnref14\">[14]<\/a> Rather than targeting the content itself, they challenged the design of the platforms.<a href=\"#_ftn15\" id=\"_ftnref15\">[15]<\/a> Specifically, they argued that features such as infinite scroll, autoplay videos, algorithmic recommendations, and persistent notifications were intentionally engineered to maximize user engagement in ways that could be addictive.<a href=\"#_ftn16\" id=\"_ftnref16\">[16]<\/a> By focusing on product design, the plaintiffs positioned their claims within the realm of traditional tort and product liability law, rather than defamation or publisher liability.<a href=\"#_ftn17\" id=\"_ftnref17\">[17]<\/a><\/p>\n\n\n\n<p>Critically, evidence presented in the case suggested that these design features were not incidental but deliberate.<a href=\"#_ftn18\" id=\"_ftnref18\">[18]<\/a> Internal documents and testimony indicated that engineers at Meta and Google understood how these features could exploit psychological vulnerabilities, especially among young users.<a href=\"#_ftn19\" id=\"_ftnref19\">[19]<\/a> To learn more about social media addiction, stay tuned for our upcoming feature, <em>Hooked: The Impact of Social Media on Young Minds <\/em>by Delaney Szekely.<\/p>\n\n\n\n<p>By framing the issue as one of defective or dangerous product design, the plaintiffs avoided Section 230 immunity.<a href=\"#_ftn20\" id=\"_ftnref20\">[20]<\/a> The key distinction was that liability did not depend on what users posted, but on how the platforms themselves were built.<a href=\"#_ftn21\" id=\"_ftnref21\">[21]<\/a> This distinction creates a viable pathway for future litigation, particularly in cases involving minors and allegations of addiction or mental health impacts.<a href=\"#_ftn22\" id=\"_ftnref22\">[22]<\/a><\/p>\n\n\n\n<p>The outcome in <em>K.G.M.<\/em> reflects a broader evolution in how courts and litigants 
conceptualize social media.<a href=\"#_ftn23\" id=\"_ftnref23\">[23]<\/a> Rather than viewing platforms solely as neutral conduits for speech, there is growing recognition that they are sophisticated products designed to influence user behavior.<a href=\"#_ftn24\" id=\"_ftnref24\">[24]<\/a> This shift aligns with principles from other areas of law, where manufacturers can be held liable for products that are unreasonably dangerous due to their design.<a href=\"#_ftn25\" id=\"_ftnref25\">[25]<\/a> By analogizing social media platforms to such products, plaintiffs can bypass the doctrinal constraints imposed by Section 230.<a href=\"#_ftn26\" id=\"_ftnref26\">[26]<\/a> Thus, this evolution marks an important step toward greater accountability in the digital age and suggests that the future of tech litigation will hinge on how platforms are built.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><a href=\"#_ftnref1\" id=\"_ftn1\">[1]<\/a> Mary G. Leary, The Failed Experiment of Section 230 of the Communications Decency Act: How it Facilitates Exploitation and How it Must Be Reformed, 70 Vill. L. Rev. 49 (2025).<\/p>\n\n\n\n<p><a href=\"#_ftnref2\" id=\"_ftn2\">[2]<\/a> 47 U.S.C.A. \u00a7 230 (West).<\/p>\n\n\n\n<p><a href=\"#_ftnref3\" id=\"_ftn3\">[3]<\/a> Seth A. 
Stern, Section 230 of the Communications Decency Act: Potential Reforms and Implications, 51 Brief 20, 21 (Spring 2022).<\/p>\n\n\n\n<p><a href=\"#_ftnref4\" id=\"_ftn4\">[4]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref5\" id=\"_ftn5\">[5]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref6\" id=\"_ftn6\">[6]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref7\" id=\"_ftn7\">[7]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref8\" id=\"_ftn8\">[8]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref9\" id=\"_ftn9\">[9]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref10\" id=\"_ftn10\">[10]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref11\" id=\"_ftn11\">[11]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref12\" id=\"_ftn12\">[12]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref13\" id=\"_ftn13\">[13]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref14\" id=\"_ftn14\">[14]<\/a> https:\/\/www.reuters.com\/legal\/litigation\/jury-reaches-verdict-meta-google-trial-social-media-addiction-2026-03-25\/<\/p>\n\n\n\n<p><a href=\"#_ftnref15\" id=\"_ftn15\">[15]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref16\" id=\"_ftn16\">[16]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref17\" id=\"_ftn17\">[17]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref18\" id=\"_ftn18\">[18]<\/a> https:\/\/www.nytimes.com\/2026\/03\/25\/technology\/social-media-trial-verdict.html<\/p>\n\n\n\n<p><a href=\"#_ftnref19\" id=\"_ftn19\">[19]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref20\" id=\"_ftn20\">[20]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref21\" id=\"_ftn21\">[21]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref22\" id=\"_ftn22\">[22]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref23\" id=\"_ftn23\">[23]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref24\" id=\"_ftn24\">[24]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref25\" id=\"_ftn25\">[25]<\/a> <em>Id.<\/em><\/p>\n\n\n\n<p><a href=\"#_ftnref26\" 
id=\"_ftn26\">[26]<\/a> <em>Id.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>By Kaushik Srinath, Staff Writer Attempts to hold social media companies accountable for user harm have been largely unsuccessful due to the protections of Section 230 of the Communications Decency Act.[1] Enacted in 1996, Section 230 provides that online platforms are not treated as the \u201cpublisher or speaker\u201d of content [\u2026] <\/p>\n<div class=\"clear\"><\/div>\n<p><a class=\"more_link clearfix\" href=\"https:\/\/sites.law.duq.edu\/juris\/2026\/04\/08\/when-likes-turned-to-liability\/\" rel=\"nofollow\">Read More<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[3582,1174,1084,4059,639,39,4058],"class_list":["post-15066","post","type-post","status-publish","format-standard","hentry","category-juris-blog","tag-liability-2","tag-communications-decency-act","tag-instagram","tag-product-design","tag-product-liability","tag-social-media","tag-youtube"],"_links":{"self":[{"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/posts\/15066","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/comments?post=15066"}],"version-history":[{"count":1,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/posts\/15066\/revisions"}],"predecessor-version":[{"id":15068,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/posts\/15066\/revisions\/15068"}],"wp:attachment":[{"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/media?parent=15066"}],"wp:term":[{
"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/categories?post=15066"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.law.duq.edu\/juris\/wp-json\/wp\/v2\/tags?post=15066"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}