The recent Third Circuit decision denying TikTok immunity after a child died attempting a “Blackout Challenge” recommended on her “For You” page was wrongly decided. But TikTok and other companies should not be absolved of liability for the foreseeable harms of their platforms. Instead of straining to reinterpret existing law, courts should recognize causes of action that accurately reflect the ills of social media.

Revered as “the twenty-six words that created the internet,” Section 230 of Title 47 of the United States Code is a federal law that immunizes platforms like TikTok, YouTube, and Facebook from liability for the content people post there. Without Section 230, the logic goes, no platform would host user-generated content at scale for fear of being held liable for it. Section 230, which originally appeared in a law intended to clean up the internet, also immunizes platforms for removing offensive content. The Supreme Court struck down that law’s anti-indecency provisions, the heart of the Communications Decency Act, on free speech grounds in 1997. Section 230 survived.

Not everyone “likes” Section 230. There are Democrats who blame the law for the prevalence of hate speech, nonconsensual pornography, misinformation, and other harmful content online. Congress amended Section 230 in 2018 to exclude online sex trafficking, and the law has never applied to copyright violations or federal crimes. But social media gets a free pass for ignoring everything else. Some conservatives think the platforms are instead over-policing content, viewing Section 230 as cover for tech companies to censor conservative viewpoints under the guise of combating misinformation and hate speech.

The law may be short on supporters, but Section 230 has always been able to count on the courts. Judges have interpreted its immunity broadly since its inception. Some cases (e.g., Fair Housing Council of San Fernando Valley v. Roommates.com in the Ninth Circuit, or FTC v. Accusearch, Inc. in the Tenth Circuit) have held that platforms forfeit immunity when they solicit or materially contribute to unlawful content. For the most part, however, courts have interpreted Section 230 to preclude state civil or criminal liability based on user-generated, or “third-party,” content.

A recent decision by the Third Circuit Court of Appeals bucks this trend. Anderson v. TikTok, Inc. reverses a district court’s dismissal, on Section 230 grounds, of a wrongful death lawsuit against TikTok. The majority’s brief opinion notes that editorial decisions (including the algorithmic recommendation engine that matched ten-year-old Nylah Anderson to a “Blackout Challenge” video) constitute protected speech under the First Amendment following the Supreme Court’s July 2024 decision in Moody v. NetChoice, LLC. If recommending content is TikTok’s “speech” for First Amendment purposes, the court reasons, then it is also TikTok’s speech for purposes of Section 230. The longer concurring opinion reasons that distributing speech is different from simply hosting it. TikTok did not create the “Blackout Challenge,” but it did pass the video along to Nylah.

Suffice it to say that the Third Circuit’s reasoning finds little support in precedent or logic. As court after court, including the Third Circuit itself, has recognized, Section 230 precludes state civil or criminal liability for platforms based on content provided by third parties. Platforms cannot “be treated as the publisher or speaker of any information provided by another information content provider,” an immunity that goes beyond what prior First Amendment precedent requires. The fact that editorial decisions enjoy free speech protection under the Constitution does not transform the third-party content TikTok recommends into TikTok’s own speech for purposes of Section 230. Indeed, it is hard to believe that Moody, a decision recognizing platforms’ First Amendment interest in curating content in the face of Florida and Texas laws regulating social media, meant to leave platforms with less legal protection than Congress provided. And if, as the concurrence argues, a platform can be held liable for the way it distributes content, then TikTok or YouTube or anyone else could be held liable merely for alphabetizing content, let alone for displaying it in order of popularity.

The basis of the wrongful death suit against TikTok cannot be that TikTok is liable for the “Blackout Challenge” video that led to Nylah’s death. Congress has ruled this out. But there may be other ways for courts to hold TikTok and companies like it accountable for the harms they cause.

Judges do not make laws, except when they do. Every cause of action in tort arose at common law when an English or American court decided to recognize a new civil wrong for which the law provides a remedy. New technologies have often set these changes in motion. Negligence arguably owes its contemporary stature to the proliferation of the railroad. The privacy torts responded to the invention of “instantaneous photographs” and “numerous mechanical devices [that] threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the house-tops.’” Trespass to chattels went “electronic” with the introduction of email and its nemesis, spam.

Plaintiffs are once again beginning to test the limits of tort law in response to social media. Seattle and other public school districts recently sued TikTok, YouTube, and other platforms on a public nuisance theory, arguing that these companies endanger public health by fostering a toxic online environment. When two young people were killed in a high-speed crash while trying to trigger Snapchat’s “speed filter,” the Ninth Circuit allowed a negligent design claim against the company to proceed. Snap could be held liable for the “foreseeable consequences” of its negligently designed feature, the court reasoned, even though the “speed filter” always appeared alongside user-generated content. Washington election officials successfully sued Facebook, despite its Section 230 objection, for failing to keep records of political ads in the state. The emphasis, once again, was on Facebook’s own conduct with respect to the ads rather than on the content of the ads themselves.

Admittedly, there is a fine line between attributing third-party content to the platform, which federal law prohibits, and holding the platform liable for foreseeable harm to individuals and communities, which tort law encourages. What did TikTok do wrong in Anderson? The company did not film or upload the dangerous challenge video, and it cannot be held liable for hosting, distributing, or even recommending it. But has TikTok invested enough time and resources in protecting children on the platform, especially given what the company knows about the toxic content that circulates there? Should families like Nylah’s be able to rely on TikTok’s own community guidelines, which pledge to “[r]estrict content that is unsuitable for young people”? These questions sound less like derivative liability than like TikTok’s own noncompliance and misconduct. Section 230 was meant to shield platforms from liability for third-party content, not to excuse their own misconduct. Courts should grapple with that distinction rather than pretend that Section 230 does not exist. Plainly erroneous interpretations of Section 230, such as the Third Circuit’s in Anderson v. TikTok, Inc., only set the law back.