A “blackout challenge” death could mean the end of Section 230

The tragic death of a 10-year-old girl, whose parents allege in a lawsuit that she took part in a “blackout challenge” served to her on her TikTok “For You” page, could change the internet as we know it.

In an August 27 decision, the U.S. Court of Appeals for the Third Circuit found that in 2021, TikTok – through its “For You Page” algorithm – recommended a video promoting a “blackout challenge” to 10-year-old Nylah Anderson.

“Nylah, still in her first year of adolescence, probably had no idea what she was doing or that following the images on her screen would kill her,” Third Circuit Judge Paul Matey wrote in his concurring opinion. “But TikTok knew Nylah would be watching because the company's customized algorithm placed the videos on her 'For You' page.”

The challenge encouraged viewers to choke themselves until they passed out. In a 2022 lawsuit filed against TikTok, lawyers for Nylah's parents said she died while trying to recreate what she had seen.

A district judge initially dismissed the case under Section 230. The Third Circuit, however, found that TikTok's algorithm, which serves customized content recommendations to users based on the posts they interact with, is a form of TikTok's own speech, one that Section 230 does not protect.

The company had argued in court that it was shielded from liability under Section 230 of the Communications Decency Act. TikTok did not immediately respond to a request for comment from Business Insider.

Section 230 of the Communications Decency Act of 1996, often referred to as the law that created the internet, protects online platforms like TikTok, Meta, X and others from being held liable for content posted by users of their sites.

This means, for example, that the platform on which a user posts a video encouraging viewers to harm themselves cannot be held liable if a receptive viewer follows that encouragement.

But the Third Circuit’s ruling could change that.

Supporters of the ruling say it's about time

“Imagine if someone approached Nylah at school and suggested that she suffocate herself. We would immediately recognize the person's guilt,” David French, a columnist and former lawyer, wrote in a recent op-ed for the New York Times. “One could argue that algorithmic suggestion is even more powerful than personal suggestion.”

French and other supporters of the Third Circuit ruling argue that TikTok's liability protection should end where its algorithmic recommendations begin.

Neutrally hosting a wide range of content on an online platform is fine, French and other supporters say. But when a platform promotes certain content, especially content its administrators know could be harmful, a new line must be drawn, they argue, and the platforms themselves held legally accountable, as the Third Circuit's ruling held.

Defenders of Section 230 argue that the ruling is a blow to free speech

Opponents of the ruling argue that critics of Section 230 are exploiting Anderson's tragic death and the legitimate desire to protect children online as a means of undermining free speech.

“These laws that are supposed to protect children are a myth,” Betsy Rosenblatt, associate director of the Spangenberg Center for Law, Technology & Arts at Case Western Reserve University, told Business Insider. “They all wear the guise of protecting children, but underneath they are not protecting children – they are attempts to silence speech.”

Rosenblatt said the Third Circuit's decision makes neither logical nor legal sense. It is morally reprehensible, she said, to suggest in a video that a child choke themselves for online engagement, but it is not illegal, and it should not be illegal for the host of that video to surface it on a user's For You page.

“The more you ask platforms to filter speech, the more they have to delete first and ask questions later,” Rosenblatt told BI. “And that means controversial content gets deleted as soon as it is questioned, even if it should stay online.”

What happens next?

TikTok may appeal the Third Circuit's decision. If the company does, the case would land on the Supreme Court's desk, where the justices could decide either to take it up or to let the Third Circuit's ruling stand, an outcome that would force platforms like TikTok to rethink how their algorithms work to avoid liability in cases like Anderson's.

Although the Supreme Court has so far shied away from defining the scope of Section 230, its conservative justices have previously signaled they are open to revisiting the law. If they do, the resulting decision could have even more far-reaching consequences than the Third Circuit's ruling.

In July, Justices Clarence Thomas and Neil Gorsuch dissented from the court's refusal to hear a case that would have revisited Section 230. That case stemmed from allegations that the Snapchat app had a design defect that benefited sexual predators, but lower courts found that the app's parent company was protected by Section 230.

And in a decision last term, the Supreme Court left open another avenue for holding platforms liable depending on where they are headquartered. In Moody v. NetChoice, a ruling that treated a platform's algorithmic curation of content as a form of protected “expressive activity,” Justice Amy Coney Barrett wrote in a concurrence that a foreign-owned social media platform, like TikTok, may not enjoy the same First Amendment protections as a U.S. company.

Rosenblatt said that if the Supreme Court agrees to hear the case, it could side with the Third Circuit in holding that TikTok's algorithmic recommendations are a form of the platform's own speech. The question would then become whether the recommendation itself was negligent, which could carry legal consequences.

“That would still be terrible for business on the internet, but it wouldn't kill all websites,” Rosenblatt said of a narrower reading of the Third Circuit's ruling. But a broader reading, one concluding that any form of content moderation converts a user's speech into the platform's own, would have “devastating effects on the internet ecosystem and technology in general.”