
TikTok faces lawsuit over deadly viral challenge despite Section 230 protection

TikTok, the social media giant owned by China's ByteDance, could be held liable for content posted on its platform that led to the death of a young girl, despite the federal protections platforms typically enjoy in such cases.

A US appeals court today revived a lawsuit brought by the mother of a 10-year-old Pennsylvania girl who died after taking part in a viral challenge that dared participants to choke themselves until they lost consciousness.

When someone is harmed by content they saw on a platform, the platform is in most cases immune from liability under Section 230 of the Communications Decency Act. The law shields technology companies from lawsuits over content uploaded to their platforms and from suits over their content moderation decisions. There have been calls to reform it for years.

Circuit Judge Paul Matey, one of the three judges on the panel, wrote that the law does indeed “provide immunity from lawsuits for hosting videos created and uploaded by third parties,” but added that TikTok could still be held liable for “knowingly distributing and specifically recommending videos that it knew could be harmful.”

Lawyers for the girl's mother, Tawainna Anderson, said the “blackout challenge,” which began circulating sometime in 2021, surfaced on her daughter's “For You” feed. TikTok's algorithm had, in effect, curated content for the girl that ended with her “unintentionally” hanging herself. Judge Patty Shwartz wrote in her opinion that there is a difference between a platform that passively hosts third-party content and one that actively promotes it.

“Nylah [Anderson's daughter], still in the first year of her adolescence, likely had no idea what she was doing or that following the images on her screen would kill her,” Matey wrote, concurring in part. “But TikTok knew Nylah would be watching, because the company's customized algorithm placed the videos on her ‘For You' page.”

In her original complaint, Anderson said TikTok “undoubtedly knew that the deadly Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children, including those who have died.” As many as 20 children are believed to have died attempting the challenge.

Jeffrey Goodman, one of the family's lawyers, said it's time for Section 230 to come under closer scrutiny. The case is sure to draw widespread sympathy for the family, but the American Civil Liberties Union has warned that stripping liability protections from tech companies would lead to censorship, potentially silencing activists' voices as companies try to avoid lawsuits.

The case will now move forward, with the family's lawyers saying the decision proves that Section 230's protections do not extend to what they called “egregious and predatory behavior” by companies. In a statement on the ruling, Anderson said nothing will bring her daughter back, but holding TikTok accountable could help other families avoid “future, unimaginable suffering.”

Photo: Alexander Shatov/Unsplash
