
US Appeals Court Takes Aim at Big Tech Liability Shield

(Bloomberg) — A US appeals court has struck down a long-standing argument from social media platforms: that a federal law grants them blanket immunity from being held accountable for user harms — even the wrongful death of children.

In a ruling delivered by a three-judge appeals court panel on Tuesday, a Pennsylvania mother won the right to sue TikTok over the death of her 10-year-old daughter. The court said that TikTok could be liable because its algorithm served up dangerous content to the child.

Three weeks before Christmas 2021, Nylah Anderson was found lifeless in her mother’s closet in suburban Philadelphia. She had accidentally strangled herself with her mom’s purse strings while mirroring a “blackout challenge” video that was promoted to her on TikTok.

That algorithmic recommendation forms the crux of the ruling by the US Court of Appeals for the Third Circuit.

The judges overturned a lower court’s decision to dismiss the case on the grounds of social media’s liability defense: Section 230 of the 1996 Communications Decency Act. Under Section 230, online platforms are shielded from responsibility for content posted on their sites by third parties.

But, by promoting a “How-To-Guide on self-asphyxiation” to a child, TikTok moved beyond the realm of a passive intermediary shielded by Section 230, the ruling said. Instead, it became an active promoter of dangerous content. The Anderson family’s lawsuit seeking to hold TikTok liable for its “knowing distribution and targeted recommendation of the blackout challenge” can proceed, it said.

The ruling could have sweeping implications for any website that runs on user-generated content, which site owners don’t screen before it’s posted. The liability shield has allowed social media platforms to grow to tremendous scale: they tend to review users’ posts only when other users report them, and even then often rely on artificial intelligence to do so. With so much content, those same platforms have increasingly leaned on algorithmic recommendations to keep users looking at whatever will hold their attention, bolstering their advertising businesses.

“Big Tech just lost its ‘get-out-of-jail-free’ card,” Jeffrey Goodman, a partner at Saltz Mongeluzzi Bendesky, who argued on behalf of the family, said in response to the ruling. If social media platforms cause harm, “they will now have to face their day in court,” he said.

A spokesperson for TikTok declined to comment about the decision or whether the company plans to appeal. In a previous statement about the case, the company said, “TikTok remains vigilant in its commitment to user safety and would remove any content related to the blackout challenge from its app.”

The Anderson family declined a request for an interview. In a statement released by Goodman, they said: “Nothing will bring back our beautiful baby girl, but we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering.” Social media platforms must “stop exploiting children in the name of profit,” they said in the statement.

The appeals court ruling comes amid growing concerns over the harms social media has inflicted upon a generation of children. Hundreds of lawsuits have been filed against social media platforms in recent years, alleging they’ve designed addictive products that have promoted suicidal and self-harm content to kids, and connected young users to drug dealers and sextortionists.

Bloomberg Businessweek published a 2022 cover story on the blackout challenge, which was cited in the brief Goodman filed with the appeals court. The blackout challenge is a dare in which participants choke themselves with household items, like a shoelace or a power cord, until they black out, then film the adrenaline rush of regaining consciousness.

The Businessweek story linked the dare to the deaths of at least 15 pre-teen children. It also uncovered evidence that TikTok knew its algorithm was sending videos promoting the blackout challenge to kids — and that some of them had killed themselves attempting it — before Anderson’s death.

Five months after she died, in May 2022, Anderson’s parents filed a lawsuit against TikTok, alleging product liability, negligence and wrongful death.

Two months later, a second lawsuit was filed against TikTok by the Social Media Victims Law Center on behalf of the families of Arriani Arroyo, 9, of Wisconsin and Lalani Walton, 8, of Texas, who both accidentally killed themselves attempting the blackout challenge. The families claimed the dangerous dare was recommended to the girls via TikTok’s For You feed, a page of personalized content curated for each user to keep them scrolling for as long as possible. The Arroyo and Walton case is ongoing.

In the Anderson case, TikTok argued that it was “completely protected” by Section 230, and in October 2022 a district court granted the company’s motion to dismiss the case, saying TikTok couldn’t be held liable for a video a third party had posted on its site. The family appealed a week later.

“By promoting this challenge and populating it on the For You Pages of children all around America, TikTok is placing children in harm’s way — and in many cases killing them — in the name of corporate greed,” the appellate brief filed by the Anderson family said.

The family’s lawyers went on to challenge the way Section 230 has “tragically been applied to protect goliaths of the technology industry.” Applying the law as a blanket immunity shield has, they argued, “empowered social media companies to develop increasingly predatory and manipulative technologies designed to addict users and control their actions. Children, more than anyone, are paying the price.”

The appellate court agreed, saying that TikTok knew the deadly blackout challenge was spreading across its app, that the algorithm was “feeding” the challenge to children and that several children had died attempting it.

“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her,” Judge Paul Matey, writing on behalf of the three-judge panel, said. The judges reversed the lower court’s decision, granting Anderson’s parents the right to sue.

The appellate ruling also went beyond the Anderson case, criticizing the way Section 230 “rides in to rescue corporations from virtually any claim loosely related to content posted by a third party, no matter the cause of action and whatever the provider’s actions.” Congress did not intend to “create a lawless no-man’s-land of legal liability,” the judges said.

This loose reading of the law has immunized social media platforms “from the consequences of their own conduct,” the judges said, permitting these companies to ignore the ordinary obligations other businesses face, like preventing their services from causing “devastating harm.”

The case is Estate of Nylah Anderson v. TikTok, Inc. et al., Case No. 22-3061 (on appeal from the United States District Court for the Eastern District of Pennsylvania in Case No. 2:22-cv-01849).

©2024 Bloomberg L.P.
