PHILADELPHIA – The U.S. Court of Appeals for the Third Circuit has reinstated a Chester woman’s wrongful death lawsuit against social media app TikTok, filed after her 10-year-old daughter died in December 2021 while attempting a “Blackout Challenge.” The suit had previously been dismissed by a federal judge in a lower court.
Tawainna Anderson (individually and as Administratrix of the Estate of Nylah Anderson, a deceased minor) of Chester first filed suit in the U.S. District Court for the Eastern District of Pennsylvania on May 12, 2022, versus TikTok, Inc. of Culver City, Calif. and its parent company ByteDance, Inc. of Mountain View, Calif.
The suit said that Nylah Anderson, a 10-year-old who could speak three languages, enjoyed dancing to videos she saw on TikTok and sharing the platform’s video content. In early December, the suit continued, Nylah saw the “Blackout Challenge,” which dared viewers to choke themselves until they nearly lost consciousness, in her personalized feed of recommended TikTok content.
“The TikTok defendants’ app and algorithm pushed exceedingly and unacceptably dangerous challenges and videos to Nylah’s ‘For You Page’ (FYP), thus encouraging her to engage and participate in the challenges. Only days before Nylah attempted the Blackout Challenge that killed her, the TikTok defendants’ algorithm presented Nylah with a similar choking challenge through her FYP, which entailed placing plastic wrap around her neck and holding her breath until a euphoric effect was achieved. The following day, the TikTok defendants’ algorithm thrust the Blackout Challenge onto Nylah’s FYP, encouraging Nylah to participate,” the suit said.
“The particular Blackout Challenge video that the TikTok defendants’ algorithm showed Nylah prompted Nylah to hang a purse from a hanger in her closet and position her head between the bag and shoulder strap and then hang herself until blacking out. On Dec. 7, 2021, Nylah attempted the Blackout Challenge she had seen on her FYP in her mother’s bedroom closet while her mother was downstairs. Tragically, after hanging herself with the purse as the video the TikTok defendants put on her FYP showed, Nylah was unable to free herself. Nylah endured hellacious suffering as she struggled and fought for breath and slowly asphyxiated until near the point of death.”
The plaintiff found her daughter unconscious in her bedroom closet, hanging by her neck from the purse strap, and immediately began performing emergency CPR in an attempt to resuscitate her while emergency responders were en route to the home.
“Three deep ligature marks were found on Nylah’s neck, suggesting that she struggled greatly to free herself from the perilous and terrifying position but was unable to do so. Nylah was emergently transported to Nemours DuPont Hospital in Delaware with the hope that she could survive the extreme injuries she sustained in this horrific event. After spending several days in the pediatric intensive care unit, all hope for Nylah was extinguished and on Dec. 12, 2021, 10-year-old Nylah Anderson succumbed to her injuries and died. This tragedy and the unimaginable suffering endured by plaintiff and Nylah’s family was entirely preventable had the TikTok defendants not ignored the health and safety of its users, particularly the children using their product, in an effort to rake in greater profits,” the suit stated.
“The TikTok defendants’ intentionally manipulative app and algorithm thrust an unacceptably dangerous video that defendants knew to be circulating its platform in front of an impressionable 10-year-old girl. As a direct result of the TikTok defendants’ corrosive marketing practices, Nylah attempted the dangerous challenge and died as a result. As a direct and proximate result of the defendants’ carelessness, negligence, gross negligence, recklessness, willful and wanton conduct, strict liability, failure to warn and defective design, Nylah suffered serious, severe, disabling injuries including, but not limited to, her death resulting from asphyxiation by strangulation.”
The suit also mentioned 22 other dangerous challenges it claims have trended on TikTok, including the “Fire Mirror Challenge,” which encourages viewers to spray a flammable liquid on a mirror and then set it on fire; the “Hot Water Challenge,” which involves pouring boiling water on someone else; and the “Fire Challenge,” which encourages users to light themselves on fire.
According to the plaintiff, TikTok influences the behavior of its users, especially children, to maximize profits and foster addiction to the social media app, while disclaiming responsibility for its users’ safety, particularly that of children.
Counsel for TikTok moved to dismiss the complaint on July 18, arguing that the corporation is not “at home” in Pennsylvania and thus isn’t subject to the Court’s jurisdiction, and that the state-law claims contained in the complaint are barred by federal law.
“First, this Court lacks personal jurisdiction over defendants. Neither defendant is ‘at home’ in Pennsylvania, nor have they taken any actions directed at Pennsylvania to ‘purposely avail’ themselves of Pennsylvania law in connection with plaintiff’s complaint. Second, Section 230 of the federal Communications Decency Act (CDA) bars plaintiff’s state-law claims,” TikTok’s motion said.
“Third, separate from Section 230 immunity, plaintiff cannot state a claim for any of the individual causes of action in the complaint because:
• TikTok is not a ‘product’ or a ‘seller’ subject to strict product liability (Count I);
• Defendants have no legal duty of care to protect against third-party depictions of dangerous activity that would give rise to a negligence claim (Count II);
• Defendants did not engage in any ‘unfair or deceptive’ conduct – and plaintiff does not otherwise state a claim – under the Pennsylvania Unfair Trade Practices and Consumer Protection Law (Count III) or the California Consumer Legal Remedies Act (Count IV); and
• Plaintiff’s derivative wrongful death (Count V) and survival (Count VI) claims – which both require the existence of an underlying tort – also fail. Because these legal defects cannot be cured by amendment, plaintiff’s complaint should be dismissed with prejudice.”
The plaintiff countered the dismissal motion with an Aug. 1, 2022 response, which argued that third-party liability did not apply in this matter and that, likewise, the CDA was not applicable.
“Plaintiff’s complaint is crystal clear: Defendants’ liability in this matter is based on their own independent conduct as designers, manufacturers, sellers and/or distributors of their dangerously defective products – TikTok and its algorithm – and their own independent acts of negligence. Despite this, defendants attempt to rewrite plaintiff’s complaint as seeking to hold defendants liable for the content created by third-parties,” the plaintiff’s response stated.
“Plaintiff does no such thing and defendants’ attempts are a transparent effort to shoehorn this case into the immunity provisions of Section 230 of the Communications Decency Act. The CDA is entirely inapplicable to plaintiff’s claims against defendants. In addition to the demonstrably incorrect argument that they are immune under Section 230 of the CDA, defendants’ motion also challenges this Court’s personal jurisdiction, argues that TikTok and its algorithm are not ‘products’ subject to strict product liability, and that defendants owed no duty to Nylah Anderson. Defendants are wrong on all fronts, and their motion to dismiss should be denied.”
In a Sept. 9, 2022 letter, counsel for TikTok asked the Court to stay the case pending a Sept. 29 decision by the Judicial Panel on Multidistrict Litigation (JPML) on a transfer motion that would directly include and impact it. However, U.S. District Court for the Eastern District of Pennsylvania Judge Paul S. Diamond rejected the request in a Sept. 12 order.
In a contrasting turn of events, on Oct. 6, the JPML ordered the transfer of certain social media products liability actions involving TikTok, Meta, Snap and other platforms to MDL No. 3047 in the U.S. District Court for the Northern District of California.
This led TikTok’s counsel to renew its motion to stay the instant case on Oct. 7, 2022, pending transfer to the MDL in question.
But before a decision could be rendered on the renewed motion to stay, Diamond dismissed the case outright on Oct. 25, 2022. Though finding the circumstances “tragic,” Diamond explained that because the plaintiff sought “to hold the defendants liable as ‘publishers’ of third-party content, they are immune under the Communications Decency Act.”
“Anderson bases her allegations entirely on defendants’ presentation of ‘dangerous and deadly videos’ created by third parties and uploaded by TikTok users. Anderson thus premises her claims on the ‘defective’ manner in which defendants published a third party’s dangerous content. Although Anderson recasts her content claims by attacking defendants’ ‘deliberate action’ taken through their algorithm, those ‘actions,’ however ‘deliberate,’ are the actions of a publisher. Courts have repeatedly held that such algorithms are ‘not content in and of themselves,’” Diamond stated.
“In sum, because Anderson’s design defect and failure to warn claims are ‘inextricably linked’ to the manner in which defendants choose to publish third-party user content, Section 230 immunity applies. Anderson’s wrongful death and survival claims cannot proceed in light of that tort immunity.”
Diamond explained that any change to the law conferring such immunity is a matter best left to Congress, rather than the courts.
“Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge.’ Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, defendants published that work – exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts. I will thus grant defendants’ motion on immunity grounds. In light of my decision, I need not address defendants’ contentions respecting jurisdiction and failure to state a claim,” Diamond ruled.
On Oct. 31, 2022, plaintiff counsel filed notice of their intent to appeal Diamond’s dismissal to the U.S. Court of Appeals for the Third Circuit.
UPDATE
However, the federal appellate court found that TikTok was not entitled to the same Section 230 immunity which has prevented other social media apps from being held liable in civil actions.
Third Circuit Judge Patty Shwartz authored the Court’s majority opinion in the matter, and provided a rationale for why the aforementioned immunity does not extend to TikTok in this case.
“Section 230 immunizes interactive computer services (ICS) only to the extent that they are sued for ‘information provided by another information content provider.’ In other words, ICSs are immunized only if they are sued for someone else’s expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content (i.e., first-party speech),” Shwartz said.
“Here, as alleged, TikTok’s FYP algorithm ‘decides on the third-party speech that will be included in or excluded from a compilation – and then organizes and presents the included items’ on users’ FYPs. Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own ‘expressive activity,’ and thus its first-party speech.”
The case now heads back to the trial court for further proceedings.
The plaintiff is represented by Robert J. Mongeluzzi, Jeffrey P. Goodman, Samuel B. Dordick and Rayna McCarthy of Saltz Mongeluzzi & Bendesky in Philadelphia and Shavertown.
The defendants are represented by Joseph E. O’Neil and Katherine A. Wang of Campbell Conroy & O’Neil in Berwyn, plus Albert Giang, Geoffrey M. Drake and TaCara D. Harris of King & Spalding, in Los Angeles, Calif. and Atlanta, Ga.
U.S. Court of Appeals for the Third Circuit case 22-3061
U.S. District Court for the Eastern District of Pennsylvania case 2:22-cv-01849
From the Pennsylvania Record: Reach Courts Reporter Nicholas Malfitano at nick.malfitano@therecordinc.com