TikTok is sued over deaths of two young girls in viral ‘blackout challenge’

TikTok’s logo on a smartphone screen. (Kirill Kudryavtsev / AFP / Getty Images)

Eight-year-old Lalani Erika Walton wanted to become “TikTok famous.” Instead, she wound up dead.

Hers is one of two such tragedies that prompted a linked pair of wrongful death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. The company’s app fed both Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the blackout challenge, in which participants attempt to choke themselves into unconsciousness, the cases allege; both girls died after trying to join in.

It’s an indication that TikTok — the wildly popular, algorithmically curated video app that has its U.S. headquarters in Culver City — is a defective product, said the Social Media Victims Law Center, the law firm behind the suits and a self-described “legal resource for parents of children harmed by social media.” TikTok pushed Lalani and Arriani videos of the dangerous trend, is engineered to be addictive and didn’t offer the girls or their parents adequate safety features, the Law Center said, all in the name of maximizing ad revenue.

TikTok did not immediately respond to a request for comment.

The girls’ deaths bear striking similarities.

Lalani, who was from Texas, was an avid TikToker, posting videos of herself dancing and singing on the social network in hopes of going viral, according to the Law Center’s complaint.

At some point in July 2021, the app’s algorithm began surfacing videos of the self-strangulation blackout challenge, the suit said. Midway through that month, Lalani told her family that bruises that had appeared on her neck were the result of a fall; soon after, she spent part of a 20-hour car ride with her stepmother watching what her stepmother would later learn had been blackout challenge videos.

When they got home from the trip, Lalani’s stepmother told her the two could go swimming later, and then took a brief nap. But upon waking up, the suit said, she went to Lalani’s bedroom and found the girl “hanging from her bed with a rope around her neck.”

The police, who took Lalani’s phone and tablet, later told her stepmother that the girl had been watching blackout challenge videos “on repeat,” the suit said.

Lalani was “under the belief that if she posted a video of herself doing the Blackout Challenge, then she would become famous,” it said, yet the young girl “did not appreciate or understand the dangerous nature of what TikTok was encouraging her to do.”

Arriani, from Milwaukee, also loved to post song and dance videos on TikTok, the suit said. She “gradually became obsessive” about the app, it said.

On Feb. 26, 2021, Arriani’s father was working in the basement when her younger brother Edwardo came downstairs and said that Arriani wasn’t moving. The two siblings had been playing together in Arriani’s bedroom, the suit said, but when their father rushed upstairs to check on her, he found his daughter “hanging from the family dog’s leash.”

Arriani was rushed to the hospital and placed on a ventilator, but it was too late — the girl had lost all brain function, the suit said, and was eventually taken off life support.

“TikTok’s product and its algorithm directed exceedingly and unacceptably dangerous challenges and videos” to Arriani’s feed, the suit said, encouraging her “to engage and participate in the TikTok Blackout Challenge.”

Lalani and Arriani are not the first children to die while attempting the blackout challenge.

Nylah Anderson, 10, accidentally hanged herself in her family’s home while trying to mimic the trend, according to a lawsuit her mother recently filed against TikTok in Pennsylvania.

A number of other children, ages 10 to 14, have reportedly died under similar circumstances while attempting the blackout challenge.

“TikTok unquestionably knew that the deadly Blackout Challenge was spreading through their app and that their algorithm was specifically feeding the Blackout Challenge to children,” the Social Media Victims Law Center’s complaint said, adding that the company “knew or should have known that failing to take immediate and significant action to extinguish the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children.”

TikTok has in the past denied that the blackout challenge is a TikTok trend, pointing to pre-TikTok instances of children dying from “the choking game” and telling the Washington Post that the company has blocked #BlackoutChallenge from its search engine.

These sorts of viral challenges, typically built around a hashtag that makes it easy to find every entry in one place, are a big part of TikTok’s user culture. Most are innocuous, often encouraging users to lip-sync a particular song or mimic a dance move.

But some have proved more risky. Injuries have been reported from attempts to re-create stunts known as the fire challenge, milk crate challenge, Benadryl challenge, skull breaker challenge and dry scoop challenge, among others.

Nor is this an issue restricted to TikTok. YouTube has in the past been home to such trends as the Tide Pod challenge and cinnamon challenge, both of which experts warned could be dangerous. In 2014, the internet-native urban legend known as Slenderman famously led two preteen girls to stab a friend 19 times.

Although social media platforms have long been accused of hosting socially harmful content — including hate speech, slander and misinformation — a federal law called Section 230 makes it hard to sue the platforms themselves. Under Section 230, apps and websites enjoy wide latitude to host user-generated content and moderate it as they see fit, without having to worry about being sued over it.

The Law Center’s complaint attempts to sidestep that firewall by framing the blackout challenge deaths as a failure of product design rather than content moderation. TikTok is at fault for developing an algorithmically curated social media product that exposed Lalani and Arriani to a dangerous trend, the theory goes — a consumer safety argument that’s much less contentious than the thorny questions about free speech and censorship that might arise were the suit to frame TikTok’s missteps as those of a publisher.

An “unreasonably dangerous social media product ... that is designed to addict young children and does so, that affirmatively directs them in harm’s way, is not immunized third-party content but rather volitional conduct on behalf of the social media companies,” said Matthew Bergman, the attorney who founded the firm.

Or, as the complaint said: The plaintiffs “are not alleging that TikTok is liable for what third parties said or did, but for what TikTok did or did not do.”

In large part, the suits do this by criticizing TikTok’s algorithm as addictive, with a slot-machine-like interface that feeds users an endless, tailor-made stream of videos in hopes of keeping them online for longer and longer periods.

“TikTok designed, manufactured, marketed, and sold a social media product that was unreasonably dangerous because it was designed to be addictive to the minor users,” the complaint said, adding that the videos served to users include “harmful and exploitative” ones. “TikTok had a duty to monitor and evaluate the performance of its algorithm and ensure that it was not directing vulnerable children to dangerous and deadly videos.”

Leaked documents indicate that the company views both user retention and the time that users remain on the app as key success metrics.

It’s a business model that many other free-to-use web platforms deploy — the more time users spend on the platform, the more ads the platform can sell — but one that is increasingly coming under fire, especially when children and their still-developing brains are involved.

A pair of bills making their way through the California Legislature aim to reshape the landscape of how social media platforms engage young users. One, the Social Media Platform Duty to Children Act, would empower parents to sue web platforms that addict their children; the other, the California Age-Appropriate Design Code Act, would mandate that web platforms offer children substantial privacy and security protections.

Bergman spent much of his career representing mesothelioma victims, many of whom became sick from asbestos exposure. The social media sector, he said, “makes the asbestos industry look like a bunch of choirboys.”

But as bad as things are, he said, cases such as his against TikTok also offer some hope for the future.

With mesothelioma, he said, “it’s always been compensation for past wrongs.” But suits against social media companies provide “the opportunity to stop having people become victims; to actually implement change; to save lives.”
