California lawmakers take aim at social media role in youth fentanyl use and sex trafficking
SACRAMENTO — How much responsibility do social media platforms bear when people use them to sell kids a deadly dose of fentanyl, pay teenagers to livestream stripteases or recruit minors who are sold for sex?
Those are questions California lawmakers will try to answer this year in their latest effort to regulate social media, a debate that will play out amid deliberations at the U.S. Supreme Court over whether federal law shields platforms from liability for manipulating what users see.
After a failed effort last year to pass a sweeping measure to allow more lawsuits against social networks for harm caused to children, lawmakers have come back this year with bills that take a more targeted approach.
They're focusing on some of the most frightening uses of apps many teens report using "almost constantly." One bill would hold social media companies liable for promoting the illegal sale of fentanyl to youths and targeting them with content that could result in eating disorders or suicide. Another would require that sites permanently delete photos and videos of minors upon their request, and also allow lawsuits against social media platforms for features that facilitate commercial sexual exploitation of minors.
Fentanyl deaths among teens more than doubled from 2019 to 2020, increasing from 253 to 680. In 2021, the number jumped to 884, according to a report in the Journal of the American Medical Assn.
Data from the National Human Trafficking Hotline show that technology plays an increasingly significant role in human trafficking, according to the federal Administration for Children and Families, with the internet now the most prevalent place for recruiting victims and webcam video feeds creating a new method for sexual exploitation schemes.
News reports in the last year have documented teenagers receiving payment from strangers who ask them to livestream sexual content on TikTok, and dying from drug overdoses after using apps like Snapchat to buy pills they didn't know were laced with fentanyl. Images of children being sexually abused have proliferated on Twitter, the New York Times reported, despite owner Elon Musk saying in a tweet that removing such content is "priority #1." And 60 Minutes revealed that Facebook knew its Instagram platform pushed accounts promoting anorexia to teenage girls seeking diet tips.
"The current legal system isn't protecting our kids," said Assemblymember Buffy Wicks, an Oakland Democrat who is introducing the legislation Friday aimed at curbing the use of social media to sexually exploit children and teens.
"It's something we have to tackle. I know they don't want to be held liable for what happens on their platforms, but as any parent knows, we have to take more action. What we're doing is not working."
The action in Sacramento comes as the U.S. Supreme Court is set to hear arguments in a case that will test the legal boundaries for regulating social media. For years, online platforms have operated with legal protections shielding them from liability for content their users post. The federal law known as Section 230 of the Communications Decency Act is both heralded for fostering innovation on the internet and criticized as a free pass allowing tech giants to shirk responsibility.
At issue in the case of Gonzalez vs. Google is whether Section 230 also protects platforms when they make targeted recommendations, such as serving up videos based on projections of what will keep users on the platform. The Gonzalez family sued Google after 23-year-old Cal State Long Beach student Nohemi Gonzalez was killed in a terrorist attack in Paris in 2015. Her father alleged that one of the attackers was an active YouTube user who had appeared in an Islamic State propaganda video, and that Google should be liable for supporting terrorism by creating algorithms that recommended Islamic State videos to YouTube users.
The court will hear arguments in the case Tuesday and in a related case involving Twitter on Wednesday; together, the cases could upend the legal framework for regulating social media.
The outcome of the cases is highly relevant to the legislation just introduced in California, said Pamela Samuelson, a UC Berkeley law professor and co-director of the Berkeley Center for Law and Technology.
"The issue about targeted recommendations that promote access to dangerous things, whether it's illegal drugs or child sexual abuse material or things about eating disorders — all of those things about algorithms — that's actually at stake right now before the U.S. Supreme Court," she said.
Therefore, Samuelson said, California should hold off on passing laws to regulate social media until the Supreme Court rules.
But supporters of the new legislation say there’s no need to wait. If the court upholds the current law, Wicks’ bill is written to comply with a U.S. 9th Circuit Court of Appeals ruling that says Section 230 does not apply when a platform “actively participates in†or is “directly involved in†unlawful conduct, said Ed Howard, an attorney with the Children’s Advocacy Institute at the University of San Diego, which is a sponsor of the bill.
And if the court changes the parameters of the federal law, he said, lawmakers in Sacramento have time to adjust.
"The bill is an ongoing process," Howard said. "The court's decision will come out during this two-year session, and all bills in this area can be amended or tailored to match the requirements of the Supreme Court's decision."
The powerful tech lobby in Sacramento is likely to fight any legislation that would increase the potential for companies to be sued. Tech businesses successfully lobbied against the bill last year that would have made social media platforms liable for causing harm to children through addictive algorithms and other features. It passed the Assembly but quietly died in the Senate Appropriations Committee, where Democrats who control the Capitol often bury bills they don't want to openly vote down.
Dylan Hoffman, a lobbyist for the TechNet trade association that represents companies including Google, Snap and Facebook, said platforms are working to remove images of child sexual abuse and have partnerships with law enforcement and nonprofits to curb human trafficking. Those are better routes for addressing the issues California lawmakers are raising, he said.
"If there are ways that we can improve, if there are better ways to interface with [nonprofits] or better ways to work with law enforcement, we're all ears. We want to do it," Hoffman said. "But I think liability isn't the right way to try and solve a delicate and difficult problem like this."
While the most controversial social media bill died last year, Wicks wrote another that was signed into law. It requires online sites and services likely to be used by children to be designed in ways that promote their safety, such as by defaulting to strict privacy settings and turning off precise location tracking.
The bill made California the first state in the nation to require tech companies to design their platforms to protect kids’ safety. A tech industry group sued the state to block the law before it could take effect.