California lawmakers pass measure to combat child sexual abuse material on social media
California lawmakers on Wednesday passed a bill aimed at combating child sexual abuse material on social media platforms such as Facebook, Snapchat and TikTok.
The legislation, Assembly Bill 1394, would hold social media companies liable for failing to remove the content, which includes child pornography and other obscene material depicting children.
“The goal of the bill is to end the practice of social media being a superhighway for child sexual abuse materials,” Assemblywoman Buffy Wicks (D-Oakland), who authored the legislation, said in an interview.
The bill unanimously cleared the Senate on Tuesday. The Assembly unanimously approved an amended version on Wednesday, and the measure is now headed to the governor’s desk for consideration.
Efforts to pass a package of bills to make social media safer for young people faced stiff opposition from tech industry groups such as TechNet and NetChoice, which feared the legislation would lead platforms to become overly cautious and pull down more lawful content.
Child safety groups clashed with the tech companies over proposed amendments to the bill that they worried would make it easier for social media platforms to avoid liability for failing to remove child sexual abuse materials. Wicks made changes to the bill last week, pushing back the date it would take effect to January 2025. The amendments also give social media companies more time to respond to a report of child sexual abuse material and allow them to pay a lower fine if they meet certain requirements.
Tech groups, including NetChoice and TechNet, still opposed the bill after Wicks made the amendments, telling lawmakers it would face legal challenges. The groups, along with business organizations such as the California Chamber of Commerce, urged lawmakers to delay passing the bill until next year.
“The bill in print misses the mark and will surely result in litigation,” the groups said in a floor alert sent to lawmakers.
Other legislation targeting social media platforms died earlier this month, underscoring the pushback lawmakers face from tech companies. The battle has extended beyond the California Legislature, spilling into the courts. Lawmakers passed children’s online safety legislation in 2022, but groups like NetChoice have sued the state to block the law from taking effect. X, formerly Twitter, sued California last week over a law that aimed to make social media platforms more transparent about how they moderate content.
Wicks said she’s confident her bill will withstand any potential legal challenges.
“These companies know they have to take more of a proactive role in being part of the solution to the problem,” she said. “This bill is going to force that conversation and require it.”
Under the bill, social media companies would be barred from “knowingly facilitating, aiding, or abetting commercial sexual exploitation.” A court would be required to award damages between $1 million and $4 million for each act of exploitation that the social media platform “facilitated, aided, or abetted.”
Social media companies would also be required to give California users who are depicted in child sexual abuse material a way to report it, and to respond to those reports within 36 hours. The platform would be required to permanently block the material from being viewed; if it failed to do so, it would be liable for damages.
Social media companies could be fined up to $250,000 per violation. The fine would be lowered to $75,000 per violation if they meet certain requirements, including reporting the child sexual abuse material to the National Center for Missing and Exploited Children (NCMEC) and participating in a program called “Take It Down” that helps minors pull down sexually explicit images and nude photos.
The program assigns a digital fingerprint to the reported image or video so that platforms can find the child sexual abuse material. Under the amended version of the bill, platforms would have 36 hours to remove the material after receiving that digital fingerprint from NCMEC. Companies are already required under federal law to report child sexual abuse material to NCMEC, and major online platforms, including Facebook, Instagram, Snap and TikTok, participate in the Take It Down program.
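For readers curious about the mechanics, this kind of fingerprint matching boils down to comparing a hash of reported content against hashes of files on a platform. The sketch below is a generic, simplified illustration using ordinary SHA-256 hashes; it is not NCMEC’s actual Take It Down system, which uses its own hashing and distribution mechanisms, and every name and value in it is hypothetical.

```python
import hashlib

# Simplified, hypothetical illustration of hash-based matching.
# Real programs such as Take It Down use their own hashing schemes and APIs;
# nothing here reflects NCMEC's actual implementation.

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint (here, a plain SHA-256 hash)."""
    return hashlib.sha256(data).hexdigest()

# Fingerprints received from a reporting program.
# This example value is simply the SHA-256 hash of the bytes b"test".
reported_fingerprints = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(uploaded_file: bytes) -> bool:
    """Return True if an uploaded file matches a reported fingerprint."""
    return fingerprint(uploaded_file) in reported_fingerprints

# Example: check a newly uploaded file against the reported list.
if should_block(b"test"):
    print("Match found: block the upload so it cannot be viewed.")
else:
    print("No match: allow the upload.")
```

In practice, platforms commonly use perceptual hashing so that near-duplicate images also match, but the basic compare-and-block flow is the same.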