California lawmakers pass new social media protections for minors - Los Angeles Times


Facebook, Messenger and Instagram apps are displayed on an iPhone. New legislation passed by California lawmakers would require social media protections for minors and make public the content moderation policies of Facebook and other platforms. (Jenny Kane / Associated Press)

California lawmakers passed legislation designed to protect the privacy and well-being of minors on social media and shield them from predators and exploitative commercialization on internet platforms.

Legislators also approved a bill under which platforms including Facebook, Snapchat, Twitter, YouTube and Google would be required to publicly disclose their policies on how they screen content, a requirement aimed at combating the spread of hate, racism, extremist violence and conspiracy theories online.

The state Assembly passed the two bills, AB 2273 and AB 587, on Tuesday, a day after they breezed through the state Senate with strong bipartisan support. The measures now go to Gov. Gavin Newsom for his consideration.


“Our children are getting bombarded with information online and they don’t yet have the capacity to understand all that information coming at them,” Assemblymember Buffy Wicks (D-Oakland) said before the vote Tuesday. “We want to make sure that when these products are created, they are by design and by default safe for our children.”

Wicks is the primary sponsor of AB 2273, the California Age-Appropriate Design Code Act, which prohibits tech companies from using children’s personal information in ways detrimental to their physical or mental health. Web platforms that children are likely to use would be required to adopt data privacy measures such as setting user accounts to high privacy by default, describing privacy policies in language kids can understand and barring the use of children’s personal information for any purpose other than the one for which it was originally collected.

“As a parent, you have no chance under the status quo. You have no chance. There’s stuff running in the background. There’s stuff influencing your kids’ minds, the very development of their brain, that you have no ability to control. Most parents are not software engineers,” said Assemblymember Jordan Cunningham (R-Paso Robles). “I can tell you as a former prosecutor, there are predators out there, and they use these tools to try to get at children. It is not right, and it is time for the tech companies to step up.”


A coalition of technology groups, including the Entertainment Software Assn., opposed the legislation. In a statement to lawmakers, they said applying the law to websites “likely to be accessed by a child” was overly broad and would affect far more websites and platforms than necessary.


The News/Media Alliance, an industry advocacy group whose members include the Los Angeles Times and whose board includes California Times President Chris Argentieri, has pushed for changes to the bill over concerns that it would make online news publishing more costly.

Dr. Jenny Radesky, a developmental behavioral pediatrician and assistant professor at the University of Michigan Medical School, told lawmakers in March that most web platforms are designed by adults untrained in the ways that children experience the digital world. Designers, she said, often focus on monetization or engagement tactics — hooking users by offering “rewards” for watching ads or finding ways to make it hard to navigate off a site — and don’t consider the unintended negative consequences for kids.


“We’re finding that adult design norms are just copied and pasted sloppily into children’s digital products,” she said.

TikTok, Pinterest, Twitter, Twitch, LinkedIn and Discord didn’t respond to requests for comment on whether they support the Design Code Act, how it would affect them and whether there are any changes they’d like to see made to it. Google, which owns YouTube, and Snap, the owner of Snapchat, also did not respond. Reddit, Tumblr and Yelp all declined to comment.

A spokesperson for Meta — the parent company of Facebook, Instagram and WhatsApp — pointed to the company’s “Best Interests of the Child Framework” as guiding how the company builds “age-appropriate experiences” for young users. The spokesperson also cited several platform features that protect young users, including one system wherein teens’ accounts are set to private by default and another in which advertisers can use only age, gender and location to target teens with ads.

“On Instagram, we are testing verification tools ... allowing us to provide age-appropriate experiences to people on our platform,” the Meta spokesperson said in an email to The Times. “We also use AI to understand if someone is a teen or an adult.”

Mark Weinstein, founder of the alternative social media platform MeWe — a small Facebook competitor that has courted users who feel censored by the larger platform — said that the Design Code Act “is an important step forward in protecting our kids’ privacy and critical thinking abilities.”

“Current mainstream social media companies brainwash and addict our kids,” he wrote via email. “The act is thoughtful and necessary due to the blind-siding nature of social media companies whose amoral interest is solely in revenue and sticky eyeballs.”


The bill has also found support from one of the loudest voices in the growing chorus of social media criticism: Frances Haugen, the Facebook product manager-turned-whistleblower who last fall leaked a trove of internal company documents to Congress, the U.S. Securities and Exchange Commission and the Wall Street Journal.

The material in Haugen’s “Facebook Files” included internal discussion among Meta employees of the company’s contribution to various social ills, including mental health problems among teen users of Instagram. (The company maintains that its documents were misrepresented.)

Haugen’s leaks launched a renewed wave of Facebook criticism and propelled her into the public eye. She has since used her platform to advocate for a handful of political efforts to regulate internet companies more stringently, including the Design Code Act. In April, she sat on a panel to discuss children’s online safety with state lawmakers in Sacramento.

Although the documents she leaked covered a wide swath of problem areas, including online misinformation and political extremism, Haugen said she was not surprised it was the effects on children that most captured lawmakers’ attention.

“The solutions to a lot of the problems outlined in my disclosures are actually quite complicated,” she told The Times in May. But “when it comes to kids, it’s really simple.”

In the wake of Haugen’s leaks, Meta paused development of a preteen-geared Instagram Kids app that would have been ad-free and prioritized age-appropriate design. The company, which initially presented the project as a way to capture children who would otherwise join Instagram by lying about their age, announced in September that it was going to take a step back and discuss the proposed product with parents and other stakeholders before moving forward.
Major aspects of the legislation that state lawmakers passed Tuesday were modeled after data protection and privacy restrictions already adopted in Europe. For instance, Wicks said, in the United Kingdom Google has made safe search its default browsing mode for anyone under age 18, YouTube has turned off autoplay for users who are minors, and TikTok and Instagram have disabled direct messaging between children and adults.

Under Wicks’ bill, California’s attorney general could take civil action against companies that do not follow the law, including fines up to $7,500 per child for each violation.


State lawmakers also approved AB 587, which would require social media companies to publicly post terms of service — the policies that specify behavior and activities that are permitted, prohibited and monitored — and to report that information to the state attorney general.

Assemblymember Jesse Gabriel (D-Encino), sponsor of the bill, said it was aimed at curbing the spread of extremism, racism and conspiracy theories via social media.

“Consider the recent mass shootings we’ve had in this country,” he said. “One of the themes: They were radicalized, often with a toxic brew of white supremacy and extremist ideology.”

Gabriel on Tuesday also lashed out at the country’s major web platforms, most of which are based in California, saying they have “fought us every step of the way.”

Given the influence California has on policy nationwide, both Gabriel and Wicks suggested other states — and Congress — might use the child protections and transparency requirements in the legislation as a template for adopting their own laws. If the bills become law, Facebook, Google and other web platforms may also enact the restrictions and protections nationwide.

“Would you have a different set of regulations for kids in California than you would in Nevada? No, you would just create a standard that you would adhere to across the board,” Wicks said.
