
Instagram head faces senators amid anger over possible harm

Photo: Instagram head Adam Mosseri appeared before a Senate panel on Wednesday as the company faces scrutiny over the potential detrimental effects its social media platform has on young people. (Associated Press)

The head of a Senate panel examining social media’s negative effects on young people has dismissed as “a public relations tactic” some safety measures announced by Facebook’s popular Instagram platform.

Adam Mosseri, the head of Instagram, on Wednesday faced off with senators who were angry over revelations of how the photo-sharing platform can harm some young users and who demanded that the company commit to making changes.

Under sharp questioning by senators of both parties, Mosseri defended the company’s conduct and the efficacy of its new safety measures. He challenged the assertion that Instagram has been shown by research to be addictive for young people. Instagram has an estimated 1 billion users of all ages.


On Tuesday, Instagram introduced a previously announced feature that urges teenagers to take breaks from the platform. The company also announced other tools, including parental controls due to come out early next year, that it says are aimed at protecting young users from harmful content.

The parental oversight tools “could have been announced years ago,” Sen. Richard Blumenthal (D-Conn.) told Mosseri. The newly announced measures fall short and many of them are still being tested, he said.

A pause that Instagram imposed in September on its work on a kids’ version of the platform “looks more like a public relations tactic brought on by our hearings,” Blumenthal said.



“I believe that the time for self-policing and self-regulation is over,” Blumenthal said. “Self-policing depends on trust. Trust is over.”

Mosseri testified as Facebook, whose parent now is named Meta Platforms, has been roiled by public and political outrage over disclosures by former Facebook employee Frances Haugen. She has made the case before lawmakers in the U.S. and Europe that Facebook’s systems amplify online hate and extremism and that the company elevates profits over the safety of users.

Haugen, a data scientist who had worked in Facebook’s civic integrity unit, buttressed her assertions with a trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.


The Senate panel has examined Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative effects. For some Instagram-devoted teens, peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research detailed in the Facebook documents showed.

The revelations in a report by the Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.


“As head of Instagram, I am especially focused on the safety of the youngest people who use our services,” Mosseri testified. “This work includes keeping underage users off our platform, designing age-appropriate experiences for people ages 13 to 18, and building parental controls. Instagram is built for people 13 and older. If a child is under the age of 13, they are not permitted on Instagram.”

Mosseri outlined the suite of measures he said Instagram has taken to protect young people on the platform. They include keeping kids under 13 off it, restricting direct messaging between kids and adults, and prohibiting posts that encourage suicide and self-harm.

But, as researchers both internal and external to Meta have documented, the reality is different. Kids under 13 often sign up for Instagram with or without their parents’ knowledge by lying about their age. And posts about suicide and self-harm still reach children and teens, sometimes with disastrous effects.

Senators of both parties were united in condemnation of the social network giant and Instagram, the photo-sharing juggernaut valued at some $100 billion that Facebook acquired for $1 billion in 2012.


As early as July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on Instagram. In fact, the company has worked with experts and other advisers on another product aimed at children: its Messenger Kids app, which launched in late 2017.

Senators pressed Mosseri to support legislative remedies for social media.

Among the legislative proposals put forward by Blumenthal and others, one bill proposes an “eraser button” that would let parents instantly delete all personal information collected from their children or teens.

Another proposal would ban specific features for kids under 16, such as video autoplay, push alerts, "like" buttons and follower counts. Also being floated is a prohibition against collecting personal data from anyone age 13 to 15 without their consent, as well as a new digital "bill of rights" for minors that would similarly limit the gathering of personal data from teens.
