
Here’s how Facebook plans to act against anti-vaccine content

Ethan Lindenberger prepares to testify at a March 5 Senate committee hearing on vaccines and preventable disease outbreaks. He said his mother, an anti-vaccine evangelist, relies on Facebook or Facebook-linked sites for all of her information on the subject.
(Carolyn Kaster / Associated Press)
Washington Post

In an effort to curb anti-vaccination conspiracy theories and misinformation, Facebook announced Thursday it will no longer recommend the offending pages and groups, and will block advertisements that include false content about vaccines. The company will also stop recommending anti-vaccination content on Instagram.

The tech giant rolled out its plan to combat anti-vaccine content after mounting public pressure culminated in a Capitol Hill hearing this week, when a Senate panel issued a dire warning about the public health danger that vaccine misinformation poses. There, 18-year-old Ethan Lindenberger testified that his mother, an anti-vaccine evangelist, relies on Facebook or Facebook-linked sites for all of her information on the subject. And she’s certainly not alone.

In a blog post, Monika Bickert, Facebook’s head of global policy management, said the company is “working to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic.”


“Leading global health organizations … have publicly identified verifiable vaccine hoaxes,” Bickert wrote. “If these vaccine hoaxes appear on Facebook, we will take action against them.”

Facebook also said it would be “exploring” ways to counter false content, whenever users do come across it, with “educational information” about vaccines.

The changes in Facebook and Instagram recommendation systems, along with the company’s proposed fact offensive, may ease the concerns of a growing number of researchers who have noted the fast spread of misinformation online and especially on social media.


The World Health Organization recently dubbed “vaccine hesitancy” one of the top global threats of 2019, a warning punctuated by one of the worst measles outbreaks in decades, which has sickened at least 75 people across the Pacific Northwest — most of them unvaccinated children under 10 years old.


In the face of this burgeoning crisis, studies and news reports have indicated that Facebook’s echo chambers have made the problem worse.


One group of scientists recently published a study that found the majority of the most-viewed health stories on Facebook in 2018 were downright fake or contained significant amounts of misleading information. Vaccinations ranked among the three most popular story topics.

An investigation by the Guardian newspaper found that Facebook search results for information about vaccines were “dominated by anti-vaccination propaganda.”

In a statement to the Washington Post last month, Facebook said that most anti-vaccination content didn’t violate its policies around inciting “real-world harm.” Simply removing such material, the company said, wouldn’t effectively counter fictional information with the factual.

“While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them,” the company’s statement read.

On Thursday, Bickert and Facebook appeared to reaffirm that stance, as the new statement made no mention of removing groups or pages altogether — something Facebook has done in the past, notably with content relating to conspiracy theorist Alex Jones and his show Infowars.

A Facebook spokesman told the tech site The Verge that anti-vaccination pages were not removed because, “As with a lot of our integrity efforts, striking the balance between enabling free expression of opinion and ensuring the safety of the community is something we are fully committed to.”


In February, Rep. Adam B. Schiff (D-Burbank) sent letters to the heads of Facebook and Google, which also has been under fire for YouTube’s role in promoting misinformation, asking how they plan to protect their users from potentially dangerous hoaxes.

“As Americans rely on your services as their primary source of information, it is vital that you take responsibility with the seriousness it requires, and nowhere more so than in matters of public health and children’s health,” Schiff wrote to Facebook Chief Executive Mark Zuckerberg.

After Facebook’s Thursday announcement, Schiff struck a cautious note on Twitter, writing, “The ultimate test will be if these measures reduce the spread of anti-vaccine content on their platforms, to the benefit of public health.”

Lindenberger, who famously vaccinated himself against his mother’s wishes, spoke to the Senate Committee on Health, Education, Labor and Pensions on Tuesday and reiterated Schiff’s calls for reliable information — not the type of stuff his mother was reading on social media.

During Lindenberger’s testimony, one senator asked the Ohio teen if his mother got most of her information online.

“Yes,” Lindenberger replied. “Mainly Facebook.”

“And where do you get most of your information?” the lawmaker asked.

Laughing, Lindenberger said, “Not Facebook.”
