Exploiting Facebook data to influence voters? That’s a feature, not a bug, of the social network
With each comment, like and share, users provide Facebook with a deeply personal window into their lives.
The result of that voluntary behavior? Advertisers looking to finely target their pitches can glean someone’s hobbies, what they like to eat and even what makes them happy or sad — propelling Facebook’s ad revenue to $40 billion last year.
This trove of rich information is now at the center of a rapidly growing controversy involving one of President Trump’s campaign consultants, Cambridge Analytica, which reportedly took the advertising playbook and exploited it in a bid to influence swing voters.
Former employees accuse the firm, owned by the conservative billionaire Robert Mercer and previously headed by Trump’s former chief strategist Steve Bannon, of taking advantage of ill-gotten data belonging to millions of unwitting Facebook users. News of the breach was met with calls over the weekend for stricter scrutiny of the company.
Sen. Amy Klobuchar (D-Minn.) demanded that Mark Zuckerberg, Facebook’s chief executive, appear before the Senate Judiciary Committee. Maura Healey, attorney general for Massachusetts, said her office was launching an investigation. And the head of a British parliamentary inquiry into fake news called on Facebook to testify before his panel again, this time with Zuckerberg.
The accusations raise tough questions about Facebook’s ability to protect user information at a time when it’s already embroiled in a scandal over Russian meddling during the 2016 presidential campaign and under pressure to adhere to new European Union privacy rules.
They also highlight the power and breadth of the data Facebook holds on its 2 billion users. Whether used to sway voters or sell more detergent, the information harvested by the world’s biggest social network is proving to be both valuable and exploitable, regardless of who’s wielding it.
“The data set assembled on people by Facebook is unrivaled,” said Scott Galloway, a professor of marketing at New York University Stern School of Business and author of “The Four: The Hidden DNA of Amazon, Apple, Facebook and Google.” “The bad news is, people are discovering this can be used as a weapon. The worse news is that people are learning how to detonate it.”
The controversy began late Friday when Facebook’s vice president and deputy general counsel, Paul Grewal, announced in a blog post that the social network was suspending Strategic Communication Laboratories and its affiliate, Cambridge Analytica.
Facebook said the companies failed to delete user data they had acquired in 2015 in violation of the platform’s rules. The data were supplied by a University of Cambridge psychology professor, Aleksandr Kogan, who built an app that was supposed to collect details on Facebook users for academic research. Kogan was not supposed to pass that information to a third party for commercial purposes under Facebook guidelines.
Facebook said the data collection was limited to the 270,000 people who downloaded Kogan’s app, along with “limited information” about their friends.
But a whistleblower and others contend the scope of the data collection was far larger. Christopher Wylie, a co-founder of Cambridge Analytica who has since left the firm, said Kogan harvested data from 50 million Facebook users without their permission, largely by mining the friends of the people who downloaded his app.
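The mechanism Wylie describes turned on Facebook’s pre-2015 platform rules, under which an app authorized by one user could also pull basic information about that user’s friends. The sketch below is a rough Python illustration of that permission model, not Kogan’s actual code; the API version, field names and token handling are assumptions.

```python
# Illustrative only: how a quiz-style app could widen its reach under the old
# friends-data permissions. Endpoints, fields and tokens here are assumptions.
import requests

GRAPH_URL = "https://graph.facebook.com/v1.0"  # legacy Graph API version (assumed)

def fetch_user_and_friends(access_token):
    """Pull the consenting user's profile plus a list of their friends."""
    me = requests.get(
        f"{GRAPH_URL}/me",
        params={"access_token": access_token, "fields": "id,name,likes"},
    ).json()

    # One consenting quiz taker also exposed basic data about their friends --
    # the multiplier behind the reported jump from ~270,000 app installs to
    # some 50 million profiles.
    friends = requests.get(
        f"{GRAPH_URL}/me/friends",
        params={"access_token": access_token},
    ).json()

    return me, friends.get("data", [])
```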
The allegations were first reported by the New York Times and the British newspaper the Observer, whose stories about the breach were preempted hours earlier by Facebook’s announcement of the suspensions.
Wylie, who described Cambridge Analytica as a weapon designed to wage a culture war in the U.S., said Facebook did little to rein in his former company. The social network’s only effort, he said, was a letter sent in August 2016 demanding that the data Kogan supplied be destroyed. Facebook never verified whether the data had actually been deleted, he said.
Facebook, which also suspended Wylie, did not respond to a request for comment.
As recently as last month, Cambridge Analytica told a British parliamentary hearing that it never had or used Facebook data. But in a statement Saturday, the firm acknowledged receiving user information from Kogan, saying it deleted the data after learning the transfer violated Facebook’s rules. It added that none of the data was used in its consulting work for Trump’s 2016 campaign.
Cambridge Analytica reportedly needed Facebook’s data for its so-called psychographic profiling, which combines data collected online to glean a better understanding of voters’ personalities in order to tailor ads to them.
In many ways, that’s not unlike what Facebook itself offers advertisers and a growing number of political campaigns willing to pay and play by its rules. Facebook says it can micro-target users down to the charities they donate to, the devices they play video games on and where they stand on the political spectrum, tailoring its reach to clients’ needs.
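To make micro-targeting concrete, the sketch below shows the general shape of an audience definition like the ones described above. The field names and values are illustrative assumptions, not Facebook’s actual targeting schema.

```python
# Illustrative only: a toy audience spec and matcher showing attribute-based
# micro-targeting in principle. Field names and values are assumptions.
audience_spec = {
    "countries": {"US"},
    "age_range": (35, 65),
    "interests": {"charitable giving", "console gaming"},
    "political_leaning": {"moderate", "undecided"},  # hypothetical derived segment
}

def matches(profile, spec):
    """Return True if a user profile falls inside the audience spec."""
    lo, hi = spec["age_range"]
    return (
        profile.get("country") in spec["countries"]
        and lo <= profile.get("age", -1) <= hi
        and bool(spec["interests"] & set(profile.get("interests", [])))
        and profile.get("political_leaning") in spec["political_leaning"]
    )

# Example: this profile falls inside the targeted audience.
print(matches(
    {"country": "US", "age": 47, "interests": ["console gaming"],
     "political_leaning": "undecided"},
    audience_spec,
))
```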
That kind of granular data helped increase Facebook’s advertising revenue last year by 49%. Advertising accounted for more than 98% of Facebook’s total revenue in 2017, according to company filings.
None of that would be possible without hundreds of millions of users willingly sharing enough details about themselves to be categorized by advertisers.
That business model is now under threat in the European Union, where the General Data Protection Regulation, set to take effect in May, will bar companies like Facebook from using data on sensitive subjects such as race and politics without users’ consent.
Facebook is adamant that the Cambridge Analytica controversy does not amount to a security breach. Admitting one would reinforce the company’s reputation in Europe for lax privacy standards. It would also risk running afoul of the Federal Trade Commission.
“Platforms like Facebook need to be very, very careful with data, and they will come under more scrutiny by the government going forward,” said Rich Raddon, co-founder of Zefr, a Los Angeles start-up that helps brands target YouTube content for advertising. “In Europe we are seeing a reaction to these platforms leveraging personal identifiable information.”
Raddon said by virtue of its size Facebook will be heavily scrutinized by lawmakers for how it analyzes personal data. But smaller firms like Cambridge Analytica can fly under the radar doing virtually the same thing.
Facebook says it has beefed up its review of third-party apps like Kogan’s, which tap into the social network’s fire hose of data. That includes requiring developers to first “justify the data they’re looking to collect and how they’re going to use it,” said Grewal, the Facebook attorney.
Experts say Facebook will increasingly restrict access to its most valuable data by third parties such as app developers, as it strives to protect its own ad business and reduce security risks like those exposed by Kogan, Cambridge Analytica and Russian operatives tasked with sowing discord in American society.
“Facebook’s business model is actually focused on not giving third parties data about its users,” said Aviv Ovadya, chief technologist at the Center for Social Media Responsibility. “If it owns the data, and you can only target people through its platform, then you must spend money on its platform. Facebook also wants people to be as comfortable as possible giving them data, so they want to ensure that the data is protected from being used in problematic ways.”
Facebook’s critics now say devoting attention to those who exploit the platform, such as Russian trolls, is shortsighted. More attention should be directed at the social network itself, which provides the tools for exploitation, they say. Kogan, for instance, didn’t break any rules when he accessed information from millions of users without their consent. He only broke the rules when he shared that information for commercial gain.
“The data that Facebook leaked to Cambridge Analytica is the same data Facebook retains on everyone and sells targeting services around. The problem is not shady Russian researchers; it’s Facebook’s core business model of collect, store, analyze, exploit,” Maciej Ceglowski, a prominent San Francisco web developer and leader of grassroots activist group Tech Solidarity, wrote in a tweet Saturday.