This is how it, and the wider industry, should respond.
News that Cambridge Analytica (CA), a firm linked to President Donald Trump’s 2016 campaign, got data on 50m Facebook users in dubious, possibly illegal, ways has lit a firestorm (see article). Mr Zuckerberg took five days to reply and, when he did, he conceded that Facebook had let its users down in the past but seemed not to have grasped that its business faces a wider crisis of confidence. After months of talk about propaganda and fake news, politicians in Europe and, increasingly, America see Facebook as out of control and in denial. Congress wants him to testify. Expect a roasting.
Since the news, spooked investors have wiped 9% off Facebook’s shares. Consumers are belatedly waking up to the dangers of handing over data to tech giants that are run like black boxes. Already, according to the Pew Research Center, a think-tank, a majority of Americans say they distrust social-media firms. Mr Zuckerberg and his industry need to change, fast.
Facebook’s business relies on three elements: keeping users glued to their screens, collecting data about their behavior and convincing advertisers to pay billions of dollars to reach them with targeted ads. The firm has an incentive to promote material that grabs attention and to sell ads to anyone. Its culture melds a ruthless pursuit of profit with a Panglossian and narcissistic belief in its own virtue. Mr Zuckerberg controls the firm’s voting rights. Clearly, he gets too little criticism.
In the latest fiasco, it emerged that in 2013 an academic in Britain built a questionnaire app for Facebook users, which 270,000 people answered. They in turn had 50m Facebook friends. Data on all these people then ended up with CA. (Full disclosure: The Economist once used CA for a market-research project.) Facebook says that it could not happen again and that the academic and CA broke its rules; both deny doing anything wrong. Regulators in Europe and America are investigating. Facebook knew of the problem in 2015, but it did not alert individual users. Although nobody knows how much CA benefited Mr Trump’s campaign, the fuss has been amplified by the left’s disbelief that he could have won the election fairly.
But that does not give Facebook a defense. The episode fits an established pattern of sloppiness towards privacy, tolerance of inaccuracy and reluctance to admit mistakes. In early 2017 Mr Zuckerberg dismissed the idea that fake news had influenced the election as “pretty crazy”. In September Facebook said Kremlin-linked firms had spent a mere $100,000 to buy 3,000 adverts on its platform, failing at first to mention that 150m users had seen free posts by Russian operatives. It has also repeatedly misled advertisers about its user statistics.
Facebook is not about to be banned or put out of business, but the chances of a regulatory backlash are growing. Europe is inflicting punishment by a thousand cuts, from digital taxes to antitrust cases. And distrustful users are switching off. The American customer base of Facebook’s core social network has stagnated since June 2017. Its share of America’s digital advertising market is forecast to dip this year for the first time. The network effect that made Facebook ever more attractive to new members as it grew could work in reverse if it starts to shrink. Facebook is worth $493 billion, but has only $14 billion of physical assets. Its value is intangible—and, potentially, ephemeral.
If Mr Zuckerberg wants to do right by the public and his firm, he must rebuild trust. So far he has promised to audit some apps, restrict developers’ access to data still further, and help people control which apps have access to their data.
That doesn’t go nearly far enough. Facebook needs a full, independent examination of its approach to content, privacy and data, including its role in the 2016 election and the Brexit referendum. This should be made public. Each year Facebook should publish a report on its conduct that sets out everything from the prevalence of fake news to privacy breaches.
Next, Facebook and other tech firms need to open up to outsiders, safely and methodically. They should create an industry ombudsman—call it the Data Rights Board. Part of its job would be to set and enforce the rules by which accredited independent researchers look inside platforms without threatening users’ privacy. Software is being developed with this in mind (see article). The likes of Facebook raise big questions. How does micro-targeting skew political campaigns? What biases infect facial-recognition algorithms? Better they be answered with evidence instead of outrage.
The board or something like it could also act as a referee for complaints, and police voluntary data-protection protocols. Facebook, for example, is planning to comply worldwide with some of the measures contained in a new European law, called the General Data Protection Regulation. Among other things, this will give users more power to opt out of being tracked online and to stop their information being shared with third parties. Adherence to such rules needs to be closely monitored.
Tech has experience of acting collectively to solve problems. Standards on hardware and software, and the naming of internet domains, are agreed on jointly. Facebook’s rivals may be wary but, if the industry does not come up with a joint solution, a government clampdown will become inevitable.
Facebook seems to think it only needs to tweak its approach. In fact it, and other firms that hoover up consumer data, should assume that their entire business model is at risk. As users become better informed, the alchemy of taking their data without paying and manipulating them for profit may die. Firms may need to compensate people for their data or let them pay to use platforms ad-free. Profits won’t come as easily, but the alternative is stark. If Facebook ends up as a regulated utility with its returns on capital capped, its earnings may drop by 80%. How would you like that, Mr Zuckerberg?