“THEY ‘trust me’…dumb fucks,” Mark Zuckerberg, the boss of Facebook, wrote in an instant message to a friend in 2004, after boasting that he had personal data, including photos, e-mails and addresses, of some 4,000 of his social network’s users. He offered to share whatever information his friend wanted to see. Mr Zuckerberg may use less profane language today, but many feel he has not yet outgrown his wilful disregard for users’ privacy. On April 11th he testified before testy politicians in Washington about the firm’s latest privacy controversy, first to a joint hearing of two Senate committees that lasted around four hours, and then again on April 12th to a House of Representatives committee. Not since the 1990s, when Microsoft was taken to task for its monopolistic behaviour, has there been such “intense public scrutiny” of a technology firm in Washington, as Orrin Hatch, a Republican senator, informed Mr Zuckerberg.
Some of his inquisitors appeared annoyed by Mr Zuckerberg’s rehearsed responses, but that did not stop many onlookers from being impressed by his smooth, albeit robotic, performance. Facebook’s share price closed nearly 5% higher after his first day of testimony. Investors may be betting that the worst of “Facegate” is over, but it is too soon to count on it.
The immediate scandal is the most acute and far-reaching crisis in Facebook’s 14-year history. Last month it was revealed by Britain’s Observer and the New York Times that a researcher from Cambridge University, Aleksandr Kogan, had obtained information about some 300,000 Facebook users by encouraging them to download an app and take a survey in 2012. He then shared these data with Cambridge Analytica, a political consultancy, which reportedly made them available to others, including Donald Trump’s campaign. Some 87m Facebook users are affected, because Facebook’s policies at the time were so loose that people using a third party’s app often shared details not only about themselves but also about their friends, without those friends’ knowledge. Facebook changed its policies in 2014.
These revelations are especially damning because Facebook first learned about this problem in 2015 and did little to address it. In fact, instead of focusing on Cambridge Analytica’s bad behaviour, Facebook threatened to sue the Guardian Media Group, which owns the Observer, if it published the exposé. Only after a media backlash and public outcry did Facebook start to take action. It has started making it easier for users to control their privacy settings, reduced the amount of data that are shared with third parties, and promised to audit suspicious third-party apps. But these are things that many users wrongly believed Facebook had long been doing anyway.
Politicians and users want to know more about how Facebook will adequately safeguard people’s privacy and offer enough transparency about how it operates. While encouraging its users to overshare minutiae from their own lives, the firm has in the past been guarded about sharing details of how its extensive data-collection machine works and the extent of what it tracks beyond the information users provide directly. The company’s business depends on observing users’ online behaviour and selling their attention to advertisers, who pay money to reach specific groups of users based on minute details gleaned about their identities, their interests and where they are. This requires a delicate balancing act between catering to users, whose attention Facebook must keep, and advertisers, who pay the bills. To date the firm has mostly favoured growth over careful checks that its “community”, as it calls its 2.1bn users, is being properly protected.
Sorry seems to be
Facebook’s corporate tradition of evasion was on display on Capitol Hill. When asked during the Senate hearing whether Facebook tracks users who have logged out, Mr Zuckerberg said he did not know and would have to supply the answer at a later date (although many advertisers believe Facebook does exactly that). It has recently been revealed that Facebook collected Android users’ call logs and messages without most users’ knowledge, which offers another example of the firm’s disregard for people’s right to control and see their data. Even in Silicon Valley, which is known for producing eerily predictive algorithms, people find Facebook’s stealthy tracking and targeting of users creepy.
In addition to privacy, the Cambridge Analytica scandal points to two big concerns. One is the lack of transparency in political advertising. Corporate and political advertising are being “mushed together” as a single topic of discussion, but it is political micro-targeting that is most bothersome to consumers, says Karen Kornbluh, senior fellow for digital policy at the Council on Foreign Relations. Users are probably willing to see advertisements from car companies, but it feels more sensitive and invasive to be targeted with ads based on what is known or presumed about their views on divisive political issues, such as immigration, race, religion and gay rights. The company has vowed to start showing who is behind political ads and verifying the buyer’s identity.
Another issue is foreign meddling, and the risk that hostile governments and non-state actors may harvest users’ data. Already Facebook has disclosed that Russians were responsible for targeting ads and content to Americans in the lead-up to the 2016 election. It is becoming clearer that foreign governments, including Russia and presumably China, may have obtained rich data sets about Facebook users from the likes of Cambridge Analytica or other groups. Christopher Wylie, the whistleblower who sounded the alarm about Cambridge Analytica, has said that the company may be storing its data in Russia, suggesting a close connection.
The easiest word
Mr Zuckerberg will have plenty to grapple with in the coming months. One risk is that Cambridge Analytica proves to be merely the first of many outfits to receive scrutiny and media attention. According to someone close to the firm, the social-networking giant is already aware that Cambridge Analytica is only one of many politically motivated outside groups that stealthily gained access to detailed data about Facebook users. More revelations will probably become public, especially if politicians and investigators press Facebook on this point. If one of Facebook’s employees decides to become a whistleblower in the vein of Mr Wylie, it could mean further apologies from Mr Zuckerberg and another summons for him to give congressional testimony.
Another risk to Facebook is action from American regulators. Bruce Mehlman, a lobbyist in Washington, says Facebook’s Cambridge Analytica data spill could be much like the Exxon Valdez oil spill, which brought public scrutiny and regulation to an industry that had previously operated without much oversight. Mr Zuckerberg insists that his firm is open to new laws, especially in areas that are sensitive, such as facial recognition. But it has been fighting state-level privacy laws, in California and elsewhere, that could restrict its normal course of business.
America’s Federal Trade Commission (FTC) has launched an investigation into Facebook for its privacy practices. This is not the first time. As part of a consent decree agreed to in 2011 after the FTC charged it with deceptive practices, Facebook promised to be more transparent with consumers about the data that were gathered and shared publicly. The Cambridge Analytica fiasco appears to have violated those promises. According to one former FTC official, Facebook could be facing a fine of around $2bn or more, which would be the largest in history for violating an FTC order.
Some openly wonder whether America will eventually pass restrictions like those that will come into effect next month under the General Data Protection Regulation (GDPR), a European law that requires companies to obtain consent to gather and share users’ data. If principles like this spread and American users are required, for example, to opt in to Facebook’s tracking, it could dent Facebook’s revenues, although by how much is unclear.
While in the long term some sort of regulation is inevitable, it seems less likely in the near term. It takes years, sometimes decades, for laws governing burgeoning industries to take shape: people started talking about overhauling telecoms regulation in the 1970s, but America did not pass a major reform until 1996. Today Republicans, who control both houses of Congress, have little appetite for restricting business. Because of Republican opposition, a benign bill that would require disclosure of who pays for online political ads, called the Honest Ads Act, has not even been granted a hearing.
For Facebook to change in any meaningful way, Congress will have to change too. One of the most stunning revelations of the highly choreographed hearings was not anything Mr Zuckerberg said, but how little America’s politicians seemed to know about Facebook and the way the world of digital communications operates. There is little hope for smart regulation that will protect users’ privacy until the people who would draft laws understand the ecosystem they need to tame. The Cambridge Analytica scandal gave Mr Zuckerberg a crash course in political diplomacy, but the education of politicians about the opaque, labyrinthine world of digital data is only just beginning.