What Mark Zuckerberg Will Be Grilled On at the Congressional Hearings
When Mark Zuckerberg appears Tuesday afternoon before members of two Senate committees, it will be the culmination of a Facebook public relations blitz in the face of a series of controversies that have tarnished the reputation of the world’s largest social network.
Beginning late last month, Mr. Zuckerberg, Facebook’s 33-year-old founder and chief executive, suddenly granted interviews to The New York Times, Vox, Recode, Reuters and other news outlets before a conference call with dozens of reporters last week. Sheryl Sandberg, Facebook’s chief operating officer, has been on her own, more limited apology tour, mainly on television. The company has also moved aggressively to announce changes to its rules, introduce a social media research initiative and promise additional disclosures to try to get ahead of congressional demands and possible regulation.
It will be the first appearance before Congress for Mr. Zuckerberg, who looks only slightly older than during his giddy first television appearance in 2004 to explain the unexpected growth of the college service he called “The Facebook.” The company he built into a global behemoth, with more than two billion users, now faces the most intense scrutiny of its 14-year history. Here are the major issues he is likely to be asked to address:
Data privacy and Cambridge Analytica
The New York Times, The Observer and The Guardian reported last month that the Facebook data of more than 50 million users, including private information, had been given without their permission to Cambridge Analytica, a company that had consulted for Donald J. Trump’s 2016 presidential campaign and for other political candidates. Since then, Facebook has updated that number, saying that as many as 87 million people may have been affected by the misuse of data. The social network also acknowledged additional security weaknesses, including one that it said may have allowed “malicious actors” to collect the public profile information of “most people.” Mr. Zuckerberg’s main task may be to persuade members of Congress and millions of Facebook users that their information will be safe on the platform.
The committees are likely to ask about a consent decree Facebook signed with the Federal Trade Commission in 2011, which said it had “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.” The F.T.C. is now investigating whether the company violated the decree, which could make it vulnerable to fines that, under the regulations, could reach an astronomical sum: up to $41,484 for each of the millions of users whose data went to Cambridge Analytica. Facebook denies that it has violated the 2011 decree.
Russia and the 2016 election
Facebook has taken heat for months for its slow response to Russia’s hijacking of its platform to influence the 2016 presidential election. Russian military intelligence created profiles of fake Americans to promote emails stolen from Democrats by Russian hackers, and a so-called troll factory in St. Petersburg created pages to denigrate Hillary Clinton, promote Mr. Trump and post inflammatory messages on immigration, race, guns and other topics. It took Facebook until September 2017, 10 months after the election, to go public with its first findings on the Russian meddling, which it says appeared in the news feeds of 126 million users — equal to nearly half the adult population of the United States.
Mr. Zuckerberg got off to a clumsy start when, two days after the election, he said it was “pretty crazy” to believe that the “small amount” of false news spread on Facebook might have influenced the outcome. Since then, the company has taken a number of steps to increase the reliability of information on its platform, though the moves have garnered mixed reviews. The underlying problem, studies show, is that false, sensational stories spread faster through social media than factual stories do.
Political bias and polarization
Republicans may question Mr. Zuckerberg about reports in 2016 that former Facebook employees had routinely suppressed articles from conservative publications. But a larger body of research suggests that Facebook, like other social media, tends to reinforce and intensify existing political views by creating an echo chamber of friends and like-minded users. Facebook has published its own research pushing back on that conclusion.