Frances Haugen’s testimony to the US Congress was arguably the most serious threat yet to Mark Zuckerberg’s iron-fisted leadership of the most powerful social media company on Earth. These are the key takeaways from the explosive Facebook Papers.
Measures to suppress hateful, deceptive content are lifted after the American presidential election in 2020, as pro-Trump groups disputing the legitimacy of the election experience “meteoric” growth.
A dummy test account on Facebook in India is flooded with violent anti-Muslim propaganda – which remains visible for weeks on the real account of a frightened Muslim college student in northern India.
A trove of internal Facebook documents reveals that the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms, ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content.
Disclosed to the US Securities and Exchange Commission by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions were reviewed by a consortium of news organisations, including The Washington Post, which obtained additional internal documents and conducted interviews with dozens of current and former Facebook employees.
Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets.
In America, the tech CEO is a champion of free speech, reluctant to remove even malicious and misleading content from the platform. But in Vietnam, upholding the free speech rights of people who question government leaders could have come with a significant cost in a country where the social network earns more than $1bn (€800m) in annual revenue, according to a 2018 estimate by Amnesty International.
So Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on the condition of anonymity to describe internal company discussions. Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of “anti-state” posts, giving the government near-total control over the platform, according to local activists and free speech advocates.
Zuckerberg’s role in the Vietnam decision, which has not been previously reported, exemplifies his relentless determination to ensure Facebook’s dominance, sometimes at the expense of his stated values, according to interviews with more than a dozen former employees. That ethos has come under fire in a series of whistleblower complaints filed with the US Securities and Exchange Commission by former Facebook product manager Frances Haugen.
While it’s unclear whether the SEC will take the case or pursue action against the CEO personally, the allegations made by the whistleblower represent arguably the most profound challenge to Zuckerberg’s ironclad leadership of the most powerful social media company on Earth. Experts said the SEC – which has the power to seek depositions, fine him and even remove him as chairman – is likely to dig more deeply into what he knew and when. Though his direct perspective is rarely reflected in the documents, the people who worked with him say his fingerprints are everywhere in them.
Haugen references Zuckerberg’s public statements at least 20 times in her SEC complaints, asserting that the CEO’s singular power and unique level of control over Facebook mean he bears ultimate responsibility for a litany of societal harms. Her documents appear to contradict the CEO on a host of issues, including the platform’s impact on children’s mental health, whether its algorithms contribute to polarisation and how much hate speech it detects around the world.
For example, Zuckerberg testified last year before Congress that the company removes 94pc of the hate speech it finds – but internal documents show that its researchers estimated that the company was removing less than 5pc of hate speech on Facebook. In March, Zuckerberg told Congress that it was “not at all clear” that social networks polarise people, when Facebook’s own researchers had repeatedly found that they do.
During the run-up to the 2020 US presidential election, the social media giant dialled up efforts to police content that promoted violence, misinformation and hate speech. But after November 6, Facebook rolled back many of the dozens of measures aimed at safeguarding US users. A ban on the main Stop the Steal group didn’t apply to the dozens of look-alike groups that popped up in what the company later concluded was a “coordinated” campaign, documents show.
By the time Facebook tried to reimpose its “break the glass” measures, it was too late: A pro-Trump mob was storming the US Capitol.
For all Facebook’s troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world.
Documents show that Facebook has meticulously studied its approach abroad, and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes.
According to one 2020 summary, the vast majority of its efforts against misinformation – 84pc – went toward the United States, the documents show, with just 16pc going to the ‘Rest of World’, including India, France and Italy.
In her congressional testimony, Haugen repeatedly accused Zuckerberg of choosing growth over the public good, an allegation echoed in interviews with the former employees.
“The spectre of Zuckerberg looms in everything the company does,” said Brian Boland, ex-vice president of partnerships and marketing who left in 2020 after coming to believe the platform was polarising society. “It is entirely driven by him.”
A Facebook spokeswoman, Dani Lever, denied that decisions made by Zuckerberg “cause harm”, saying the claim was based on “selected documents that are mischaracterised and devoid of any context”.
Facebook has previously fought efforts to hold Zuckerberg personally accountable. In 2019, as the company faced a record-breaking $5bn (€4.3bn) fine from the Federal Trade Commission for privacy violations related to Cambridge Analytica, a political consultancy that abused profile data from tens of millions of Facebook users, the company negotiated to protect Zuckerberg from direct liability. Internal Facebook briefing materials revealed the tech giant was willing to abandon settlement talks and duke it out in court if the agency insisted on pursuing the CEO.
Zuckerberg, who is 37, founded Facebook 17 years ago in his college dorm room, envisioning a new way for classmates to connect with one another. Today, Facebook has become a conglomerate encompassing WhatsApp, Instagram and a hardware business. Zuckerberg is chairman of the board and controls 58pc of the company’s voting shares, rendering his power virtually unchecked, both within the company and on its board.
Even as the company has grown into a large conglomerate, Zuckerberg has maintained a reputation as a hands-on manager who goes deep on product and policy decisions, particularly when they involve critical trade-offs between preserving speech and protecting users from harm – or between safety and growth.
Politically, he has developed hard-line positions on free speech, announcing that he would allow politicians to lie in ads and at one time defending the rights of Holocaust denialists.
He has publicly stated that he made the final call in the company’s most sensitive content decisions to date, including allowing President Donald Trump’s violence-inciting post during the George Floyd protests to stay up, despite objections from thousands of employees.
But even as Facebook is facing perhaps its most existential crisis to date over the whistleblower documents, lately Zuckerberg’s attention has been elsewhere, focused on a push toward virtual-reality hardware in what former executives said was an attempt to distance himself from the problems of the core Facebook, known internally as the Big Blue app.
The company is reportedly even considering changing its name to align better with his vision of a virtual-reality-driven “metaverse.” Facebook has said it doesn’t comment on rumours or speculation.
The former employees said it was also not surprising that the document trove contains so few references to Zuckerberg’s thoughts.
He has become more isolated in recent years, in the face of mounting scandals and leaks (Facebook disputes his isolation).
Even criticising Zuckerberg personally can come with costs. An engineer who spoke with The Post, and whose story was reflected in the documents, says he was fired in 2020 after penning an open letter to Zuckerberg on the company’s chat system, holding the CEO responsible for protecting conservatives whose accounts had been escalated for misinformation.
One document, a 2020 proposal on whether to hide like counts on Instagram and Facebook that indicates it was sent to Zuckerberg for review, strongly suggests that Zuckerberg was directly aware of some of the research into harmful effects.
It included internal research from 2018 which found that 37pc of teenagers said one reason they stopped posting content was that the pressure to get enough likes caused them “stress or anxiety”.
Over the summer, executives in Facebook’s Washington office heard that Zuckerberg was angry about President Biden’s charge that coronavirus misinformation on Facebook was “killing people”. Zuckerberg felt Biden had unfairly targeted the company and wanted to fight back, according to people who heard a key Zuckerberg adviser, Facebook Vice President for Global Affairs Nick Clegg, express the CEO’s viewpoint.
Zuckerberg is married to a doctor, runs a foundation focused on health issues and had hoped that Facebook’s ability to help people during the pandemic would be legacy-making. Instead, the plan was going south.
In July, Guy Rosen, Facebook’s vice president for integrity, wrote a blog post noting that Facebook had missed its own vaccine goals, and asserting that Facebook wasn’t to blame for the large number of Americans who refused to get vaccinated.
Though Biden later backed off his comment, some former executives saw Facebook’s attack on the White House as unnecessary self-sabotage.
But complaints about the brash action were met with a familiar response, three people said: It was meant to please the “audience of one”. (©Washington Post)