As a longtime professional observer of Facebook, I’ve become steadily more disgusted by the company’s behavior, especially its effects on politics and social discord. And it distresses me that Facebook has continued to thrive financially while sloughing off reams of justified criticism and failing to reform. But the debate about this immensely powerful company is now shifting in a way that may improve the chances for meaningful change: the world is realizing Facebook’s word cannot be trusted.

A central theme of the new book An Ugly Truth, by New York Times reporters Sheera Frenkel and Cecilia Kang, is that Facebook puts more emphasis on managing its image than on managing its actual effects. (I haven’t read the book yet, so this is based on excerpts and reviews; a full review is forthcoming.) For example, the book chronicles Facebook’s internal investigations about how Russia used it to interfere in the 2016 election. Rather than being eager to figure out what happened, company executives were annoyed to be forced to face up to the facts. Then-security chief Alex Stamos was essentially chastised for revealing details of the manipulation. According to a colleague quoted in the book: “By investigating what Russia was doing, Alex had forced us to make decisions about what we were going to publicly say. People weren’t happy about that.”


Separately, an excerpt of An Ugly Truth published in the Times points out that after the January 6 Capitol invasion, Facebook Chief Operating Officer Sheryl Sandberg strenuously denied the service had been a primary organizing tool for the insurrectionists. Then a string of federal indictments cited chapter and verse of how the invaders used Facebook to organize and implement their chaotic incursion. It seems Sandberg lied.

The theme of dishonesty emerges again in a column this week by Kevin Roose. He tells the sordid tale of CrowdTangle, a Facebook subsidiary whose own data about engagement on posts has consistently demonstrated that right-wing venom generates more activity than anything else on Facebook. Executives have endlessly argued publicly with Roose and others about the CrowdTangle data, claiming that engagement was a less accurate measure of content impact on the service than “reach,” or how many people actually see a given piece of information.

But Facebook refuses to make reach data public, apparently because that data, too, frequently demonstrates that divisive, dishonest content wins on Facebook. Roose quotes Brian Boland, a Facebook vice president of 11 years who oversaw CrowdTangle before recently quitting: “One of the main reasons I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of the core products…Facebook would love full transparency if there was a guarantee of positive stories and outcomes. But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.” Roose also quotes current Facebook employees, afraid to go on the record, who say that “Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content.”


Axios columnist Scott Rosenberg calls it Facebook’s “see no evil” strategy.

But it is dawning on the world, finally, that Facebook’s word cannot be trusted. This may bolster a promising direction for regulation: forced transparency. Congress has a raft of bills in process that take small steps in this direction, but I’d like to see Facebook specifically required to publish engagement and reach data, not only for the U.S. but for each country in which it operates. This kind of transparency metric would at least give a clearer sense of the impact Facebook has on our common discourse. The European Parliament is already working on a major bill called the Digital Services Act, which includes numerous transparency provisions backed by tough penalties for non-compliance, though it will take at least a year before some version becomes law. (Here’s a thoughtful piece at Brookings about mandating transparency for tech.)

If you have any doubts that Facebook dissembles, misleads, evades, and outright lies, as a quasi-institutional policy, consider these other past examples:

  • This article in the Guardian from April details an investigation of internal FB documents which “show how Facebook has allowed major abuses of its platform in poor, small and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries.”
  • From a ProPublica investigation published in February: “Publicly, Facebook has underscored that it cherishes free speech…but behind the scenes in 2018, amid Turkey’s military campaign, Facebook ultimately sided with the government’s demands.” The company “geo-blocked” a site operated by a Kurdish militia, even though it did not in any way violate Facebook’s policies. And even the internal arguments against taking such actions are soaked in dissimulation. Monika Bickert, Facebook’s head of global policy management, wrote in an email ProPublica unearthed that the company should consider not blocking the site because “our decision may draw unwanted attention to our overall geo-blocking policy.”
  • Reaching back a little further: in 2019 Bloomberg broke news about internal tools Facebook used to track sentiment about itself and to respond with “quick promotions” of counter-information. According to the article, the company aggressively used such tools in the Philippines and elsewhere to change opinions and correct misinformation, but only when the opinion and misinformation in question was about Facebook. I know of no evidence that Facebook has used these same approaches to counter political disinformation or hate speech, though I hope it has.

I could go on. There are many more where those came from. I’ve had the sad task of tracking such incidents and articles for a long time. It’s terrible for the world that so much of what Facebook does is conducted primarily as a PR exercise. We need Facebook’s harms to be remediated.

In Axios newsletters like the one where I saw Rosenberg’s piece noted above, Facebook routinely runs disingenuous advertisements insisting it’s in favor of more internet regulation. But laws requiring transparency about things like content reach and engagement are not the kind of regulation it has in mind. Mark Zuckerberg and Sheryl Sandberg believe that with the biggest lobbying shop of any tech company in Washington and enormous expenditures on PR and policy promotion, they can control how they are controlled. I hope they’re wrong. If the world comes to understand just how dishonest Facebook is as an institution, maybe there is hope.