The enormous set of unseemly revelations known as the Facebook Papers shows, among other things, that Facebook is a global media company that does not bear the responsibilities that come with that role. (A full list of the now well over 100 articles, published in more than a score of outlets, is here.)

A media company selects programming and presents it to viewers. Just about every one of the Facebook Papers stories is about how the company decides what content to show and to whom. Yet traditional media companies, unlike Facebook, are required by law and tradition to bear responsibility for the content they present to viewers. Internet companies, by contrast, got an exemption in the now-notorious Section 230 of the Communications Decency Act of 1996, which says web platforms are not responsible for content created by their users. Facebook has applied that logic of non-responsibility to speech and content distributed all over the world.

But Facebook doesn’t just provide a neutral window for content created by friends and organizations you follow. It curates that content and shows it to you selectively, in the order it chooses – in a way that maximizes revenue. What it shows you is its willful decision.

Many of the unseemly revelations in the Facebook Papers flow directly or indirectly from how the company’s algorithms work. It is in the software that decisions about what content is shown to one viewer versus another are actually made. Such software, like a human editor, makes subjective judgments. And the algorithms are themselves governed by human decision making, whether by the programmers who created them or by the later choices of content moderators, reviewers, and public policy executives. As the Washington Post reported in one recent story, “The company may not directly control what any given user posts, but by choosing which types of posts will be seen, it sculpts the information landscape according to its business priorities.”
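
To make concrete what it means to “sculpt the information landscape,” here is a minimal, purely illustrative sketch of an engagement-ranked feed. None of this is Facebook’s actual code; the post fields, weight values, and function names are hypothetical. The point is only that ranking weights are ordinary constants chosen by people, which makes them editorial decisions.

```python
# Purely illustrative: a toy feed ranker. The fields, weights, and names
# below are hypothetical, not Facebook's actual parameters or code.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    likes: int
    comments: int
    reshares: int


# These constants are editorial decisions encoded as numbers. Change them,
# and a different "information landscape" is shown to every user.
WEIGHT_LIKE = 1.0
WEIGHT_COMMENT = 15.0   # posts that provoke argument draw comments
WEIGHT_RESHARE = 30.0   # virality is rewarded most of all


def engagement_score(post: Post) -> float:
    """Score a post by predicted engagement -- a business metric."""
    return (WEIGHT_LIKE * post.likes
            + WEIGHT_COMMENT * post.comments
            + WEIGHT_RESHARE * post.reshares)


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by engagement, not chronology or the user's choice."""
    return sorted(posts, key=engagement_score, reverse=True)
```

No user asked for those weights; someone at the company picked them, and could have picked different ones.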

It is in the poorer parts of the world that the company’s irresponsibility and inattention have the worst consequences. (Facebook revealed in its quarterly report this week that 3.5 billion people globally, nearly half the population of the planet, use its products.) Whistleblower Frances Haugen herself says she was primarily motivated by worries about Facebook harming the “global south.” Her worries are justified. Here is a detailed and hair-raising accounting of Facebook’s misdeeds in poorer parts of the world, as revealed in the Papers, from the great site Rest of World.

Many of the problems Facebook faces are presented by its more forgiving journalistic critics, like Casey Newton, as consequences of its regrettably vast size, with the implication that it cannot really be expected to deal properly with so many issues. “There is a pervasive sense that…no one [in the company] is entirely sure what’s going on,” Newton wrote, a little sympathetically, in his Platformer newsletter on Monday, Oct. 25.

But I have watched Facebook over the past decade and a half make decision after decision in the interest of growth that could have been predicted to lead to the problems we see today, had any thought been given to that possibility. No such thought was given, because growth, and the power and profit that flowed from it, was by far the main thing Facebook cared about.

Take, for example, the rapid expansion over the past decade into country after country, in language after language. Facebook did this enthusiastically and heedlessly. In most cases no thought was given to the moderation challenges that would inevitably ensue. But the kinds of content, privacy, and political missteps Facebook has been accused of in the United States are equally a risk, and a likelihood, in every other country.

To this day Facebook gives essentially no oversight to most content or user speech in distant countries. For this, in my opinion, Facebook is absolutely culpable. I will repeat a statistic from the Facebook documents that I cannot stop talking about: the company allocates only 13% of its moderators’ time to combatting misinformation outside the United States, even though 90% of its users live elsewhere. (On those figures, a U.S. user gets roughly sixty times the per-capita moderation attention of a user anywhere else.)

Let’s call this what it is: a neo-imperialist mindset that values what happens in the U.S. more highly than what happens in other countries. The consequence is less damage in the U.S. and a few selected European allies, and more serious weakening of societies elsewhere. These neglected countries–where literally billions of the company’s users live–are the same ones the company blundered its way into as it sought growth and global hegemony. It’s not that different from how, for decades, the U.S. methodically undermined and overthrew governments perceived as unfriendly to U.S. interests, especially starting in the 1950s, regardless of the consequences for people in those countries.

Facebook spokespeople regularly say that right-wing discord and civic strife in the U.S., India, and elsewhere did not begin on Facebook. That is an idiotic truism. The question is whether Facebook’s actions, oversights, and willful neglect have made them worse. The Facebook Papers offer example after example showing that the answer is yes.

Facebook, had it been led by people with an empathetic and ethical concern for its impact, could have grown more deliberately. It could have taken care to enter a country or support a new language only when it had the resources to properly govern its service there. But that would have slowed the profit engine, and might not have made Mark Zuckerberg the richest 37-year-old in human history.

In the avalanche of material emerging from the Facebook Papers, it’s easy to get lost or overwhelmed. But another quote, from a different Washington Post story, offers clarity: “The documents…provide ample evidence that the company’s internal research over several years had identified ways to diminish the spread of political polarization, conspiracy theories and incitements to violence but that in many instances, executives had declined to implement those steps.”

Note: The company also changed its name this week, to the arrogant-sounding “Meta.” How apt that Facebook would trumpet its move toward a fuller escape from reality for its users just as COP26 was about to begin, underscoring how devastated the actual world those users live in will be. The central idea conveyed by the company’s new direction–that life should inexorably become more and more virtual–feels wrong. So, with things burning down around us, the idea is that we can quietly and guiltlessly escape into a Meta virtual world, where the great Zuckerberg guides us to digital bliss?