Unlike any time in human history, vast power is in private hands. This is alarming whether you log in to Facebook or not.

Facebook’s real-world impact was vividly illustrated in the recent Myanmar military coup and the U.S. January 6 insurrection. In such situations, Facebook is a critical tool both for organizing and for distributing disinformation that may lead to further harm, chaos, and, in some cases, death. These problems are too big for any one person or company to address.


Measured simply by the number of people, Facebook is the largest community in the world, far bigger than any country: 3.1 billion people log in monthly across its core family of apps, and 80% of them are outside the United States. Yet none of these people has a say in the management of a platform that is integral to their lives.

Our instinct towards sociality—our natural tendency to form communities—is being pressure-tested as the world becomes increasingly interconnected. We need to re-imagine our social contract as we learn to live in a world defined by internet 3.0. Facebook and other platforms must somehow become truly equitable, inclusive, and safe for all their stakeholders, not merely serve shareholders.

We are not arguing to abolish Facebook, nor is Zuckerberg himself our focus. Instead of questioning the personal morality of the founder, we should question his consolidated power. One person shouldn’t be making “corner office” decisions about the nature of free speech, whether a nation-state president should be de-platformed, or what world leaders can or cannot say on the site.


The core issue is that Facebook has much more power than should be in private hands.

Facebook frames the issues and debates we have about the platform to its benefit and takes control of the narrative. Through that framing, it continues to do what it does best: distract our attention. 

The Trap of Distractions

Facebook’s distractions take many forms. It argues that its technology is too complex for regulators. It presents us with overly simplistic views, suggesting that we can have freedom of speech or regulation, but not both. A similar binary argument is also used for privacy. Facebook often positions itself as the best guardian of free speech. It acts as if self-regulation is enough, even as it buys ads all over news sites arguing that we need more internet regulation, without saying what that regulation should look like. It even funded its own “Supreme Court,” the Oversight Board.

Regulating Facebook is not impossible. Creative policy thinking is needed from governments, advocacy groups, legal minds, and concerned citizens. In fact, the U.S. Congress is aggressively moving forward with five major bills to regulate such companies, and President Biden recently named Lina Khan, a prominent critic of the platforms, as the powerful chair of the Federal Trade Commission, which is mandated, among other things, to regulate competition and prosecute antitrust violations. European laws like the GDPR have already attempted to add stronger privacy protections.

These bills are a start, but more imaginative policymaking is needed. The platforms’ borderless business models must be reformed so that people are not commodified as products. Moreover, the protections one enjoys depend on where one lives, effectively creating second-class digital citizens. One of the greatest challenges Facebook poses is that it is truly global, while regulations are national or, at best, regional, as in the case of the EU.

Self-regulation is inherently unlikely to work, not least because Facebook consistently fails to apply its own terms of service when it attempts to regulate its members’ behavior. And certainly, no self-reform by the company to date has come close to matching the damage it has caused. Cosmetic changes will not address the crisis in which we find ourselves, either: Facebook was, for example, by far the most essential platform for those charged in the January 6 insurrection.

As for the Oversight Board, it’s not so much that the idea or the structure of the board itself is wrong, but that the entity exists only as a derivative of Facebook, which created and pays for it. In fact, the Oversight Board highlights how Facebook wants to shape its governance post-Zuckerberg. The board again distracts us from the fact that Facebook shouldn’t be determining how it is to be held accountable.

Break Free and Re-Imagine

All of these distractions draw lawmakers and users into a narrative Facebook designs. We must break ourselves free of this line of thinking to address the core issue of this vast power in private hands.

It’s time for users themselves to be part of the conversation in envisioning a new precedent for power and Facebook. The platform should not have power over us; it should share power with us. COVID-19 reinforced the importance of human-centered online connection when we’re physically isolated. Yet through the company’s failures to oversee content, Facebook’s users are often denied accurate health information; instead, the platform too often hosts vaccine misinformation.

When you think about tomorrow, what do you want it to look like? How do you want to connect with people online? Who do you want to determine this experience?

We can proactively help shape the technology that’s integral to our lives, but the first step is to stop falling for Facebook’s distractions.

While we should be critical about tech’s private power, we should never be cynical about the promise and possibility of human-centered digital connection.

Dr. Vigjilenca Abazi is an award-winning legal scholar with expertise in privacy and free speech.

Helen Todd is Co-founder and CEO of Sociality Squared, a social media agency established in 2010 in New York City.