Everybody seems to be talking about privacy. So here's a question: what does privacy mean to you? Privacy is one of those words that gets thrown around without a definition. News reports often suggest privacy is about "the safeguarding of personal information," "not sharing personal information," or protecting against "exploitation," whatever that means. Concerns are growing fast: The New York Times recently launched a major privacy project examining the boundaries of privacy, further stimulating the conversation. But to understand privacy, we first have to define it.

Privacy, in its truest form, is anonymity: nobody knows who you are, or that the information is about you. Functionally, privacy means going about our lives in a way that allows every behavior or action to be ephemeral and forgotten. It means our information can't be used to target us with marketing or ads, whether online, by phone, or by email.


Think about how, as recently as the early days of the internet (if you're old enough to remember them), your life consisted of a series of separate, independent networks: a network of work friends, a network from the neighborhood, and a network of favorite stores, offline and online. Each of these networks was distinct, and there was no blurring of lines. A purchase in one online store didn't trigger a digital ad from another. No one at work knew what happened at the neighborhood party, because there was no Instagram geotagging.

So, what happened?

Beginning with the advent of Google, the commercial use of data took two paths. On one hand, data was used to enhance the quality of services. On the other, data became the service itself. Netflix doesn't generally earn incremental profit by leveraging your data to recommend shows; Facebook, by contrast, triangulates your location, connections, and interests to make money off your data by showing you targeted advertisements.

By 2004, the framework for data-as-a-business (call it DaaB) was essentially set, producing the massive data aggregators we know so well today, like Facebook and Google. To capitalize on this model, these companies had to collapse and consolidate our happily separate, independent personal networks into a series of linked, concentric ones.


To illustrate the effect, imagine a dartboard with your life at the bullseye. Each circle around the center is another network of your life, and together they form a digitally documented composite of your identity. The tightest circle is your immediate family and friends; beyond that are your professional connections, affiliations, interests, and, increasingly, all of your behavior, both offline and online. All of those circles are now connected, forming one big circle that companies can examine and use. What was independent is now linked.

Not only are all the networks linked, but internet companies have long memories: your every action is permanently recorded. Through this marriage of linked networks and permanent memory, Big Tech can identify you from the outside looking in and target people affiliated with you from the inside out. The linkage of networks thus feels like an invasion not only of your privacy, but of your friends' as well. The dictionary defines privacy as "the state of being alone or kept apart from others." By that definition, privacy is, practically speaking, nonexistent. That's why, in one of the Times' new essays, columnist Farhad Manjoo writes, "You don't have any privacy online," and "It's time to panic."

Take a look at Facebook’s mission statement: “to make the world more open and connected.” By that logic, the company’s entire purpose is to blend and link our various networks. In other words, to do away with privacy.

Previously, privacy may have existed between networks, if not within any individual network. In today's world, every network is linked, and virtually every action and connection enhances the composite of your digital identity. Nothing is distinct, and nothing expires.

At many Big Tech companies, a commitment to "privacy" has become a cost of doing business, and for good reason. The impasse we now face, however, is that while most of these efforts make us feel more comfortable about sharing our personal information, they don't get at the core of the problem: our data is being aggregated in ways we simply don't understand.

Regaining your privacy boils down to regaining the ability to separate your networks and to control how long your data is warehoused.

The tool that can help most in working toward privacy is transparency. Just witness the uproar around Amazon's failure to adequately disclose that Alexa data was being listened to; Amazon was not being transparent. Facebook, by contrast, deserves credit for beginning to add layers of transparency about why particular ads are shown to us. But transparency should go further, covering how our data is collected, where it's collected, and how it's used. Unveiling this information would reveal where the links between our networks and actions occur, and would let users make decisions and share personal information accordingly.

Big Tech would be well served to remember that competition always lurks nearby, and companies that don't evolve will deservedly underperform. Tomorrow's winners will be those that can manage an ongoing cycle of earning, retaining, and reinforcing trust. This is why the privacy debate is so mission critical, and, for some companies, existential.

As consumers, we should continue to demand transparency to better understand the tradeoffs we face and the level of privacy we are willing to sacrifice.

James Cakmak was a Wall Street security analyst for over 10 years covering the Internet sector. He is also co-founder of Snailz, a digital beauty booking marketplace operating in New York. Follow him on Twitter: @JamesCakmak.

Ryan Guttridge, Adjunct Professor at Smith School of Business, University of Maryland, contributed to this article.