Internet Giants and The World

An unprecedented new kind of company is gaining a new breadth of influence. Facebook has data on two billion people, and its algorithms guide what those people see. Google’s impact is similarly vast. And companies in all sorts of industries entrust their most precious systems and data to Amazon’s AWS. Commerce, communication, and to a good extent our economy now depend on these companies. We trust them, more or less, but as they command so much data, many wonder about oversight. The EU is already seeking restrictions on how a person’s data can be used. How should society best balance this great power against the public welfare?
 
Kirkpatrick: The session is called “Internet Giants and the World.” At one time I considered calling it “Internet Giants vs. The World,” but then I was also trying to convince Zuckerberg to come; that was one reason it maybe shifted. But anyway, the fact is we are in a fundamentally different reality, and I hope I was politely but sufficiently pressing Mark Zuckerberg on that last night. I believe several people on this panel, maybe all of them, would agree with me about the emergence of a certain kind of company, which Facebook, Amazon, and Google perhaps most exemplify, and possibly Apple and Microsoft, and maybe even in the not-distant future Uber, which Bill is on the board of.
This is a fundamentally new reality: a truly global enterprise that has gigantic amounts of data about citizens of hundreds of countries and that operates in an essentially lawless environment, because there is no global legal regime for these global enterprises. It’s just something we don’t know how to deal with, and we have to begin that conversation. It’s not because I think these companies are doing wrong. I, as anyone can probably easily tell, have been a big cheerleader for the emergence of this reality. But there is this weird dichotomy, and now I really want to hear from anybody—maybe Hemant could start, because he and I got this conversation going through dialogues we’ve been having. We all love these companies and use their products. I would assert that every single person in this room in one way or another loves and uses all three: Facebook, Amazon, and Google. Yet I think probably most of us in this room also have grave reservations about certain aspects of their behavior and the risks they might pose to our lives and society in the future. That’s the weird moment we’re in. Hemant, either disagree with me, or amplify it, or say what you think about this.
Hemant Taneja, like Bill, is a venture capitalist. He is a managing director at General Catalyst, which manages $3.75 billion. Among the companies that he’s invested in and is very active in: Snapchat, TuneIn, ClassDojo, Fundbox, Digit, Fractyl, Highfive, Stripe, major companies. He’s on the board of Khan Academy and co-founded Advanced Energy Economy, which is a public policy advocacy group. Another interesting thing about him is that he has five degrees from MIT.
Now we’ll go to the other end, to a sort of similar kind of person, Bill Gurley, a very longtime friend of mine, who even married someone named Kirkpatrick, which I always liked about him, among many other things. I helped Bill get started writing; that’s a fact, right? Would you acknowledge that?
Gurley: Yes. That exposes—I’d say the year but it would expose how old both of us are.
Kirkpatrick: [LAUGHS] Bill wrote for Fortune back in the day, when I was there. He’s always been an amazing writer in addition to a very sage analyst, and now an extremely successful venture capitalist who, as I mentioned, is on the board of Uber. He’s invested in and been very deeply involved with Zillow, Grubhub, LiveOps, OpenTable, Stitch Fix. I mean, it’s a very impressive group of companies. Benchmark is a very impressive investment firm. He was named “VC of the Year” at the most recent TechCrunch Crunchies Awards, in March of this year. He started as an engineer at Compaq.
Marc Rotenberg is a completely different kind of person, also somebody I got to know when I was writing my book on Facebook. He is the Executive Director and President of the Electronic Privacy Information Center in Washington. He’s a major expert and advocate on issues of privacy, and he focuses very heavily on technology and the issues that surround it. He testified before the 9/11 Commission. So that’s who we have up here. Now, back to Hemant: what do you think?
Taneja: I think our conversation started because we now have a handful of companies that largely control content, community, and commerce globally. The products and services they provide are fairly systemic in our lives. One of the things I think a lot about is, well, where does that go? Such systemic services end up being regulated, and that’s bad for what Silicon Valley is about, what we do here in the Valley in terms of entrepreneurs and investors. One of the reasons this is a complex issue is that these companies have come to such positions of power really fast. They have young entrepreneurs who are learning how to do this as they go—Bill, you’ve got Uber learning how to do this all over the country, and Airbnb, which we’re investors in, is trying to figure out what its relationship is going to be with different cities. So the entrepreneurs are growing up, these companies are growing up, and meanwhile their scale and influence is astronomical. For us it really is about how these companies, and the ones that follow, the new entrepreneurs that are building businesses, do this responsibly so that the answer isn’t regulation. When you start regulating sectors, they become uninteresting, because then entrepreneurs are servicing the regulators, not the customers. Look at the NPS of traditionally regulated businesses like utilities and power companies: everybody universally hates those companies even though they provide great, critical services. The reason is that they just don’t think with a customer-first mindset. And then it atrophies. Take energy: power companies provide a great service and we hate them. That, I think, is one of the reasons we’re in this whole climate change issue, because that’s a sector that cannot embrace innovation. Is that what’s going to happen to our technology sector? Or can we, as an innovation community, do this responsibly and transparently, so that we can continue to build on all the great work that’s been done since the advent of social, mobile, and cloud and how ubiquitous that’s become?
Kirkpatrick: So you don’t have the answer but you’re worried?
Taneja: I’m worried, absolutely.
Kirkpatrick: Bill, what do you think?
Gurley: One thing I just wanted to reiterate that you touched on is that everyone likes the things that are enabled by having the data. We get pulled toward this problem. When I land in Los Angeles and get off the plane and open Uber, it remembers the address of the last place I visited in L.A., and it’s one click, and you go, “Wow! That’s magic, I love it!” That’s true of all types of personalization: you’re impressed, you like it, so you get drawn in. From the business side, the more personalization they do, the more conversion they get, and the better the results. You sell more goods, you get more page views. There’s a yearning on both sides to go deeper into this place. One could argue it’s simply an extension of where we were before; magazine companies have sold lists with demographic data for years. But I think we all know the amount of data has now gone up exponentially. There’s an article I found yesterday in The Washington Post, and I can’t prove that it’s true: “Ninety-eight Personal Data Points that Facebook Uses to Target Ads to You.” It’s not only the data they’ve collected themselves; they’ve linked it to Acxiom and these other databases. The list is kind of shocking if you didn’t know it was happening.
Kirkpatrick: Should we point out the irony that the newspaper that wrote that is owned by Jeff Bezos, who is doing many of the same things? [LAUGHTER]
Gurley: But he doesn’t have editorial control, right?
Kirkpatrick: Who knows?
Gurley: It gets tricky. I would mention a couple things that you might find surprising, that I have found surprising over the past couple of years. One, I believe it is vague at best and possibly true that Google will respond to a legal subpoena for Gmail without telling you. So if you’re running your corporate server on Gmail and a lawyer subpoenas that server and Google feels they need to comply, they won’t actually tell you that they gave up your email.
Kirkpatrick: They don’t have a legal requirement to do that as far as you know.
Gurley: And if I’m wrong on this, I’d love for someone to tell me.
Kirkpatrick: Marc probably knows the answer to that.
Gurley: But that’s pretty scary, if you’ve ever sat through a deposition, stuff can be taken out of context. The argument “Well I don’t do anything illegal so I don’t care about that,” you haven’t been in a deposition if you believe that.
So another one that’s surprising, and I tested it again this morning because I just can’t believe it’s true: if you type someone’s mobile number into search on Facebook, it’ll bring up their profile. See, look at your face: I tried your number, your mobile number, this morning to double-check it.
Kirkpatrick: Is that—?
Gurley: Do it, do it right now: people that have computers open.
Kirkpatrick: And that’s not supposed to be the case.
Gurley: I don’t think anyone knows this; it’s amazing. I use it for recruiting, but now it will probably go away. I tested it with people I’m not linked to.
Kirkpatrick: Really?
Gurley: Yeah.
Kirkpatrick: Weird. Who’s surprised about that?
Gurley: It’s tough. I agree with what was said earlier, maybe I’ll stop at that point, which is: I would love to find ways to self-regulate. I think one of the reasons Silicon Valley works is it’s so far away from Washington DC.
Kirkpatrick: OK Marc, what do you think?
Rotenberg: Well, there’s a lot to work with. [LAUGHTER] It’s interesting to me how many times the word ‘privacy’ came up over the last few days. Just to pick up on Bill’s point, and to defend the role of privacy advocates: I’m a believer in technology, and I think it would be great if we could create a world in which technology innovation goes forward, economic growth goes forward, there’s customer confidence, and people feel good about the future. But I don’t think that’s the world today. Frankly, I think Silicon Valley has contributed to the problem. We talk a lot about successes, but let’s talk for at least a moment about some of the problems. Let’s talk about Facebook, for example. We became aware of this back in 2010, when Facebook decided to change the user privacy settings. People take some time to do their privacy settings; I think we talked about this around your book. Facebook said, “Well, you know, it’s too difficult. People aren’t signing up for the stuff we want them to sign up for, so we’re going to actually sign them up for our new business alliance, and if they don’t like it they can opt out.”
We said, “That’s terribly unfair to people. They take the time; you should respect their privacy preferences.” We actually went to the Federal Trade Commission and said, “That’s unfair, it’s deceptive. Simply respect their privacy preferences.” The Federal Trade Commission agreed with us, and we got this big, huge consent order and settlement, and we were feeling very proud. Again, it wasn’t against Facebook or the platform; it was simply a request to make sure that users’ privacy preferences were being respected. But then, you see, this is what happened: the Federal Trade Commission failed to enforce that consent order. The same thing happened with Google and Buzz, by the way. Remember Buzz? They took their Gmail users and said, “You’re all now part of Buzz, because we need to compete with Facebook, we need a social network service, and we don’t have an adequate user base.” They signed everybody up for Buzz, and people said, “Hold it, that’s my private email address, and you’re going to make that public to seed a social network service? That doesn’t make any sense at all.” We went to the FTC on that one as well, and again we got a good settlement. So we were actually feeling pretty good at the time. We were thinking that companies were going to respect user privacy preferences, but the FTC didn’t enforce those orders. The word got out that you could pretty much ignore the regulators, and companies continued to change privacy preferences and continued to take user data. I would actually have favored self-regulation if it could be made to work, but I don’t think it works. I just don’t think it does.
Kirkpatrick: What I really want to press you on is, “What would work?” You look at what’s happening in Europe: something that doesn’t work seems to be growing, and it’s a lot of different things. The whole idea of data sovereignty, which has come under a lot of criticism on this stage in the last two days, indirectly and directly, simply doesn’t work in a truly interconnected global economy, and yet many countries are moving in that direction anyway. If you look at the direction of much of the political talk in Germany, Facebook shouldn’t be allowed to operate there, based on its presumptions about how identity ought to be handled, etcetera. That’s been true for a while, but the momentum seems to be growing for a more restrictive attitude, particularly in Europe, and I think it’s possibly going to come here. The point I’m trying to get you to respond to is that despite that, for those of us who do believe that the value these companies bring is unassailably genuine, a middle ground is hard to envision. Have you figured out a middle ground?
Rotenberg: I don’t know if I’ve figured it out, but I’ve certainly been working at it for a long time. We favor a comprehensive framework for privacy protection, a global framework that enables the flow of personal data, much like a communication network. Would you use a communication network if you thought that your private communications were going to be overheard by others? Probably not. So here is actually one of the paradoxes of privacy: if you can ensure privacy, if you can ensure security, you actually create a network effect that is trustable. Today we have a network effect, to be sure, but very few people trust it, and that’s the problem. That’s why you need a global legal framework. Last point, just on the data sovereignty thing: Europeans have a real concern, a real economic concern. The data of their users is going to the US, where it’s not protected, and of course, post-Snowden, they worry that the data is going not only to the internet companies but also to the NSA. If you’re a European leader trying to explain to your citizens what’s happening to their personal information in this new world, you don’t have a lot to say, unless you say, “Well, we’ve got to bring our data back home to at least protect the information of our people.” I think it’s actually quite easy to anticipate.
Kirkpatrick: I have a question for either Hemant or Bill. Union Square Ventures is just two blocks from our office in New York, and they’re very good friends of ours; we talk to them a lot. I know that at that firm, even though I don’t know what they’ve actually done about it, there is a belief that the blockchain could be a tool here that really might make a difference. As for what Marc just said about a global, authentication-based something-or-other, I don’t know how it would be implemented, but whenever anybody has offered even a mildly plausible way of explaining what it could do, they often say the blockchain could be involved. Bill seems to be skeptical, so what do you think about that?
Gurley: I don’t know. I think that a lot of VC investors that aren’t getting rich off Bitcoin have come to this narrative [LAUGHTER].
Rotenberg: I’ll second that.
Kirkpatrick: I don’t want to say that those people are doing anything; it’s just a hypothesis they’ve raised. How can you deal with Facebook, maybe with blockchain?
Gurley: The problem, I mean, I’d love to hear his answer to this, but one of the problems is that if you build something that’s inherently secure, it gets the governments all wigged out. The tool that’s going crazy now in the messaging space is Telegram because people perceive it to have—and I don’t even know if it does—but they perceive it to have higher privacy, but it also has a stigma because people say, “That’s what ISIS uses.” I don’t know if that’s a marketing—
Kirkpatrick: Not to mention, came out of Russia.
Gurley: Yeah, that’s another conundrum. Do we want that kind of thing to exist or not?
Taneja: I don’t think it’s a technology problem. If privacy needs to be fixed, because that’s the right answer and that’s what consumers want over the convenience of these services as you described, it can be done. I think the network graph that Facebook has is not going anywhere. People will still communicate; at this point that all exists. I think the question is, “What does the consumer actually value more?” And the reality, in the end, is that they value these conveniences over privacy. That’s the fundamental issue.
Kirkpatrick: Especially outside the developed world.
Taneja: That’s right. So the question is, “What is the responsible framework that we all consult regularly? What are the core principles that these companies should follow?” To me, that framework doesn’t exist, and we need to think about defining it.
Gurley: And one other thing to think about as we attempt to solve it: it’s very easy, as I think about it, to extrapolate even five or ten years from now. The amount of data being collected is going to be way worse.
Kirkpatrick: Speaking as a Board member of Uber.
Gurley: Nvidia blew out their numbers yesterday. Do you know why? Because they’re selling servers that are being used for data science, and they sold every one they could make. The stock is up 300% in a year. That’s what’s going on. People are crunching more and more data.
Kirkpatrick: When you talk, Hemant, it sounds like, are you just trying to create a movement of inquiry, or do you have an idea of what we should actually do?
Taneja: I think this is the missing conversation, and this conversation probably just became harder given what happened on Tuesday and Wednesday with the election. In my opinion the regulators don’t really understand the magnitude of what’s happening, and to the extent that they think it’s a problem, how much worse it’s actually getting, as Bill is saying. This is on an exponential curve in terms of how much data is getting created and how much we’re starting to understand about every single thing in our lives, including health and education and everything else. All of that is now getting captured somewhere in some cloud and being crunched on. The question is, “What is the license to use that data in these companies, and what is the transparency around it so we can regulate?” Some of these companies are monopolies. In the last era you thought about regulating monopolies either by making them utilities or by breaking them up; we were talking about this earlier. What is the framework to say, “These monopolies aren’t going to be anti-competitive”? I think that conversation needs to be in the open and needs to be figured out, in my opinion. We’re very far away from understanding that.
Kirkpatrick: Well, I don’t know how Zuckerberg could have responded to this, but one thing I honestly do worry about is what would happen if he were hit by a bus. I think he’s a great guy; I really think that came through very clearly last night, and the values are almost unassailable. And yet, what is the governance mechanism that would ensure, if the current leadership were by any means replaced—and it’s just another shareholder-owned company, so that can definitely happen under some circumstances—that our data wouldn’t be treated completely differently?
Gurley: I don’t even think you have to go that far because of cyber security. If Facebook is gathering more profile data on the individual than any company on the planet, and they then get hacked, and then that data is distributed, the liability to the individual is much, much higher.
Rotenberg: I think blockchain’s going to fix that. [LAUGHTER]
Gurley: I was just wondering, for this panel: the more data you store, should your cyber insurance be higher?
Kirkpatrick: We’d love to see their insurance bill.
Gurley: But you know what I’m saying, right? You could have a catastrophic hacking that then could expose all the data and Mark could still be alive and not hit by a bus.
Kirkpatrick: I have to believe there are a lot of people in this room who would like to enter into this conversation, please, identify yourself?
Anthony DiMare: Thank you. My name’s Anthony, and I work at a small startup in New York. We are collecting data from commercial ships. That conversation, specifically around data collection, data storage, who has access to that data, and balancing value versus risk, is something we have discussed at the very core, because there is no legislation yet in this space around who owns that data. I cannot help but think I’m very lucky that I can have that discussion with our customers, but someone like Facebook cannot have this discussion with their customers. I just agree, and I think this is important.
Gurley: One thing that could potentially happen, and he’s probably written whitepapers on it, is that there could be more transparency as to what is had.
Kirkpatrick: What is?
Gurley: What they have. And not just Facebook, any of these companies. You could take it to an ultimate limit where the consumer gets to see what is had.
Kirkpatrick: Would Uber be cool with that?
Gurley: I don’t know.
Rotenberg: Can I jump in on that? I think that’s a really important point. I mentioned one paradox of privacy. I actually do think it enables the flow of information if you want trust. If you don’t care about trust, then you don’t have to have privacy. But it enables the flow of information with trust. Now here’s the second paradox: transparency helps with privacy because it gives people the opportunity to know what information about them is being collected and how it’s being used and actually makes the company more accountable to the customer.
What I’ve observed over the last few years is that, as internet users have lost privacy, companies and governments have gained a lot of secrecy. They’re gathering a lot of information, we don’t know how it’s being used, and suddenly we feel a great asymmetry in how decisions are being made. I don’t think that’s sustainable; I think it’s going to lead to a bad outcome. One of the ways you restore the balance is by putting a window where there was once a mirror.
Taneja: I think this transparency issue is where I am somewhat focused. Where is the government’s AI department that can actually see that these algorithms are unbiased? Yesterday, I think, I wasn’t here, but I heard Mark was talking about how there are biases on both sides, that there was probably incorrect data on both sides of the election—
Kirkpatrick: Fake news on both sides.
Taneja: —fake news on both sides. That’s probably true, but what if it were actually verifiable? What if it were actually measurable, so that we truly understood what’s going on? Right now you have to rely on these companies, which have other interests, even if they have the right intentions. Making that more transparent and measurable is a tall order, I think, and we need to think about the capabilities of the government today.
Kirkpatrick: That one is measurable, and the data shows there was fake news on both sides, but it was two-to-one, Trump over Hillary.
Taneja: Systemically building that into everything, whether its news or commerce or whatever else, that sort of a capability doesn’t exist yet.
Kirkpatrick: If that were to happen would it be a voluntary thing?
Taneja: I think it will help us self-regulate.
Kirkpatrick: Self-regulate. I wanted Strat to say something. What do you think about this?
Sherman: First of all, Marc, I’m just so happy to see you up on the stage. If I’d known who you were, I would have let you ask every question you wanted this morning.
I think this is the best conversation I’ve heard in a long time. It’s about fifteen years too late [LAUGHTER]. David, you and I were together, I remember, at a security conference about fifteen years ago in New York City. Everybody knew all of this was going to happen. The opt-in, opt-out thing: totally known. The lack of privacy and the asymmetry that Marc’s talking about: totally known. And what happened in the meantime? We’ve created a whole bunch of new billionaires.
Kirkpatrick: But we’ve also created a whole bunch of great services that we love.
Sherman: Which we’re not paying for at their true economic cost. I was at one of your conferences in Tucson or something a little while ago. God, I wish I could remember her name; the genius woman we’ve known our whole lives was out there. And I was saying that the value Facebook is taking from us in our individual privacy rights, which it is then monetizing, is significant. There’s a huge asymmetry there: tricking a 13-year-old into giving up her privacy rights before she can even really think about it properly, and making a ton of money off of her. And she said, “Eh, I’d be happy to lose the $3.21 that my privacy is worth.” I suspect it’s worth more than that.
Gurley: I think, once again, the drive to do this is really high. There’s a feature that Facebook had this week which said, “Find out who won the elections in your neighborhood.” And rather than asking for a zip code, it asked for a full address in order to see the results. I suspect that was to enhance the data profile.
Kirkpatrick: You know about things like this, your suspicions could be valid.
Gurley: It’s going to keep going. I do think there are cases where, when trust is eroded, you actually lose the customer. I would suggest, fundamentally, that Snapchat exists because Facebook eroded trust with its users, absolutely. The kids were told, “Get off of Facebook, because this stuff will expose you and you won’t get into college,” and it created an opportunity for a new company.
Rotenberg: Can I just jump in? Because that’s a really interesting example. In 2014, Facebook acquires WhatsApp. Facebook doesn’t have such a great privacy policy, but WhatsApp actually did. We wrote to the FTC and others and said, “Hang on, just a moment, we’ve got to be sure to protect the privacy interests of those WhatsApp users because, by the way, in the self-regulatory free market world, they chose WhatsApp as opposed to Facebook.” We said, “You can’t let Facebook get access to that user data, because if it does, the whole model collapses.” In August of this year, WhatsApp says, “Well, we’ve kind of rethought this and now we’re going to give the verified telephone number”—there’s the telephone number—“over to Facebook.” Which is, of course, the key to the profile, right?
Gurley: It’s the most useable UID.
Rotenberg: Without question. But this is where it gets interesting. In some ways, people predicted that, they said that was going to happen. We tried to stop that from happening. But now you see the regulatory responses around the world. It’s Europe, not surprisingly, but it’s also Japan, it’s also India, it’s also South America. The FTC is not sure what to do. We went to the FTC. All we’re asking them to do is enforce the commitment that the company made in 2014.
Taneja: I do think this is an interesting point. When Facebook acquired Instagram, the entire focus was on, “Fifteen folks, $1 billion for fifteen folks: that was a crazy move.” The reality is, if we were mature enough about it as an industry, we would say that is actually an FTC conversation. Now you’re talking about hundreds of millions of people becoming part of the same consumer platform, and there’s a consolidation of consumer access. As opposed to thinking about this in terms of revenues, because there were no revenues, the mind-share quotient that these companies got by consolidating is tremendous. I don’t think people thought that way three or four years ago when this was actually happening.
Kirkpatrick: I just want to layer back in something that I tried to say at the beginning, and then I want to hear from David Chaum, who had his hand up. It’s not just the privacy question; let’s not get too carried away that that’s all we’re talking about here. I would reiterate: this is a new type of entity in society that is truly global and has inordinate power, in many cases, probably in most cases, more power of a variety of types than governments. It’s really a question of, “What is the oversight capability for this new type of entity?” You and I have discussed this. I just want to keep that in the mix here. David, identify yourself.
Chaum: David Chaum, of PrivaTegrity. I was the inventor of eCash; DigiCash was my company back in the day. I think the way to cut the Gordian knot here is simply to—the cat’s already out of the bag; we cannot re-secure the privacy that’s already been lost, given the willingness of people to surrender information and the value they’ve received for doing so. What we can do, and what is probably essential for a resurgence and survival of democracy in fact, would be to create a protected sphere: a separate, new Facebook in which there are very strong guarantees of privacy but where there must be protection against abuses, as Marc said. Where the policy that governs it is cast in code, not in the goodwill of a corporation at the moment, and where the data sovereignty is spread over a number of jurisdictions, so that in order to compromise the privacy of individuals, any government would have to respect the rules that are baked into the code. This is, in fact, what PrivaTegrity aims to do; there was an article about it in Wired a few months ago, and you might wish to reread it keeping that in mind. There are many aspects of people’s interactions that deserve this kind of protected sphere, and it is from those that the empowerment of the individual to set public policy and participate in democracy derives, so we cannot afford to erode this.
Kirkpatrick: Okay, quick comment, and then I want to hear from Tara.
Rotenberg: Two quick points. One, David Chaum is a very important reminder that there is a lot of innovation in the privacy field, and I think it’s very important not to set innovation on one side and privacy on the other and always talk about tradeoffs. David has been a big hero to many of us who say, “Let’s look for technical solutions in addition to legislative solutions.” And I quickly want to pick up, David, on your big-picture point about what’s going on in our new world. I was listening to Mark talk about the newsfeed and the balance and whether we have the algorithm right, and it occurred to me that this is actually not the right question if you believe in a distributed democracy and multiple information sources. You can’t have one entity, any entity, even with a benign leader, saying, “Let’s find the right setting on the dial and be satisfied with that outcome.” Because, of course, what a democracy requires is many, many different sources of information and lots of competition.
Kirkpatrick: Well, then you would celebrate him for his investment in Snapchat. Unfortunately or fortunately, for the time being the world’s population has, more or less, chosen. I’m not saying an alternative couldn’t emerge, and if you listen to some of the promises about Uber, which Bill knows so well, even Uber has aspirations to do a whole bunch of new stuff down the road that could be part of this conversation in some quite intriguing ways. It’s not that we can’t expect that competition might emerge, but we have to admit it doesn’t really exist now at the scale we’re talking about.
Gurley: I do think it’s reasonable to suggest that the minimum bar might be to disclose specifically to the user what is being kept and what is not.
Kirkpatrick: I think Facebook would say that they already do that; it takes a little looking to find. They would claim that. Maybe there are other subtle subsets of things that they won’t tell us.
Gurley: I guess you could argue about who should be the judge of whether that’s being done or not.
Kirkpatrick: There you get into regulation. Who should be the judge?
Gurley: I don’t know.
Kirkpatrick: That’s what I mean.
Gurley: One place they don’t go is they don’t show you the data that they’re keeping. That’s a level of transparency that’s not there.
Kirkpatrick: They show you—
Rotenberg: But it should be there.
Gurley: Not verbiage that says what we’re storing. Like, “This is explicitly what we have, and uncheck this box if you want us to stop storing.”
Taneja: The things you don’t want to use.
Kirkpatrick: They have things like that now, “We think that you like fishing, if you don’t really like fishing, uncheck it.”
Gurley: That’s to make the ads run. [LAUGHTER]
Rotenberg: It’s as if you had a bank statement that didn’t show you your transactions for the month, and they said “we’re transparent” because it’s got your address at the top.
Gurley: And by the way, we shouldn’t just talk about Facebook. That’s Google, that’s Amazon, that’s Apple.
Kirkpatrick: I actually admire Mark for coming, knowing that I was going to ask some of those tough questions. I don’t think Jeff Bezos would be willing to sit for the kind of interview that I had with Mark last night. I actually think Amazon is as much a part of this conversation as any of them, if not possibly more so. My thing about Amazon is AWS. There is hardly a company you can name that isn’t using that service these days. And all that data “owned” by all those companies is being held by another company whose policies on it may be disclosed to those customers but haven’t really been disclosed to society.
Taneja: This is the issue. They have all the consumers, they have all the data, and therefore they are going to have the best AI: this just sort of exponentially compounds on itself.
Kirkpatrick: And do I have an Amazon Prime Account and just love their ass? Yes I do! But I’m just saying, it’s weird. [LAUGHTER].
Q1: I have a couple of points.
I’ve been working with Marc and others on privacy advocacy since 1995. I started ten companies and invested in many more, so I work on the entrepreneurial side, but I worked so hard on privacy because we knew what we were doing as the people who were building it. I ended up becoming the president of the Electronic Frontier Foundation because I was so unhappy with what we were doing. We formed TRUSTe, a self-regulatory body, early on, to do exactly what you pointed out, Bill, about what the transparency should be. The industry pushed back so hard on us, even with a blended board, that self-regulation under that model did not work, as much as I wanted it, the same way you’re advocating it now. So it was not very effective under that structure; that’s just to give us a bigger context. People were pushing on, “What are the harms?” And the harms we haven’t even gotten to here. I think the movement away from the word privacy toward notions of identity matters: soft identifiers, hard identifiers, intellectual property, networks and associations, your wake, where you’ve been and what you’ve done. What is the value of that? How does it affect your transactions in the world, how does it affect what you see, and how do you put people back? That’s important. It’s important for those of us who have been in it a long time to step back and say, companies come and go. The life of a public company is very short. We own the government, and the government will be here for a long time. So as opposed to saying, “Regulators are bad, policies are bad, companies are good,” I think we really need to revisit that, revisit all the language around it, look at what failed in our first 20-year effort, and try to recraft it for the next part.
Kirkpatrick: You know, something occurred to me as you were saying that. It’s weird when you hear a thought like that, that we now have a president who is a capitalist who often talks like a socialist. For better or for worse, and all the other things that he does are very complicated and controversial, but we may be at a moment, and I think probably everyone in this room would say this, when some form of social responsibility could be more formally instantiated, I keep using this word instantiated, but put into the way we think about shareholder-owned companies, etcetera. A lot of companies are themselves hungering for that. If you talk to people like Unilever, a company we know well—maybe there’s a whole new concept we have to get to—unfortunately or fortunately, maybe this very issue will force us to a new concept of the relationship between business and society.
Rotenberg: David, we’re actually hopeful on this, and I agree with what you said. We entered this election year—EPIC, I should say, is nonpartisan, and we’ve always had the view that privacy is a nonpartisan issue. It doesn’t matter where in the country you are or what your party affiliation is; it matters for you and your family. So we launched a campaign. This is my button. The phrase we took was simply “data protection.” The icon, however, is an encrypted iPhone, so there’s a little joke there. But we said data protection is an important issue, and we would really like to hear the candidates say what they think the government should do—
Kirkpatrick: Ha Ha Ha Ha [LAUGHTER]
Rotenberg: You’re right; in a way, it became very weird, because it was her private speeches, his private tax records, Podesta’s email. We had the privacy conversation; it was just about all the wrong stuff.
Kirkpatrick: Trump did criticize Apple for not cooperating with the FBI.
Rotenberg: I think that was a mistake. You actually go back and you find that one of the reasons Apple put in place strong security measures was because law enforcement said, “We’ve got a problem in this country with stolen cell phones, and if you don’t do more to protect the security of stolen cell phones, we’re going to have a lot more crime.”
Kirkpatrick: We actually have to end, but it’s interesting. One thing I often say, and Hemant and I have talked about this too, is that Apple and Microsoft in some ways should be coming up more than they do. The reason they’re not coming up is that they are both more mature companies that have decided that protecting the privacy of users is a key part of their corporate identity. They’ve made that very, very clear, and I admire them tremendously for it. In some ways, I sometimes wonder whether the whole issue about this kind of company is just an issue of corporate maturity. Let’s face it, Facebook is the youngest of all these companies. It also happens, in some ways, to be the most successful, with the most data.
Gurley: Look, I think there are two things that will be universally true. First, if the harm does increase, any one of these companies that is storing a lot of data and has a breach that causes major harm is going to take a huge hit; their business will suffer, I would argue. The second thing is that I always go back to what people call the “Wall Street Journal rule”: if you’re doing something and it were exposed on the pages of the Wall Street Journal, would you be embarrassed by it? I think that’s a test all of these CEOs should be using to self-regulate what they’re doing internally, because they’re putting their brand at risk if it gets exposed.
Kirkpatrick: And with the new hacking environment we’re in, it’s more and more likely to get exposed. So you advise your companies to think that way?
Gurley: Yes.
Kirkpatrick: Any final thoughts from anybody?
Taneja: I’m a bit more optimistic on self-regulation. I think the world is in a very different place, and this has to be taken a lot more seriously today than in ’95, so I’m hopeful we will make progress on it in the next few years.
Kirkpatrick: Marc?
Rotenberg: If self-regulation could work, I’d be all in favor of it, but Tara’s experience and my experience is that it doesn’t, and I think we need to find a new solution, one that’s pro-innovation and pro-growth but most certainly pro-user and pro-privacy.
Kirkpatrick: I’m glad we had this conversation. It was especially gratifying that you, Marc, agreed to come out fairly last-minute, and I really think you added a critical element. So, thank you for that.
Rotenberg: Thank you for that.
Kirkpatrick: Thank you Bill.
Gurley: Thank you.
Kirkpatrick: Thank you Hemant, thank you all of you.

[END]

Transcription by RA Fisher Ink
