Cloudflare TV

Fireside Chat: Alissa Starzak + Cindy Cohn (EFF)

Presented by: Alissa Starzak, Cindy Cohn
Originally aired on October 13, 2020 @ 12:30 PM - 1:00 PM EDT

A special conversation with Cindy Cohn, joined by Alissa Starzak (Head of Public Policy, Cloudflare)

Cindy Cohn is the Executive Director of the Electronic Frontier Foundation. From 2000-2015 she served as EFF’s Legal Director as well as its General Counsel. Ms. Cohn first became involved with EFF in 1993, when EFF asked her to serve as the outside lead attorney in Bernstein v. Dept. of Justice, the successful First Amendment challenge to the U.S. export restrictions on cryptography.

English
Fireside Chat

Transcript (Beta)

Welcome everyone. I'm Alissa Starzak. I'm Head of Public Policy at Cloudflare and I'm here with Cindy Cohn, who's the Executive Director of the Electronic Frontier Foundation.

And I'm so happy to have you here today, Cindy. Oh, my pleasure.

Thank you so much for inviting me on. Well, so Cindy, I want to start back.

We started chatting about how you started at EFF and the whole history and how long you've been there.

So I want to start back there and just explain how you got started at EFF, what you started doing and how you got involved.

So EFF was founded in 1990.

So we're 30 years old today. I got involved about three or four years later, although I knew some of the founders when they asked me to take on a lawsuit involving freeing up encryption technology from government control.

And I remember the call I got from EFF founder John Gilmore when he said, you know, we want you to take on a lawsuit because the government says that if you publish this computer code, you'll go to jail as an arms dealer.

And I was like, well, what does it do?

Does it blow things up? And he said, no, it keeps things secret.

And I said, well, that sounds like a First Amendment problem to me. You know, me, a brand new baby lawyer.

I was like four years out of law school. And he said, us too. Would you take the case?

And I said, sure. So really, that's kind of what happened.

And, you know, in retrospect, John has said that, you know, he contacted me because I was one of the only lawyers he knew.

So, you know, it helps to be fortuitous, I have to say.

And he chose well, even if it just happened to be on association.

Yep. And so we started putting together this case. And very quickly, Lee Tien, who's still at EFF with me, joined me.

And, you know, we have joked for a long time that I'm kind of Captain Kirk and he's Spock in terms of our legal strategy.

And every Kirk needs a Spock, let me tell you. So we put the case together and pulled together a few other people and we launched it.

And, and we were successful.

We were winning in the courts.

And as a result of that pressure, plus a lot of other pressure in the late 1990s, the government stopped treating encryption technology as a weapon.

We later learned that that did not stop the government from trying to deny us access to strong encryption, but it did force them to change tactics pretty dramatically.

And as a result, you know, we have a lot of encryption on the Internet right now.

And, you know, in all sorts of things. We're on Zoom here, and we were part of a coalition that helped pressure Zoom to really make their encryption available to everybody and not just certain users.

But so much of what we do online now is dependent on encryption.

Now the government's still trying to take that away.

And that's one of our current fights, both here and in Europe and in Australia.

It's really a problem all over the world. But that first victory really set the tone for a lot of what happens on the Internet now.

And about six or seven years after we started that case and I got that phone call, the person who had been the legal director at EFF, Shari Steele, became the executive director.

And she asked me and Lee if we would join and come work full time at EFF.

And we both thought that would be so fun, we'll get to keep doing this.

So we joined. So I've been at EFF for 20 years now.

Wow. Well, so I actually want to go back a little bit and think about what issues were like back then.

Because I think that's one of the struggles, if you think about what was happening then. You know, you mentioned that you basically got involved before the World Wide Web had even taken off.

So what does that mean?

So how do you see things change? How do you think about those early days compared to where we are now?

What's changed? Well, I think the thing that was really prominent in the 1990s was the role of government versus people on the Internet. The Internet, you know, started by being kind of built out of a bunch of universities and government-funded research, DARPA, and things like that.

But it slowly got available to people who weren't in those institutions. And it was never really controlled by government.

You know, once it kind of got freed a little bit.

And so a lot of the questions had to do with governmental control, and, you know, people doing things on the Internet that made governments nervous.

And I think that it's easy to forget that and think that the Internet started with the rise of the tech giants like Facebook and Google and those things.

But those came, you know, 12 or so years after, really, in the 2000s.

And so there were companies, there was AOL and CompuServe.

But these were companies that helped you get on the Internet largely, or in AOL's case, you know, tried to get you onto a tiny little walled garden piece of the Internet that, you know, failed as a business model, once people realized there was a big, wide Internet out there.

So it wasn't like there weren't big tech companies, but they weren't exercising the kind of control that they are now.

And so the focus was on governments. And specifically, EFF was founded because of government raids on people who were either using or running bulletin board services, which are kind of the early precursors to a lot of what happens on Facebook or Google Groups right now. But they were not controlled by one company; they were hosted in various places.

And the government was really overreacting to some things that happened, and they would show up and, you know, arrest everybody and seize everything that was plugged into the wall.

We still have that happen today, but far less often.

It still does happen. But EFF was founded by folks who realized that we needed an organization that was going to help those people when this happened, help explain what was really going on with the technology, as opposed to the scary story that was being floated by law enforcement, and then help us frame how we're going to think about this new technology going forward from the context of the users and the developers of it, which were not the big companies, but, you know, ordinary folks who were developing things.

And I often tell people, you know, if your story of the Internet was that these starry-eyed guys thought the Internet was going to make everything perfect, it's a little more complicated than that, right?

You don't found an electronic frontier foundation if you think everything's going to magically be perfect.

Our job is to try to help make that happen, but in the face of things going wrong.

And I also kind of point out that it's a little simplistic to think that the early Internet folks, the people who thought about the Internet and the future, were all guys, because, you know, hello, and I'm not the only one. EFF, and the Center for Democracy and Technology, and a lot of the groups that are dedicated to helping protect you when you go online have long been led by, or had leaders in them, who were not the stereotypical guys.

So if you only point at John Perry Barlow, and you think that's what the early Internet looked like, I think you're missing a lot of what Barlow himself did, but you're also missing all the people that Barlow brought in to help him, because we don't look like that.

Well, it's so easy to gloss over some of that history, right? And that's some of what you see happening, because people want a narrative.

And all of those important things that happened, that actually shape user rights today, are things that you were involved in.

Yeah, and people forget about them. And I don't mean to say that we got everything right, or that we predicted everything perfectly. We clearly did not. You know, the rise of the tech giants, and the way that the advertising business model was going to become the thing that ate our privacy, I would say, did not occur to me in the 1990s.

And it didn't occur to a lot of people. I mean, again, I don't mean this as an apology so much as a broadening of the conversation from the single narrative that seems to have taken over.

Right? Well, I think that goes back to that question of how do you sort of adjust as an organization, specifically one that has such a strong mission to focus on users?

So how do you look at what's happening and figure out how you readjust as an organization about where you get involved?

What issues matter? And what is actually something that will matter for users?

Yeah, it's an art, not a science, I will tell you. Anybody who tells you there's a set of rules, and you just look at them and then you follow them, or that there's a simple algorithm, you know, data in and decision out: that is not how this works.

In the late 1990s, EFF recognized that we were going to have to get involved in intellectual property, especially copyright law. We wouldn't have thought that earlier, I mean, maybe Pam Samuelson knew in the early 1990s, but it was really when Congress passed the Digital Millennium Copyright Act in 1998 that we realized we had to center a huge chunk of our work on copyright.

It was probably the middle of the 2000s when we realized that we had some strategies we could use to deal with stupid patents.

That was one where we knew it was a problem for a long time, but we didn't really get the tools we needed to think we could take it on until the middle of the 2000s.

But it's an ongoing conversation.

I mean, the way I think of EFF is as being on patrol, right?

We're on patrol, we're watching what's going on on the Internet. And when something comes up where it feels like it's really going to dramatically harm users or harm innovation, especially stuff that has to do with rights, honestly, you know, we're a civil liberties organization.

So we care a lot about lots of things.

And we're constantly negotiating, you know, what's in scope and what's out of scope for the organization. But the heartland is your First Amendment rights, your Fourth Amendment rights; the way in which freedom of expression and privacy and surveillance play out are kind of core issues.

They've always been there and we will always do them. We add on, and sometimes we'll ramp down, depending on, you know, where we think we're needed, frankly.

Yep. So Cloudflare and EFF obviously have a history together.

And you actually represented us in a really important case for us from a privacy standpoint.

So I'd love to ask you about that, how you got involved in that and kind of how it played out.

It predates my time at Cloudflare, but I know this story, at least from our perspective. I'd love to hear it from yours.

Well, you know, EFF and Cloudflare have this long relationship, most of which is secret, right?

So it's funny because, you know, I have a couple of secret clients, and two of them I can talk about. But Cloudflare, you know, we were clearly worried about national security letters after the Patriot Act.

They were a tool of the government that was originally very narrow in scope and in how it could be used.

And we knew that once the Patriot Act kind of messed around with some of the definitions, there was a chance that they were being used much more broadly, and in ways that were very troubling, and that they have this huge cone of silence on them where the companies cannot tell the users that the government has come knocking.

So we heard from an early Cloudflare person.

I'm not sure that I'm supposed to say who the person is, but that Cloudflare had received a national security letter and would we be interested in representing them?

And we very quickly said yes. And so we mounted this secret fight for many, many years to try to get the statute declared unconstitutional and to try to free up Cloudflare so that it could tell its customers, but more importantly, to tell the world about what's going on.

You can't reform a government system if people don't know how it works.

And we were worried that the NSL statute was being overused.

I still think it is. But we've gotten some reforms in, and we got those reforms in because we were able to unearth enough about what the government was doing and show Congress, and Congress amended the statute.

Now, again, I don't think they made it all the way better, but they made it a lot better.

We are still waiting for the Ninth Circuit Court of Appeals to decide whether the new statute is still unconstitutional or not.

We certainly think it is.

It's been sitting on it for a very long time, which, you know, when you're me, kind of the tiny little team that's fighting the gigantic national security infrastructure, if the courts are going slow, it probably means you're right.

I certainly think we're right legally, but you know, if, if national security was just going to win slam dunk, I think we probably would have had a decision by now.

But I don't know. And, you know, I don't want to jinx it. But we really appreciate the opportunity to stand up for Cloudflare, and, frankly, not many companies would be willing to do this kind of thing when they couldn't tell anybody about it.

You know, they often say the measure of true charity is what you do when no one's looking.

And I think the measure of true commitment to civil liberties is also what you do when no one's looking, or in this case, when you're actively gagged.

But, you know, for us, I mean, there are things that help shape us as a company and they help define who we are today.

So, I mean, in some ways the arguments that EFF makes, the information that you put out there, help us think through what our commitments should be and are, which is incredibly helpful to us.

So, I mean, we may not be the most obvious company for you to do that with.

You obviously have lots of other places where you try to encourage companies to take certain positions, but you're incredibly influential in that space, just generally.

So thank you for that. And thank you for continuing to defend us years and years later.

Well, thank you. We really appreciate it. You know, we are really excited to try to push and make the law better, but we can't do it without clients.

So unless people are willing, we can't. You know, I can't just show up in court and be like, I don't like this law.

Let's do, you know, let's fix it.

It doesn't work that way. There's this thing called standing. And so, you know, having people really let us tell their story, in this case, tell it in secret before we could tell it in public is really how we make the world better.

And, you know, we really appreciate it.

And we know that the kind of people with the technical chops to work at a place like Cloudflare are the people who we feel we fundamentally represent, right?

We're trying to explain what the world looks like from where those people sit to the kinds of people who could, if they get properly informed, make either good decisions or really bad decisions that impact all of us.

Well, so I'm actually curious about that piece, because obviously some of what's happening on policy is developments in the courts, but lots of other stuff is happening as well.

So everything from lots of developments in Europe on both sides, you mentioned encryption, but we also have privacy.

So how do you think about how EFF gets involved in the policy space from that perspective?

So how do you tell the stories to policymakers in DC and Brussels so that people understand what's at stake?

Yeah. Right now, for the European conversation, what we're really trying to do is talk about what we call the public interest Internet, talk about all the companies that aren't big tech giants, because one of the dynamics we're seeing in Europe right now is that the European regulators and lawmakers are so unhappy with the big tech giants that they are making rules without regard to the fact that that's not the whole Internet.

You know, so we're spending a lot of time trying to point out that the whole Internet isn't Facebook, Google, Amazon, Apple, and Microsoft.

And if you make rules thinking that that's what the whole Internet is, you can actually, paradoxically, make them more powerful.

You know, first of all, you probably won't solve the problem. But more importantly, you'll make them more powerful.

You need to think about the Internet Archive and Wikimedia and the smaller companies, I think Cloudflare being one of them, that are going to be impacted by the rules you make, unless you're really careful about how you draft them.

And so that's a huge push that we're making.

My colleague Christoph Schmon, he's in Europe, in London now, but he was in Brussels for many years, trying to make sure that the impulse to do something about the tech giants doesn't end up hurting people and small companies.

And in the U.S., I would say the policy push is similar.

We certainly do some of that, but there are other organizations. You know, there are lots of people who do that as well.

Again, you do this on behalf of Cloudflare.

You don't need us as much to be your voice, although we're happy to join with people.

You know, in the U.S., I think a lot of what we do is try to come in and really offer an alternative and talk about the technology, how the technology really works.

I think that U.S. lawmakers get a lot of spin. They get a lot of spin by the big companies.

They get a lot of spin. And what we try to be is a trusted voice.

I mean, you know, we obviously have a point of view, but we try not to let our point of view get in the way of explaining how tech works.

So on encryption, for instance, we just try to point out that you can't control math.

Math exists without you, despite what the Australian Prime Minister said, you know, that the laws of mathematics didn't apply there, that the government of Australia was going to control how math worked.

That's a really important thing, but the other important thing is that, as a technical matter, you can't build a door, whether you call it a front door, a back door, or a side door, that only lets good guys in and doesn't let bad guys in.

That's not a technical decision.

That's a policy decision, who you let in. Building the door makes the weakness happen. And then that weakness is there.

You can't both have strong encryption, you know, strong encryption that actually protects people and a doorway in for law enforcement.

You can't do that.

That just doesn't work, and we try to explain that as a technical matter. And this is one of the changes: we used to show up just to explain what encryption was.

And now we have to kind of counter a really strong push by law enforcement to kind of mischaracterize things.

There's this whole back door, front door thing, you know. They try to say, oh, we're not hurting the encryption.

I mean, that's just silly, right?

You know, like, it's not about the encryption. It's about whether you have privacy and security in your communications.

And they're like, oh, we don't want to touch it.

We want the encryption to be strong, but we want law enforcement to have access.

And it's like, well, you're misunderstanding why people want encryption, right?

Like, that's not it. I mean, some of us care about the math, but people want encryption because they want security.

And so you can't undermine the security and claim that you're not doing any harm, you know. Encryption is just a tool.

It just drives me crazy. You know, it's all about the metaphors, but it really just doesn't work.

I mean, we actually face some of this in Europe too, where there's this desire for privacy.

Because, you know, post-GDPR, we have a world of privacy. So you try to build in all these technical measures, but then the reality is that there's pressure on the other side, on the law enforcement piece, as you mentioned, and the tension between those doesn't resolve.

So it's very hard to figure out how you translate between those two competing interests, just to explain them.

Yeah. I think it's really interesting. I mean, Europe is interesting because I think Europe, you're right.

They're very, very strong on privacy, but they mean consumer privacy.

Yeah. Right. And they don't mean law enforcement. I mean, again, there's some interesting stuff going on in the European courts around law enforcement and privacy and some of this stuff.

But in general, when Europeans talk about how great they are on privacy, they are absolutely not talking about law enforcement, national security, or any of the other things.

And so it's like they have this gigantic blind spot about how the tech is going to work.

That, you know, the technology doesn't know whether you're good law enforcement, bad law enforcement, a company, you know, that wants to have access in order to protect its network, or a company that wants to have access because it wants to sell your data.

You know, the tech doesn't know that. Again, you have to differentiate between what the tech knows and what the policy decisions are.

And that can be hard for people to grasp. Yeah. So how do you think we sort through those in a forward-looking way?

We have these underlying tensions. I think one of the challenges we're seeing, and this is relevant to the user privacy space or the user interest space, is that some of these tensions are causing more of a push for localization, for example.

So data localization, digital sovereignty, you know, all these sort of buzzwords that you have out there because of pressure on privacy and pressure on law enforcement.

How do you sort through those from a user perspective and think through what's good for a user?

Yeah, I mean, I guess I start with a couple of things. I usually start with the idea that if you're basically trying to turn the Internet into broadcast TV, then why bother, right?

Let's just use TV. So I think it's important with some of these, like data localization, to remember that one of the great things the Internet did was let people who were living in repressive places, or places where they were uncomfortable with their government's decisions, get information from and provide information to the outside world.

We still work with dissident groups all over the world for whom access to the Internet, so they can get their story out and so they can get real information in places full of propaganda, is tremendously important.

So data localization to me, that idea really breaks one of the things that the Internet brought us.

The Internet brought us the fact that you're not captive to your local government.

And, you know, to me, one of the great stories of the Internet is whether you're a kid that was questioning their sexuality or a person of color in a place where there aren't very many other people or where the local laws are not on your side.

You had access to create communities and build communities that were not dependent on your local laws.

If we break that, I think we lose a whole lot of things.

Now, of course, some of those communities can be awful. Yeah, right. You can build communities around really bad ideas, too.

But I don't think that the answer is to put governments in charge of what ideas people get access to and what they don't.

We need other tools in order to do that. And it turns out that broadcast media spreads hate really well, too.

So it's not an Internet-only problem. You know, radio spreads hate, TV spreads hate, you know, the United States spread hate. These things happen.

So it's not tech dependent.

And I think we need to think about our answers in a non-technical way as well.

So how do we think about that in terms of tools, looking forward, if we want a better Internet, if we want something that's friendly to users and privacy-oriented, and something that is a place we want to be?

What kind of tools do we establish?

Who does that work? Where does that work? What level does that work at?

How do we think through that? I mean, that's the goal. Well, I think that's right.

And I think that's where you start, right? What tools are you giving the user?

How easy are they for them to use? And I think that, you know, if you're talking about something like Facebook, or some of the other big platforms, we think a lot about what we call adversarial interoperability, or competitive compatibility, ComCom. First of all, making your bigger tools open to users, so they can choose their own adventure about how they want to use them. That requires some technical work to make happen.

It also requires some policy work. Like, we've got to get rid of the parts of the Computer Fraud and Abuse Act that get in the way of interoperability, the Digital Millennium Copyright Act, end user license agreements. There's lots we can do on the policy side to try to free up people to build tools that let people control their own experience and build smaller communities. You know, when the big tech companies talk about themselves as a community, I kind of want to gag, right?

Like, that is not that that isn't what a community is.

A community isn't, you know, your business model sitting on top of a bunch of users. Those aren't customers, those aren't communities, those are hostages.

So interoperability, that's one of the ways you do it, because then you free up, honestly, the part of the Internet that I know and love, which is all those crazy people with the crazy ideas who can give you a better adventure. Free those people up. They want to do it, and some of their ideas will succeed, and some of them will fail.

But, you know, that's what I want: a race to try to build better stuff for us.

Now, there's some problems in that, you know, some things fail, and some things don't scale.

I'm not sure scaling is the right metric for a lot of these things.

We've got to play around with different business models, too. We need to make the advertising business model not quite so dominant. Again, these are policy decisions, not technical ones. You know, we need real privacy law in the United States.

And I think we really need to continue to tinker with the European one, because the European one is a good effort for its time, but we're going to have to keep tinkering with it. It hasn't ended up doing the kinds of things that I think a lot of people thought it was going to do.

The California privacy law hasn't ended up doing some of the things we thought it was going to do. This may have to be an iterative process.

But I think we have to try to get at the business model. And the most interesting work on this right now, that I'm very excited about, is from Tim Hwang, who was one of my interns a long time ago. This is one of the fun things about being around for a long time: you get to see people you knew when they were baby law students go out in the world and do good things. He's beginning to really take on the advertising business model and whether it really does the things it's promised it's doing.

And I'm going to be very curious to see that develop, because I've often thought that it's a little snake oily.

And maybe we can shrink it down to size, it doesn't mean you can't do advertising.

But maybe this idea that knowing everything about you, identifying you all the time, everywhere you go, and what you do, is the best way to sell you soap is maybe not so good.

Yeah. So we have a little less than two minutes left, so I want to end on sort of an optimistic note.

So let's like go out 10 years and say we fix some of these policy problems.

What kind of Internet do we have?

What does it look like? I mean, what's good? What's bad? I think, I hope anyway, that we have an Internet where you feel like you're in the driver's seat about the experience you want to have, and that that doesn't limit what you get to do, or who you get to talk to, or how you get to talk to them.

And in some ways, I think if we get it right, we'll have taken a serious look at some technologies, like facial recognition, and we'll have just decided to ban them.

We banned landmines, we banned, you know, poison gas after World War I, as a thing.

I mean, we still struggle with these things. We never entirely win those battles.

But that we've taken a look at some technologies that are really dangerous for our civil liberties.

And we've decided that as a society, we just don't want to use them anymore.

And we ban them, which is a hard thing for me to say I'm a big pro tech person.

But I think facial recognition technologies in the hands of law enforcement or government are really dangerous.

And we're starting to see that now. I think it's interesting, because EFF is working on a podcast now called How to Fix the Internet with the Electronic Frontier Foundation.

Watch this space for a couple more months.

To see some of these things we're trying to think about. You know, I think broadband becomes ubiquitous and available, so that showing up someplace and not having broadband is like showing up someplace and not having lights or water. You know, I've been traveling a little bit the last month, and even so, this is one of the things that COVID has taught us: universal real broadband, like fiber, is tremendously important. And these little hot spots are not the same thing.

Thank you so much for doing this. I really appreciate it. It's awesome.

It's so much fun to talk to you, too. You have an amazing history. I was looking over all your background.

Well, it's been lots of fun. It's a fun ride. This is a hard time, I think for everybody, but it's still fun.

So well, thank you again.

All right. Thanks a lot.