🎂 Ben Wizner & Alissa Starzak Fireside Chat
Presented by: Alissa Starzak, Ben Wizner
Originally aired on October 1, 2020 @ 1:30 PM - 2:00 PM EDT
2020 marks Cloudflare’s 10th birthday. To celebrate this milestone, we are hosting a series of fireside chats with business and industry leaders all week long.
In this Cloudflare TV segment, Alissa Starzak will host a fireside chat with Ben Wizner, Director of ACLU's Speech, Privacy, & Technology Project.
English
Birthday Week
Fireside Chat
Transcript (Beta)
I'm Alissa Starzak. I'm the head of public policy here at Cloudflare and I'm here with Ben Wizner to celebrate birthday week.
Ben is the director of the ACLU's project on speech, privacy, and technology.
So Ben, it's so great to have you here today.
So nice to see you again, Alissa. How are you? I'm good. I'm good. So, you know, it's our birthday.
So I think I want to start with going backward in time. So we've been around for 10 years.
You've been at the ACLU much longer than that. But I'd love to hear how you sort of got started at the ACLU and then also got started in technology projects.
Yeah, it's hard to believe that I've been at the ACLU now for 19 years, which almost sounds like a career.
One of those scary words for someone who still feels young at heart.
I went to work at the ACLU in August of 2001.
First in Los Angeles, thinking that I was going to be working on jail and prison conditions in L.A.
County. And five weeks later came the 9/11 attacks. And so really most of my career at the ACLU has been shaped by that day.
And more than that, by the country's response to that day.
In 2004, I was one of the lawyers who started what is now our national security project in the national office at the ACLU and litigated cases involving torture, extraordinary rendition, secret prisons, surveillance programs, watch lists like the no-fly list, that sort of bucket of work.
I was feeling a little burned out in 2011, 10 years after 9/11, when I felt like the country's conversation about terrorism had not advanced, and was thinking, what should I do next?
And the ACLU approached me and said, would you head up a new project that we're putting together that combines old strands of our work with new strands of our work?
This is the speech, privacy, and technology project. If it sounds like that mandate is impossibly broad, it is.
It is the most traditional speech and privacy work that the ACLU has done for 100 years.
We had our birthday this year, too.
We're 100. Well, okay. Make us feel young. I appreciate that.
Sorry. Happy birthday to you, too. Appreciate it. And then the work that's more at the cutting edge, which looks at how advances in science and technology intersect with human rights and civil liberties and how we can make sure that they're harmonized and not discordant.
The way that most people describe this, which I find very frustrating, is how can we make sure that law keeps pace with technology?
I think that's really upside down. We shouldn't expect our law and our values to be chasing technological innovation.
We should be requiring technological innovation to be consistent with our norms and our laws and our values.
And so that's where we've planted our flag. So, okay, you threw that one out there.
I'm going to have to go into that one, because how do you do that?
So how do you make sure that technology companies actually do things that are consistent with values if it's not from a government perspective?
Well, I mean, I think it often is from a government perspective, right?
There are all kinds of sophisticated technologies for extracting fossil fuels from, say, the ocean floor.
And I'm sure the people who are developing those technologies would love to be unregulated.
They would love to be able to say, don't give us regulation, you're going to stifle innovation, we really need to be able to innovate, which is what technology companies say. And then, of course, in the next breath, they say, don't regulate us.
It's too late. You're going to break the Internet, right? It's never the right time for regulation if you're in the technology world.
It's always either too early or too late.
But what we've said to energy companies is you're going to innovate around environmental protections because we're going to balance the public value with your private value.
And so that's what I mean in the biggest sense.
Now, we don't have to start from scratch. A lot of what I'm talking about is just interpreting core laws like the Fourth Amendment, right?
And as you've seen, there's been a revolution in Fourth Amendment jurisprudence over the last half decade.
And that hasn't involved amending the Fourth Amendment.
That's just involved looking at the words and saying, what does that mean, right?
A generation ago, if you wanted to know where I was at all hours, you'd have teams of officers following me in shifts.
And now, you just need to find out where this thing, my phone, was, right?
And how do we impose friction where it wasn't needed before to ensure that the balance that the Fourth Amendment established is respected when surveillance that used to be incredibly expensive is now trivially cheap?
Well, and on the privacy side, I mean, what happens when people give it up? So what happens in that exact same discussion?
So how do we think about values in a world where people consent to all sorts of things?
You consent to carrying around your phone in your pocket and recognizing that that may end up with tracking information.
So how do we think about those? How do we make sure that the sort of values that we think we have are actually the values we have?
And this has been a really hard problem.
It's just that question of consent. And I think that we've dealt with it differently in the context of government than we have in the context of corporations.
And that's probably correct, right? In the context of corporations, if you're using the Internet, you are clicking through privacy agreements without reading them.
And I know that that's the case, even though you're a lawyer who writes privacy agreements, because if you read them all, it would take four months of your year, given the number of websites and tools that you use.
So none of us are actually reading those. Nonetheless, they have to be enforceable.
Because if they weren't enforceable, it would be impossible for people to make investments and to do business in that way.
So we can't say that's just a contract of adhesion, because no one's reading it.
We need to sort of impose the rules from the outside and not use consent.
Now, you said we all consent to carrying these around.
We know that these are trackers. And so therefore, why isn't that enough consent to the government watching us?
And there, you know, the Supreme Court has said, it's really not meaningful consent.
Because this is really required for so much of modern life.
People use this for their work, they use this to apply for work, they use this to get health care. To say that it's optional to use a technology this universal is, in a constitutional sense, really myopic.
That's not consent that would vitiate your privacy right with respect to the government.
So, look, I think these are really, really hard concepts, obviously.
So I guess that goes back to that same question.
I mean, how much of it is about what's collected in the first place?
And how you restrict that? Because, you know, we actually often believe in our privacy policies that the less you collect, the better off you are, just generally from a privacy standpoint.
And how much of it is things like government access?
And when do those two collide? When does it become something that is government-oriented?
Is it when it's available and potentially subject to legal process, for example?
Or is it later? How do you think about those two? I mean, they seem to be melding in a way that I don't think we all anticipated a long time ago.
And obviously, that question was put front and center by Edward Snowden in 2013.
I would say, right, right, because the first program that was disclosed was a program in which the government was collecting the metadata from almost every phone call in the United States every day and storing it for five years.
And, and the way that issue had been considered by the Foreign Intelligence Surveillance Court hearing only from the government was that it was not a constitutional event.
Because for a number of reasons, I mean, number one, it's just metadata.
It's not content. But you know, number two, if the information is just being, you know, put in a big data center, that's not even a seizure or a search until a human being goes through and looks at it and reviews specific information.
Look, I think, and the ACLU thinks, and other courts think, that that's essentially a general warrant.
I mean, this is exactly what the framers of the Constitution had in mind when they said you don't get to, you know, collect all the mail, and then decide who is guilty of something afterwards, right, that you have to have the suspicion before you collect everything.
How much is that cat already out of the bag? Right? How much are we in a world where near universal collection is the norm, and the best that we can do is impose back end controls on who can get into it?
You'll hear different opinions about that.
But I think that we should be focusing more on collection and less on back-end protection. We really do need to decide that collecting everything is not just a liability from the standpoint of, you know, creating a big target for hackers; it's a liability from a rights perspective as well. Because you may say right now, oh, don't worry, it's all in a lockbox.
But when there is the next 9/11, and the intelligence community comes to us and says, we could have connected the dots if only we had been allowed into the lockbox, the dots will always connect in hindsight. And if we're collecting all of them, and they always connect in hindsight, then we will have Patriot Act Two, which says, no, actually, we should tear down that wall.
And this information was lawfully seized in the first place.
Why shouldn't we be able to look at it right now? And so it's an attractive nuisance in a number of ways, both to foreign adversaries who might want to gain access to it, like the, you know, OPM hack.
But also, I think that it just sort of continues the one way ratchet.
You will always see a lessening of the restrictions, because, frankly, it is valuable for forensic investigatory purposes to collect all of the world's information.
Couple of thoughts on that. And then I want to go back to the reference to Edward Snowden, because you didn't actually point out that you are one of the attorneys for Edward Snowden. So I have to go back to that. And just because I can, I'll do a book plug: Permanent Record is just out in paperback this week, and I strongly recommend it. It sort of feels like he's there with you when you have that book cover; you just kind of keep it up there.
So how did that happen?
I mean, obviously the references make sense, sort of strategically, from a privacy perspective, given all the things that you're talking about. But how did you end up as Snowden's lawyer?
Well, so actually, it happened because of my relationship with Laura Poitras, the documentary filmmaker and journalist who won an Academy Award for Citizenfour, which is her documentary about Snowden.
I had met her when she was filming The Oath, which is an Academy Award-nominated documentary about Guantanamo, and Salim Hamdan, who was Osama Bin Laden's driver who ended up in Guantanamo.
And so Laura also had consulted me when she ended up on a government watch list that meant that every time she returned to the United States from abroad, she was taken into secondary screening and interrogated for hours. They asked her questions about who she was meeting. She got to a point where she would just hand them my business card and not answer questions.
So when she received the first encrypted email from an anonymous source who referred to himself as Citizenfour, she came to my office at the ACLU, asked if I would leave my phone in the drawer, and said, can we go somewhere else and have coffee, I want to talk to you about something confidential. She showed me these first communications and wanted to know whether, A, I thought perhaps this was legitimate; B, this was just a crank, and I get email several times a week from people who claim to be the most important whistleblower in US history, and usually they are not; or C, this might even be an attempt to entrap her.
And my take was that this was not someone who was making bizarre and grandiose claims, that this seemed to be someone who was communicating carefully, but that she would benefit from bringing into the loop a more seasoned national security journalist like Bart Gellman, who might be able to help her assess the source's credibility before she took any steps.
So I knew in January of 2013 that Laura was communicating with someone who purported to be a senior intelligence official who had this kind of, you know, earth-shattering information to share, but I did not know his name until you did.
So I was sitting there watching the Guardian video through my computer screen like everyone else in the world and seeing this, like, impossibly young Harry Potter-looking person coming forward as the source when I had been expecting someone near the end of his career, given the kind of access that he had.
So after that, when, and I'm sure many of the people watching this will have seen Citizenfour and seen just how completely chaotic and disorganized this event was, and that really no one had a plan for what would happen after Snowden met with the journalists in Hong Kong, it was when he was spending 40 days essentially trapped in a no-man's land in Sheremetyevo Airport in Moscow that Laura was able to put Snowden in touch with me, and I began helping to coordinate a global legal strategy for him.
That's a fascinating background. So I'm actually curious on that, having been so close to this Snowden thing, now that we're a few years out, although never entirely out from what I can tell, his name pops up plenty of times now, what do you think has come from those disclosures?
I mean, do you think, what's the good, what's the bad, what's the sort of long-term consequence from your standpoint?
So I think, you know, we should start with how President Obama defended the U.S.
intelligence community in June of 2013 when this was the biggest story in the world, and what he said, which was absolutely true, was that the programs that were being disclosed had been approved by all three branches of the U.S.
government. That was true, but that was also the problem, because all of those approvals took place in secret, without any kind of adversarial process.
And once Snowden's disclosures to the press and the press's disclosures to the public brought us into this conversation, you saw all three branches of government change course as a result of what Snowden did.
And then we'll get to the changes in tech companies, but just starting with the government, right: you saw the executive branch, where President Obama appointed an expert panel of former high-level intelligence officials, and they recommended hundreds of changes, including putting the first restrictions on surveillance of non-citizens, which weren't even required by the law or the Constitution, as well as changes in encryption policy.
With the courts and Congress, it's even more obvious, right? With Congress, you saw the first time since the 1970s that Congress legislated to restrict rather than expand the surveillance authorities of the intelligence community. And the courts, right?
In March of 2013, the Supreme Court had thrown out an ACLU challenge to an NSA surveillance program on the ground that we didn't have standing to be there.
We couldn't prove that our plaintiffs had been subject to these programs, we couldn't use discovery to get that evidence because of state secrets, and therefore neither we nor any other plaintiffs could challenge these programs.
And as a result of Snowden's disclosures, we had the Second Circuit Court of Appeals in 2015 say that the collection of telephone metadata was illegal, had always been illegal, and was probably unconstitutional, a ruling that was affirmed just a few weeks ago by the Ninth Circuit in another decision that went even deeper into the unconstitutionality of that program.
So I think in a very ironic way, Snowden's law-breaking, and he does not deny that he violated the law, I am not conceding anything by saying this, revitalized democratic oversight in a way that was really necessary.
Now, I think the thing that has most upset the U.S. government has been that his disclosures made tech companies more adversarial to the U.S.
government, because tech companies are global, they're not just U.S. companies. And so when they realized that in addition to turning over their customers' data through the front door when they got court orders, they were being hacked through the back door by the NSA, there was a lot of outrage in Silicon Valley, and you saw a lot of hardening, both technical hardening to close the gaps that the NSA had exploited, but also policy hardening, where you saw the general counsels of these companies saying no, pushing back against a lot of these surveillance requests, building their business brands on the basis of their ability to protect their customers.
I don't think you would have seen the FBI versus Apple case in 2016 play out in such a public way if you hadn't had the Snowden disclosures.
And whether you consider that a feature or a bug depends on where you sit.
If you were in the FBI, obviously, you wish that things would go back to the way these companies interacted before 2013.
If you think, as I do, that it's better from a rights perspective if governments and powerful corporations are adverse to each other, and I mean in both directions.
I don't just mean companies being adverse to government when it comes to intelligence requests.
I mean the government being adverse to tech companies when it comes to FTC and anti-monopoly regulation.
We want to have more friction between powerful entities so they can provide checks and balances on each other, and I think that might ultimately end up being the biggest part of it.
Do you think, so obviously one of the things that we're dealing with this year is the Europe component of that, right?
So the huge amount of fallout on the European side, many people think it sort of led to the General Data Protection Regulation, GDPR, the big privacy law in Europe, and even this summer we saw sort of the fallout from surveillance programs after Max Schrems brought a privacy case, for example.
How do you think those things play out? How does the geopolitics component of that fit together, and do we get to a better place? And this is partially from an Internet perspective and also from a privacy perspective, so how do we think about those?
Because one of the challenges for us is trying to make sure that, yes, we obviously, the goal is to protect our customers' data and we're absolutely dedicated to doing that, but it's also making sure that you can have data flows, that you have an Internet that works, that you're not sort of restricting locally just because people are so convinced that they are going to be surveilled, for example.
I'm very tempted to just say, you know, I work for the American Civil Liberties Union.
Don't make me talk about the GDPR and Schrems and all of these other things.
But look, I mean, I think, you know, candidly, I have mixed feelings about it.
So, you know, in the monopoly competition sphere, I'm glad to see an active Europe because I just don't think that the US now has a framework for dealing with the kind of threat that an Amazon, for example, presents.
You know, our monopoly law is all organized around the prices for consumers.
And so something like Amazon, which is wonderful for individuals and devastating for societies, is not something that our monopoly law can really deal with.
Sorry, there's a little bit of a motor in the background. And I'm hoping that- It's getting made hard just in case, right?
If I ask my questions, you know, do I go in a bad direction?
If I ask more about Europe, that's what's- No, but my hope in this area is that European competition law will be a little bit more flexible and adept at being able to deal with the new challenges of companies like Amazon or Facebook, which, again, I would say really provide a lot of benefits to individuals and a lot of harms to societies.
And so I'm glad to see that kind of activity in Brussels. Look, when it comes to free expression, I have a lot of differences of opinion with the way Europe deals with it: Germany saying that tech companies have to take down content within an hour or 24 hours, and the idea that we're going to get a notice-and-takedown regime for speech the way we have for copyright.
And I think that's a real problem and will ultimately do more harm than good.
And look, yeah, I mean, adding more speed bumps can be very helpful for privacy and very threatening to freedom of expression, right?
I'm really happy with the US framework on the freedom of expression.
So I can't say that I have a nuanced and coherent way of kind of harmonizing all the tensions that I just brought to bear.
In practice, we're probably opportunistic like others and, you know, we'll be happy to use Europe as a club when we can and to, you know, push back against them when we have to.
Yeah, well, the freedom of expression issues are really interesting right now because, going back to your point, tech is global.
The pressures, of course, that people face in Europe have an impact on how they operate in the US as well, right?
Yeah, and not just Europe.
And I mean, Turkey would be an example, or the Philippines, right? And that's not even to mention China, which is a completely different question altogether.
Yeah. So I'm actually curious about your thoughts on that, on the freedom of expression side.
So private companies obviously have their own rights of freedom of expression to some degree.
So here they are in this sort of world of moderating, sometimes based on government pressure, as we just suggested. How do you think about that?
I mean, what is the role of private companies?
Another really, really hard problem, but you started in the right place.
I think that people who have started to suggest that a platform as large as Facebook has somehow become a constitutional forum subject to First Amendment law are really wildly misguided.
If we're talking about constitutional free speech rights, they belong to the companies, right?
The First Amendment right is Facebook's.
Facebook has an associational right to decide what kind of speech community it wants to be, and what kind of speech it wants to allow and disallow. And the government really should not be weighing in there, as much as, in a way, that might make it easier for Facebook.
And what you saw Facebook do is, you know, appoint this outside advisory board, a sort of Supreme Court of Facebook, to help them make some of these hard calls.
But what is the common law of this Supreme Court?
It's the common law of Facebook's terms of service. But there's no outside body of law that's being applied by these distinguished people.
They are just sort of there to say whether Facebook is following its own rules or not, or maybe to suggest some changes in those rules.
But that doesn't end the conversation; it really starts the conversation.
And here, I think we need to treat these platforms not monolithically, but very individually, on the basis of what they are, and sort of start at different layers.
The entities that are really the infrastructure of the Internet, in my view, should be doing the least moderation, or even none.
And partly that's because their tools are the bluntest.
Facebook can remove one post, or it can suspend an account for a short period of time, or it can deprioritize it.
But if you're the infrastructure of the Internet, you only have an on-off switch. You only have the speech death penalty; you don't have the timeout room. And you don't have "you can say this, but not this." It's much, much harder to do.
And also, look, we don't expect the phone company to say that neo-Nazis shouldn't be able to make phone calls to each other, right?
Or to others, right? That's not where we think that our value should be imposed.
And that's how I feel about the infrastructure.
Now, when you start talking about speech communities, even there, I think we need to distinguish. If there is a network of, you know, 3,000 yarn enthusiasts who get together on a social network, they should be able to have whatever rules they want, right?
If they want to say no cuss words on our yarn site, I have no problem with that. But if you're Facebook, and you have 2.4 billion users, and you are holding yourself out as the place where the world has a conversation, and you are not only connecting people to each other but delivering a lot of the news that people get and connecting people to media stories and all of that, then I think you need to be much more careful.
I am not more comfortable having Mark Zuckerberg define the appropriate boundaries of political speech than I am having Donald Trump or someone in government define the appropriate boundaries of political speech.
And that's why I've been very nervous about the calls for Facebook to be more aggressive in fact checking political speech.
I mean, do we really want Facebook to say that they are not going to show the presidential debate this week, because everything that came out of Trump's mouth was a lie?
Now, it was a lie. But we should be able to see it.
It's newsworthy, and it's important, and it's someone else's job to come in and give that analysis and that kind of verdict.
Now, that's different, I think, than saying that they should allow people to use their platform to say that election day is on November 5th and not November 3rd, or that non-citizens are not allowed to participate in the census, because then they'll be arrested.
That kind of speech might not even be protected if the government chose to regulate it.
There may be enough harm from that false speech that it could even be regulated under the Supreme Court's decisions in Alvarez and others.
So my view is that if you are one of this small handful of platforms that really have become the place where people gather for political debate and conversation, Twitter, Facebook, maybe YouTube, the tool of censorship should be used very sparingly.
And even when you decide you're going to moderate speech, you have different gradations of doing that.
So better to deprioritize the post than to remove it altogether. Better to remove the post than to ban the speaker altogether, right?
So there's a lot of tools that these platforms have before they just say you're kicked off altogether.
They can make the speech harder to find, and they can just remove it on an individual basis rather than saying this person is beyond the pale and shouldn't be allowed to speak here.
And I say this because we have this sort of underlying assumption that these companies will more or less have the same views that we do about what's appropriate and what's not.
But I think you need to do a thought experiment and imagine Trump at 75% approval and not 35% approval.
And do we think that Facebook and Google and Twitter would be dealing with political speech in the same way?
And if you don't, you see the problem. So we're getting close to the end.
So we only have a couple of minutes left, which makes me sad, actually, because I want to go back into all of that.
But I'm actually curious. So again, at birthday week, here we are.
Where do we go from here? So 10 years out from now, so if you had to look forward, do we end up in a better place?
Do we end up in a worse place?
Where do you think we end up? If you look back to pre-Snowden, 10 years ago, we're in a very different place now.
What does it look like 10 years from now? Yeah, I mean, I think we're going to make progress and we're going to have setbacks.
The nice thing about working at a place like the ACLU is that we're always losing and winning at the same time.
So we may be getting crushed in one area, but we're making historic advances in another area.
And I sort of see this space in the same way.
If we had had this conversation 10 years ago, and we had talked about the Fourth Amendment, we would have said, the Fourth Amendment is a dishrag that has essentially been just trashed by the war on drugs.
10 years later, we've had three landmark Supreme Court decisions on the Fourth Amendment in the digital age.
There is a stable conservative majority on the Supreme Court in favor of Fourth Amendment rights in the digital age.
So you just don't really know. Obviously, I have some optimism about our ability to work these problems out, or I would be doing something else.
But I do think that with AI, which we haven't talked about, we're going to see the capabilities of these surveillance systems magnified so much.
I'll tell you, I'll end on something that I'm optimistic about.
What we're seeing with AI is the conversations that we should have been having at the dawn of the Internet age.
At the dawn of the Internet age, we said, let's put off all of these hard policy questions, all these regulation questions, and let's just see how this thing develops.
With AI, people realize that it will be too late. If we just build this technology without having the conversation about what the controls need to be, what the norms need to be, what the laws need to be, then it might just get ahead of us and get the better of us.