Cloudflare TV

🎂 David Kaye & Doug Kramer Fireside Chat

Presented by Doug Kramer, David Kaye
Originally aired on 

2020 marks Cloudflare’s 10th birthday. To celebrate this milestone, we are hosting a series of fireside chats with business and industry leaders all week long.

In this Cloudflare TV segment, Doug Kramer will host a fireside chat with David Kaye, former UN Special Rapporteur.

Birthday Week
Fireside Chat

Transcript (Beta)

Welcome, everybody, to another conversation as part of Cloudflare's celebration of its 10th birthday this week.

I'm really excited about the session that we're going to have now with David Kaye, who has been a leading thinker and advocate and fighter and consultant in all things around freedom of expression and freedom of opinion online.

David is a professor at the University of California, Irvine School of Law, where he runs a clinic on international justice and works on a number of different international issues.

But a lot of the work David's done over the last several years is as UN Special Rapporteur for Freedom of Expression and Freedom of Opinion, a position he started in 2014 when a lot of these issues were just gaining prominence and then really sort of flared up and a position he held until July of this year.

So, David, thanks for spending some time with us this morning. And thanks for all your work on a lot of these issues.

Yeah, Doug, it's really great to be here.

Thanks. Good. Well, my first question is an easy one, because I think it meshes with your experience. In a lot of these birthday conversations, the first conversation we're having is, you know, Cloudflare started 10 years ago, the Internet was very different, the company was obviously very different.

And we've seen a lot of change over that time. During your tenure as UN Special Rapporteur for six years, the issues and the complexity and the importance really sort of ratcheted up on all of that.

So can you talk to us a little bit about what that role really means? You know, the UN itself can be somewhat mysterious to a lot of people, but especially these special projects and special rapporteurs.

So explain a little bit of what you did there.

And then what you saw in sort of the evolution of those issues you were working on over that period.

Yeah, and I completely get that the UN is mysterious to people.

The position I had was an appointment by the UN Human Rights Council.

The Human Rights Council is basically the central human rights body in the UN system.

It's actually a part of the General Assembly system; 47 states are members of it.

They're elected by the members of the UN, basically. And the Human Rights Council created these mechanisms for individual experts in different areas of human rights, or working groups, to address different human rights concerns around the world.

There are about 50 of these different mechanisms.

And mine was focused on freedom of opinion and expression. And it involved communicating with governments, when we saw an alleged violation.

So an example might be China and its recent adoption of its national security law, and the impact that's having on Hong Kong. We would send a letter identifying concerns about the law and its impact on people in Hong Kong.

And we did those all over the world, not just China, but, you know, countries in the West, the United States and others.

And I would also do country visits, places like Turkey, Tajikistan, Ecuador, Liberia, very interesting places that are in transition, Ethiopia, most recently.

And finally, and this might be the area that is most relevant to academics and to some in the corporate sector, there were these thematic reports, where I would address an issue.

For example, the first one, so this will get into your question.

The first one was on encryption and anonymity, right close to Cloudflare's heart, I guess, in a lot of ways.

And so that was a report to the Human Rights Council on the human rights implications of digital security.

And as far as things changing over time: I started in 2014, a year after the Snowden revelations, and people had started to focus on the fact that digital space wasn't only this kind of halcyon environment, like a nirvana for access to information; it was also a threatening environment for people.

And I think that recognition, which wasn't original on my part, kind of took me through the six years, and we saw it develop in different ways, whether it was digital security, or social media and attacks on social media, and content moderation, and issues like that.

Yeah, I think if we look back 10 years to those halcyon days, we were probably smack dab in the middle of all of that, to some extent. And then the conflicting forces, as you expressed it, content moderation on one hand, security on another, privacy on another, are all values we're trying to maximize right now that don't necessarily coexist; you don't get very far off the ground with any of them before they start crashing into each other a little bit.

And I think the complexity in that space has just been, you know, so much more significant.

So, if it's fair to say, concerns about freedom of expression and opinion were more focused 10 years ago. I go back to your trips to China and Turkey and Tajikistan; there's sort of a common thread running through those. Back then there was the China Internet behind the Great Firewall, and then the global Internet, which was all very similar.

And I think what we're seeing now is that that stark division between China and the rest of the world is breaking down a bit, or is coming a little bit closer.

And so, do you see issues of freedom of expression and freedom of opinion, and challenges to them, seeping into other parts of the world?

And in what ways do you see that happening, maybe over the last five years, as opposed to 10 years ago?

Yeah, I mean, I think if you go back five or six years, or even to the beginning of Cloudflare, the sense was there were two worlds, maybe three. There was the authoritarian world.

And this has been around for a long time, whether we're talking about governments like Turkey shutting down Twitter, or, frankly, over the last several years, shutting down Wikipedia and not allowing it to be available in Turkey, to the Chinese Great Firewall, which we've all known about for quite a long time. And I think we tended to see that as being one kind of Internet that was over here, parked in this particular neighborhood.

And that neighborhood wasn't really going to worry us in the same way that it might in our own neighborhood.

And I think we've seen, over the last several years, real concerns.

And actually, I think there are two different aspects to it.

One, at least when we're talking about governments, is governmental pressure on digital security; attack might be too strong a word, but it's real pressure.

And that debate was very much tied into counterterrorism, but also child exploitation and other areas where law enforcement, which I totally recognize, wants access to data, wants access to traffic. But they were making these arguments without really thinking about the proportionality of those kinds of efforts, right?

I mean, to use a legal term. And so while you did have the authoritarian Internet over here, you also had this pressure on fundamental forms of privacy that people really require, whether it's the privacy of their iPhone or their Android device, or the privacy of their browsing. And I think the public understanding of that kind of threat ticked up.

And then the other threat, of course, is the one that that you all have been focused on for many, many years.

And that's, you know, non state actors.

So it's not just the state actors; it's almost like the equivalent of paramilitaries, right, the idea that non-state actors would weaponize the Internet and attack it, with DDoS attacks and other forms. They're security threats, but they were also threats to freedom of expression and to privacy and fundamental rights.

And I think it just took a little bit of time to get people to understand that these security issues aren't just security; they're also fundamental rights issues.

Yeah, it's interesting.

What you said reminds me of something Matthew Prince, our CEO, has said about the founding of Project Galileo.

And to give a quick Cloudflare plug, which we're not trying to do this week, but it's relevant here: Project Galileo is a project by which we provide our services and support to a lot of socially, culturally, and politically sensitive groups that otherwise run up against attacks.

And part of how Matthew describes the genesis of that is, if you look at attack data, it's a pretty straight line: the more site visits you get, the more attacks you're going to get. A lot of DDoS attacks, other cybersecurity attacks, and phishing attacks are just sort of dumb attacks that you see across the entire Internet.

And when we saw outliers, smaller sites without a lot of visits that were getting a great number of attacks relative to their number of visits, it was these groups: the LGBT groups in repressive societies, the independent journalist organizations.

And, you know, it's hard to do attribution, but whether it was state actors or the non-state actors you were talking about, that's where we saw the particular vulnerability: it fell on a lot of these groups.

And so it was a weird way in which, without trying to do AI on topic words or behaviors or any of that, just looking at attack data became a fairly reliable predictor of the sorts of folks that were disproportionately getting hit.

So we, we've always found that interesting, and a real area for concern and effort.

Yeah, absolutely. And think about the feedback loop between those attacks and security. You don't want to put all of the onus on the end user, or the website owner, to invest all of their resources into securing themselves. That seems, at some basic level, to be a mismatch, right?

Because for individuals in an environment that is so digitized, and so suffused with technology, you can't simply say, well, you need to maintain your own security, right?

I mean, there have to be values and rights that are built into that process.

And your example of LGBT communities and organizations that suffer attacks is a really good one. Why should they be the ones making what amounts to a special claim to security, when in fact the entire system needs to be secure, so that people can get access to information and seek guidance about different things they're feeling, particularly if they're a sexual minority or a gender minority in a particularly repressive place?

The incentive structure seems completely wrong if it's only on the user to make those kinds of choices.

Yeah. Well, so let me ask you something based on your expertise, and maybe wrap up this segment where we're looking back over the last 10 years, based on you really being neck deep, if not in over your head, in all of these issues.

What do you think when you think about the challenges to freedom of expression, the problematic challenges to freedom of expression, freedom of opinion online?

What do you think the casual observer does not see that is as big a threat, if not bigger, than they would anticipate?

And what are the threats that get a lot of time and attention and effort from regulators or the press that really are things we shouldn't focus on too much, that are distractions?

Any experience you've had on where we should put our time and energy, as opposed to where it may be going now?

Yeah, I mean, it's hard, because all of these issues, and you've seen this, went from being kind of niche issues for, you know, Internet lawyers to being some of the biggest public policy issues of the day, in some respects.

And so, on the one hand, I would say that the big overarching question I think people don't really ask enough is the question of who decides.

And actually, I hadn't planned on using this as a plug for Cloudflare and Matthew.

But, you know, he raised this question directly after the Charlottesville killing of Heather Heyer.

And he's like, why should we be making these decisions?

And, you know, why aren't these public policy decisions that should be made by a democratic community, as to who gets access to the Internet and what the rules should be, whether it's content or simply the access issues?

And I think that's the big overarching question we don't address enough. We tend, I feel, to skip directly to the companies aren't doing enough, governments are too politicized, this, that, or the other thing, without asking that fundamental question: in a democratic society, who should be making decisions, particularly given the choke points that private actors occupy? Chokehold is not the right word; choke point, that point of gatekeeping, is the better term.

If companies are the gatekeepers for content, that's great. But should they be making the decisions as to what gets through that gate?

And what we get access to?

That is a big question. And, moving to the second part of your question, the regulatory environment, at least in the United States, is completely messed up, because it's so politicized.

And so either the debate is: we're conservative, and the companies are treating us poorly, because they're all of these wine-cave-drinking liberals in Silicon Valley, or the peninsula, or whatever.

Or it's focused on other questions, about hate speech and disinformation, that are themselves politicized, and there's pressure on the companies to do things, which I'm completely sympathetic to.

But we still haven't answered the question of in a democratic society, who should be making those regulatory decisions.

I think if you compare the debate in the United States, to the debate in Europe, I mean, the European one, not that it's a perfect debate.

But, you know, they're having a pretty sophisticated regulatory discussion about the different options that are available, at least in Brussels, to the European Commission.

And that's just not a debate we're having; we're so wrapped up in the politicization. Hopefully after November, one can hope or dream, things will change, at least in the United States.

But right now, the model for having these discussions, which I think at root are about who decides, is frankly a European-led conversation.

So there are a couple of different directions here; I'm trying to figure out which road to go down.

So let me actually take the road I was going to go down second, and switch it to first. And that is, on that point: I think you can't have too many conversations in this space before the US First Amendment comes up.

And not to make this discussion too lawyerly, but it comes up.

And I actually find it is brought up more by non-US people, who say: we have to remind you, the rest of the world does not have the First Amendment, does not have this at its core. I always think that's a bit of a misnomer, because there are a lot of international documents, treaties on human rights, all of those sorts of things, that have similar standards.

I think the value of the First Amendment is not in the principle it states, but in the way the United States has worked through how to deal with problematic anecdotes within a larger, thoughtful process and set of principles: even though I may not like this speech, am I comfortable regulating this speech in a way that I'd also be comfortable regulating speech that I like?

And I think there's enough jurisprudence in the United States around the First Amendment to sort of walk us through that in a way that other countries in the world may not have grappled with on a deeper level.

So I think that's a helpful tool and framework, even if the principles are different.

But I think so much of that is just getting lost here, even though I think it's highly applicable. Again, even if we're not saying the principle of free expression to the degree of the First Amendment is out there, the way that courts have gone about developing a structure for evaluating it has been helpful, and that just does not seem to be carrying the day in how we're looking at some of these issues online.

Yeah, so I'm tempted to go down the road where we could really nerd out, since you basically mentioned John Rawls, but I'm going to press stop.

Well, that's when we've lost the audience. No, I'm not gonna go there.

Feel free. I'll keep you in bounds. Not gonna nerd out that much.

But what I would say is, first, just comparing human rights law on freedom of expression to the First Amendment, I think it's actually useful to look at those side by side, even if it's just a matter of principle and framework.

So the First Amendment says Congress shall make no law abridging the freedom of speech or of the press.

That's super interesting, just if you think about that language, because it's a constraint on government.

And that's fine.

And jurisprudence has developed in the way that you suggest. And then you have freedom of expression under the International Covenant on Civil and Political Rights, a treaty that the United States is a party to; under Article 19, it flips it around.

It doesn't say government shall not do X; it says everyone shall enjoy the right to seek, receive and impart information and ideas of all kinds, regardless of frontiers and through any media.

That's not exactly a quote, but it's close. And the thing I think is interesting about that is, it sort of forces us to look at the individual's right.

So it's not just what government can't do; it's that individuals have the right, both as a speaker, to impart information, and as an audience, or a listener, whatever you want to call it, to seek and receive.

And I think that jurisprudence has developed in a way that might actually be a little bit more robust in some ways than First Amendment jurisprudence, because built into the law is this idea of speech as both speaker and audience.

And that's, I think, really important. And there's actually a rich jurisprudence that governments and regional human rights courts have grappled with over the years.

Interestingly, they occasionally refer to US First Amendment jurisprudence.

So New York Times v. Sullivan is the most quoted US Supreme Court case; it's actually the most quoted foreign case in courts anywhere in the world.

And that's, I mean, as Americans, we should be proud of that.

But that's not to say this same set of values doesn't exist elsewhere.

So this First Amendment exceptionalism that we often hear people talk about in the United States is, I think, overstated to a significant degree. It doesn't mean there aren't restrictions on speech in other parts of the democratic world that we might find problematic under the First Amendment.

But they're not necessarily repressive.

Different democratic societies just deal with speech issues in different ways.

Yeah. You've got me thinking: maybe now that you're done with the job you just ended and starting to think ahead, there are, in the US, a lot of regulatory proposals right now on content moderation and speech and amendments to CDA 230.

Those are gonna have to go through, and then we'll end up in the courts at some point. It's about making sure the courts are ready for those very different sorts of things, and can come out with something like the New York Times case, or something else that sets us on the right path and thinks about this the right way.

So you can start working on that. That would, that would be great.

Try to get that set up. Well, it's funny you mention that, because with Section 230 and all of the noise around it, there is a process at the FCC right now, where they issued a call for public comments, as I'm sure you know. In that comment process, my students and I, and a colleague of mine at UC Irvine, Professor Jack Lerner, our two clinics, submitted a comment that said: look, of course, the First Amendment is kind of the key question here.

But, you know, in thinking this through, the United States also has obligations under human rights law.

Here's how you should think this through as a matter of US commitments under Article 19 of the ICCPR.

So, I mean, I'm not particularly hopeful that that will be the kind of guiding principle for this discussion.

But, you know, one can dream a little bit. Yeah, and we certainly made a submission to that process as well, because we do think that figuring this out is hugely important.

And going back to put a finer point on it, because I think you nailed it with the question of who decides: there are so many different links on the chain where you could put this, choke points, as you suggested. So, who decides?

Is it everybody? Is it particularly well placed people?

There's that. And then the second one, which we drifted into a little bit, is: how do you decide?

And I do have to at least give you the gold star, and you can prepare your Oscar acceptance speech.

I think the tweet of the year that I read was one that you wrote, taking a very complex issue and putting it on a bumper sticker, because it's so difficult to do that with all of this.

But, you know, it was around the time that someone had released a deepfake of Nancy Pelosi looking intoxicated. It was spreading across Facebook, and everybody looking at that anecdote was like, you have to get that down.

Why wouldn't you get that down?

It's obviously not true. It's obviously disparaging. It's got to go.

And your tweet said something along these lines; if I were better at this, I would be able to put it on the screen right now.

But my production value isn't there yet.

But it sort of said, here's your homework, like a true professor: come up with the rule that can deal with this circumstance but doesn't also go after satire and comedy and all sorts of other things in a way that would be hugely problematic.

Because, you know, characterizations of political figures that make them look a little bit foolish, even if you distort some language, dub over them, do anything like that, are prevalent.

And where do you draw that line? I think that's what so many people don't see: they point to a circumstance and say, this seems easy.

But if you're sitting at a trust and safety terminal, looking at the variety of different things that come in, that can be really, really hard to do.

So congrats to you on that, because I think it really, and I saw people then picking up on it, shifted that conversation that day, which had taken on sort of a pitchforks-and-torches tone.

So yeah, well, you know, if you could write that for my promotion in 10 years, that'd be great: he tweets exceptionally well, he takes really hard concepts. That might be the underreported part for tenure boards, but increasingly, you know, publications and footnotes are out.

Yeah, putting difficult academic concepts in a tweet is in, and David nails that.

Right. So I'm glad you mentioned that, because, on the one hand, anybody who studies content moderation, or the gatekeeping function we're talking about, knows that it's hard. Content moderation is hard; it's hard at the level of rulemaking.

Because you do have to draw lines.

I mean, there's no question for the companies. And, by the way, I've been advocating for the companies to use something called the UN Guiding Principles on Business and Human Rights, which is a non-binding set of principles that says states need to protect space for freedom of expression and other human rights.

But corporate actors also have a responsibility, if not an obligation, because human rights law applies to states, not directly to companies. They have a responsibility to ensure that the steps they take don't interfere with the basic rights of their users, or the public, or whomever. And it's not directed just at tech; it covers all corporate actors out there.

And the drawing of the lines for rules is really hard.

And I just think it's important for the public conversation to recognize that any line drawing involves a set of trade-offs.

And that's fine. It's just: who should be making those trade-offs?

How should we be discussing those trade offs? Where should we be falling in terms of implementation of the actual rules that are adopted?

Those are just hard questions. And it's not that we shouldn't draw those lines.

I just want people to focus on that question. I mean, the Nancy Pelosi example was perfect, because who doesn't like a good mocking of a public figure?

I mean, that's part of our political tradition. And how do we get a company to separate malicious intent from satirical intent, or other kinds of motivation that might be problematic from those that are not?

Now, that doesn't mean that in some spaces it can't be relatively clear how to do that.

And a perfect example today is election interference, voter suppression, genocide promotion, those kinds of things. Those are easier: look, this kind of content is part of a pattern of suppression of a democratic right; we can understand that more easily.

And maybe, particularly given the scale of some of the platforms, we should be taking a more aggressive approach to those things.

But, you know, again, that's still private actors making those decisions.

And we have to figure out, like, what does the regulatory environment look like for those kinds of questions?

Well, obviously, we could keep going. This has been great.

And we're down to about two minutes left. So I want to give you one final question, and at least some time to take us out.

This has been great, and the time has really flown. But you did come to the end of your tenure in the role of Special Rapporteur.

I've got to think that led you to some reflection. We're not only looking back 10 years, but forward 10 years. If you had a magic wand, if you were able to push through, on an international basis, any set of two or three rules or standards or changes you'd like to see implemented, for the health of the Internet and the health of the issues you were working on, are there any clear should-dos you'll take away and advocate for?

Yeah, I mean, there are many, but the two I would mention are kind of imperative and immediate in terms of the concern.

I think one is, when it comes to decision-making over content, I really do hope that governments, and that means international organizations too, don't move down the path of saying you have to deal with this content but not that content. Viewpoint-discriminatory rules always end up supporting the more repressive side.

And so what I really would urge public authorities to think through carefully is: how do we make this world more transparent and more information-rich, so that the debate isn't so asymmetric? Because right now, basically, the private sector has all the information, and regulators have very little of it.

So for me, smart regulation means improved disclosure, transparency, and so forth.

The other part, you've got about 10 seconds left for number two, so it may have to be a headline.

The other part is: we need rules for the private surveillance industry.

And that means export controls, but it also means real bite, real teeth, in rules that apply to private surveillance companies like NSO Group and others around the world.

Well, that's perfect. Perfect timing. David, thanks so much for your time.

This has been absolutely great. Really appreciate it.