Private Companies, Public Squares
Originally aired on June 10, 2020 @ 3:30 AM - 4:00 AM EDT
Best of: Internet Summit 2017
- Daphne Keller - Director, Stanford Center for Internet & Society
- Lee Rowland - Senior Staff Attorney, ACLU Speech, Privacy & Technology Project
- Moderator: Matthew Prince - Co-founder and CEO, Cloudflare
Transcript (Beta)
🎵Outro Music🎵 The last panel I did were the pessimists.
I kind of think you guys are the optimists.
No pressure. So, Daphne Keller, to my left, is the Director of Intermediary Liability at the Stanford Center for Internet & Society.
She previously was Associate General Counsel at Google, working on intermediary liability issues.
And then Lee Rowland, over there, serves as lead counsel at the ACLU's Speech, Privacy, and Technology Project in federal First Amendment cases.
She's written a number of amicus briefs, you know, fighting for free speech rights.
And they are both people that I really admire in the world of technology and the law.
And so, I mean, I think what... Technology and the law seems like they're coming into collision more and more.
Tech companies are being asked to regulate what content is online.
And I think for a largely non-lawyer audience, could you just give some foundation on what are the basic rules when you have content which is on your network?
I'll take the first stab. So I think the first thing to say is that you guys kind of won, in the sense that we have a communications platform 2.0 that makes the First Amendment almost quaint.
You know, I was goofy enough to go to law school and think I wanted to do First Amendment work.
You know, sure, you have a women's march or a Charlottesville or a soapbox in a park every once and again, but I think we all recognize that the vast majority of speech that we exchange happens online.
And when it's hosted by private companies, the First Amendment doesn't constrain their choices of curating, of censoring, of blocking, of who they do business with.
So this really is a space that's governed entirely by norms and the individual choices of people like Matthew, right?
And companies like Cloudflare.
Sometimes I wake up in a bad mood. Yeah, exactly. And I think we all need to, you know, in the wake of Cloudflare's decision to take down its services for The Daily Stormer, which is a neo-Nazi website, I think Matthew has done us all a service by penning a piece that said, it's scary that I have this power, and I exercised it.
And I think we all should be scared, right? At least from a free speech values point of view.
We now have a completely unaccountable private medium of communication, which is where all of us as Americans actually do our communicating and talking.
And not just Americans, that's obviously globally true. Well, and it's even beyond that.
Like there are shields and protections for companies for that.
And so, you know, what is sort of, first of all, what is intermediary liability?
Like, I mean, it's sort of a meaty phrase. And why was that a position at Google?
And why is that a position at Stanford? Well, it's a horrible word.
Nobody knows what it means. If anybody has a better idea, please tell me.
It's the set of laws that tell platforms, from, you know, Facebook to Cloudflare, when they have to take down user speech because that speech is illegal.
And as Lee was alluding to, in the US, for most things outside of copyright, platforms don't have to take anything down, or they can choose to take anything down, and it's entirely up to them.
But outside the US and for copyright in the US, the rule is generally that platforms, once they find out about something and it breaks the law, they have to take it down or face liability themselves.
And the problem with that, as any amateur student of game theory might recognize, is the motivation is every time anybody alleges that something's illegal, the easiest and safest course for the platform is always to take it down.
And there are reams of research showing that's what happens a lot of the time.
So the rules about when platforms have to do this are very consequential for actual, practical, free speech rights of users on the Internet.
And we can't, by the way, undervalue how much these rules that Daphne just explained have completely created the ecosystem online that we now know and love and often hate.
But Yelp would not exist without intermediary liability, without the ability to be immune from lawsuits filed by whoever decides to slander a business on Yelp.
And any kind of content provider platform, we enjoy access to those precisely because of these laws passed by Congress in the late 90s to make sure that content hosts weren't liable for everything somebody said on their platform.
So, you know, in both the U.S.
and the EU now, these laws are coming under threat. There's legislation pending in the U.S. right now to carve out a portion of intermediary liability and impose it on platforms.
And it'd be great to talk about that. And I think that we tend to focus a lot on what's going on in the U.S., but Germany, at the last G7 meeting, said one of its top priorities is eliminating intermediary liability.
So you're supposed to be the optimist, and you're telling me that we won.
Doesn't feel like we're going to be winning for very long. That was your idea, that we're going to be the optimist.
Calling on a lawyer for that. Lee said we won, so I was like, cool, we can go home.
My most optimistic take on it is that there's an opportunity here for companies, particularly the companies that have a deep tie to the United States, to make sure that we don't allow countries like Germany, which have much less protective speech regimes, to ratchet everything down.
It's always a one-way ratchet to the lowest common denominator, right?
And you mentioned Germany. Okay, in Germany, you generally cannot have a swastika or agree with Nazi ideology.
But then you've got to put in the Turkeys of the world, right?
And the Thailands and Chinas and North Koreas, right? If they're actually participating.
And you can see how the multinational pressures really do risk this lowest common denominator stuff, where everybody gets to pick off the speech that's illegal in their country.
And so I really think that companies like Cloudflare have an opportunity, and I'd go farther and say a duty, to uphold some of the values and norms that reflect our First Amendment landscape.
Because I think we want to look in ourselves and answer the really hard question that maybe with this many minutes left, it's too early to pose the big question, but do we want a world where Nazis cannot have a website, right?
Do we want a world where if you say no, it's a de facto veto on the ability of somebody like the Daily Stormer to be online?
And it's not a comfortable thing to talk about, because nobody wants to see Nazi ideology.
But I will say that I do want the ability to see and find speech that reflects actual human beliefs, because that's how we know it's out there, right?
It doesn't benefit us to be blindsided by the private organizing of white supremacists.
It doesn't, there's not an upshot for society. Enforcing that kind of purity only hides those beliefs.
It doesn't change them. And so I think Cloudflare's in a, and companies like you that are part of the infrastructure of the web, whether it's DNS or hosting services, have a fundamental role in saying, we don't endorse this speech.
We're not gratuitously looking at content to see whether or not you're worthy of a URL or hosting security.
What we're doing is providing a neutral platform, and it's other people's job to look at that speech, to counter it with better speech, and to hold the people speaking accountable rather than those providing a platform.
I think there's also sort of an ugly dynamic between governments and major platforms.
Avril was talking earlier about how private companies are taking over functions that historically have been government functions, like running the public square.
And that's weird because they're not subject to the constraints that governments are.
They're not subject to the First Amendment.
They're not subject to the Fourth Amendment and constraints on surveillance.
That creates sort of an unfortunate opportunity where private companies can do things that governments can't but maybe want to.
And so pressuring private companies to do them or taking advantage of the fact that they already do them, like collecting user data, becomes a very effective way for governments to get more power than their constitutions would otherwise allow them to have.
And as an example of this, you were talking about pressures in Europe. The European Commission reached an agreement last year with four big platforms, YouTube, Microsoft, Facebook, Twitter, the EU Hate Speech Code of Conduct.
And the agreement was that they would voluntarily take down hate speech as described in this agreement, which is not the same thing as hate speech as described in the law.
It's going beyond what the law would actually prohibit, or that's how most people interpret it.
And they're voluntarily agreeing with the government to take down lawful speech.
And many Europeans and many Americans think that's a very strange function of government, to go out and use the cloud of government to get this voluntary removal of lawful speech from private platforms.
And to take decisions about controversial speech that could have been litigated through the courts and held to some kind of democratically accountable standard of what speech we want in our societies, and instead put them in the hands of a team in Mountain View or Menlo Park, or here.
Yeah, over here. No pressure. So is this a fight that we can win?
Like, four years ago, I was like, free expression is inherent to the Internet.
It's the way that the world works. I'm the son of a journalist.
I believe in this. You know, as we now operate in 70 countries around the world, we have equipment in all of those countries.
Very few of them have anything close to the free expression ideals that the one that we're sitting in right now does.
And so they seem like they have said, we're going to start regulating you.
Like, "don't be evil" doesn't translate well into German. It sounds like a joke. And that's a very, very difficult place, then, to operate from, because we have people and machines and all kinds of things in those places.
What's the argument that you think persuades the rest of the world that, yeah, we should be a neutral platform?
I'm not sure that I'm optimistic enough to suggest we can convince the rest of the world.
What I think is important is that at least for, and I don't mean to sound like such an anti-globalist, but we have to recognize that these borders have real impacts on speech.
But I think that at least for American consumers and companies giving Internet access and backbone to American Internet users, that we do have the ability to get on a visceral level for people to understand that some things that create the backbone and the infrastructure of the Internet are so essential that we don't have this kind of race to advocate for ad hoc moral panics.
And what I think of are things that we generally think of as common carriers, right?
No one is out there picketing AT&T because Richard Spencer has a cell phone.
I don't, by the way, have any idea who Richard Spencer has a cell phone account with.
But let's say it was AT&T or T-Mobile. You don't see people out there in front of those buildings or coming at them like they came at Cloudflare to say, we demand that you deny service to people based on their ideologies.
But there are, I mean, we have had a tradition of, if you analogize back, newspapers had an editorial perspective and you bought the newspaper that was the conservative newspaper or the liberal newspaper, is Facebook like the modern newspaper?
Or are they like the printing press? Or what is the analogy that makes sense there?
Or are they something new? I think a lot of people, again, particularly in Europe, have an inclination to say, Facebook needs to admit that it's a media company.
I've heard that sentence any number of times. And the difference between Facebook and a media company is that the media company hand-selected everything that it published, whereas Facebook is an open platform where anybody can come along and say anything without being policed.
In this particular case with The Daily Stormer, if you put up a link to The Daily Stormer and you said something which was supportive of it, Facebook took it down.
If you put up something of The Daily Stormer link and you said something that was critical of it, they left it up.
That sounds like a media company. So they take down a lot.
They exercise a ton of discretion and are effectively taking a political position on questions like that.
But that's not the same as saying that, like a broadcaster, they could be legally accountable for every single thing that is transmitted on their platform.
With that standard, they could never have an open platform where we can come along and say whatever we want and then maybe they check it later.
But I do think that people on a gut level generally hold newspapers accountable for their political worldview.
And I think whatever the complications of kind of slotting Facebook into our existing analogs, I do think that that's fundamentally different than what you do, which is Facebook already exists as a content review company, right?
And sure, they're a platform, but the whole time they've had algorithms and curation in terms of who they're elevating for you to see.
Obviously, every one of those is a choice that affects what you hear, who you reach, when you speak.
Yeah, "it's the algorithm, it's neutral" has always struck me as kind of, you know...
Horseshit. Yeah, whereas you guys, algorithms are made by humans, in fact, right?
They don't come down from us on high, right? I mean, they are created and choices are made when they're designed.
Does it surprise you there's not a Fox News search engine?
Like Big American Flag, Bald Eagle, Only American stuff, totally serious.
Yeah, it actually does. You'd get 5% of search overnight, and that's a $10 billion company.
Yeah, actually, and this has been a kind of a constant conversation in the net neutrality debates where Internet service providers have said, look, we don't discriminate.
We offer access to all endpoints on the web and that's what we'll always do, but we want the right to not take you to a certain website, right?
And so there really has been this open question about, can you have a bespoke ISP, right?
Will it work in the market? Can you have the Disney ISP that makes for damn sure you never see porn?
Maybe, nobody's done it, which is fascinating and I think we're now in a news ecosystem where the kind of spread of fake news, right?
Or whatever that means, or kind of people's willingness to buy into their own bubble, right?
And just continually replicate their bubble, which Facebook is doing, perhaps people are less honest about the bubble they're creating by doing that.
There actually does seem to be enough of a demand that I am a little surprised that no one has kind of tried the politically bespoke ISP.
I only wanna see Republican search results. But I think the fact that there isn't a Fox News search engine is really important for these public forum conversations because a lot of the people who are saying, wait, Facebook shouldn't be able to kick off my political speech or Twitter shouldn't or whatever, they're saying the only place to reach my friends is on Facebook.
There's not a competing place to go that is equivalent.
And people say the same things about search engines.
As a former Google lawyer, I have no opinion on that. But it really matters whether there's someplace else to go.
And if there were a place to go to be conservative and a place to be liberal and a place to like cats and a place to like dogs, that would be one world.
But if there's only one place to go, then it's easier to imagine having some government-like obligations on them, which I think is the claim that was made about Cloudflare in The Times today or yesterday, wasn't it?
Well, and that is the question. Is there a scale? Like, we have the right to kick anyone else off.
You sure do. Is there any scale at which you would think maybe it's not the right time?
I mean, we've got their, Steve Bannon is sort of proposing that giant Internet companies should be regulated as utilities.
Is there a time at which, and in the telecommunications world, we had common carrier status.
Is there a time when that's the right way that this should be thought of?
Do you get, if you have monopoly status and you are the only place, you are Facebook and you are the only place to truly be able to reach that audience, does that mean that you have a different set of obligations?
I don't think that works.
I think we could apply that logic to your business. No offense. And sort of say, we're going to take away your discretion to kick people off because you're part of the infrastructure and people need that.
But for the service that something like Facebook or Twitter or Google offers, the service is creating a community that people want to come to and they want to come there because it's not full of dick pics and it's not full of hate speech and it's not full of bullying.
And so, without that kind of curation, effectively, they would no longer have the value proposition for their users.
People wouldn't want to be there. So, the ACLU has been, you know, a real force for free expression in this country.
It's the American Civil Liberties Union.
Who's fighting for the free and open web outside of this country?
And could you give me their phone number? So, there are a lot of organizations around the world that work on this.
For reasons I don't fully understand, some of the best experts and organizations I've come across on this are in Brazil, Argentina, or India.
But, you know, in Europe, there are much smaller organizations.
They don't tend to have the presence in Brussels that we expect NGOs and advocacy organizations to have here.
And I think that's something that's important and something that's worth sort of paying attention to.
I think it's also important for, you know, European regulators hear from Facebook, and then Google, and then Facebook, and then Google, and they hate it.
It's important for smaller companies, for, you know, journalistic interests, anybody who's being affected by this, to show up and let them know because they really don't hear that perspective very much except from people they don't trust.
What are the arguments that you've found are persuasive in these conversations to regulators?
There's the old J.C. Watts line. The former congressman from Oklahoma used to say that when you're explaining, you're losing.
Yeah. And I feel a lot of time when I say, well, we're not, we're sort of deep infrastructure, so we're something different.
And they're like, no, your tech company must regulate, right? But how do you, what works?
Like, where do you, when have you seen sort of the light bulb go on in people's heads?
I think people get it when you say, you are sacrificing sovereignty and the role of your own lawmakers by standing back and asking an American company to decide this for you.
I think that's an important argument to put in front of people.
I think in some cases, making the economic arguments, saying, you know, the, as Anupam Chandra says, law made Silicon Valley, you know, without these kinds of laws, you will not have a flourishing Internet sector.
Sometimes those work.
I think outside the U.S., American lawyers running around yelling about the First Amendment don't get a whole lot of respect or traction.
Nope. But there are other important points that you can make.
Yeah, and I think if we're talking domestically, right, and how we could convince legislators to think about rules that constrain it, you know, there's an interesting history: the rule we were talking about, intermediary liability, actually comes from a law called the Communications Decency Act, which is itself a hilarious irony, because it had a moral-majoritarian half, which the ACLU sued over and got knocked down in a seminal case.
So just the non- Thanks for leaving the other part. Right. So just the non-decency part is left of the Communications Decency Act, which is a great joy in my life.
But at that time, in the late 90s, when Congress got together and passed Communications Decency Act, it was overwhelmingly bipartisan.
One of the reasons was because Republicans and conservatives saw themselves hoisted on that petard because they know Silicon Valley is liberal.
And that's kind of fallen off, I think, in the last 15 years where there's been a lot of moral panics about trafficking, in particular, online, and then you tend to get these alliances, and this is as- Human trafficking, to be clear.
Yes, sorry. Yes, human trafficking, and particularly with focus on women and girls.
And so, you know, at least in my life, I find that some of the most unholy alliances that are powerful and tough to stop come when kind of women's rights advocates on the left and moral majoritarians on the right get together and agree on something.
And I think there are a couple of ways to remind people, and actually, I think your action- This is a piece of legislation which is currently being proposed that would carve out for sex trafficking the intermediary liability protections.
And it looks fairly likely to pass.
It may. I mean, the best prognostications are that it might. And this will be the first time since the late 90s that Congress has successfully amended this Communications Decency Act that gave us the intermediary liability and safe harbor.
So I think, you know, I'm not going to get too excited and say that we're going to convince people on SESTA, which is the name of this proposed bill, but I do actually think that the Daily Stormer example, in a weird way, is a helpful reminder to people on the right that their ox is gored too.
And at least in free expression advocacy work, I can tell you the only thing that's ever effective, besides a lawsuit that goes to the Supreme Court and strikes down whatever moral panic legislation Congress passes, which is a thing, and we will do it again as needed, but the only argument that works in advance is reminding people that they might be the goose or the gander next time, right?
You have a dog in this fight.
When other people get to decide the content, you might not always be on the right side, right?
And so we mentioned the kind of, Daphne talked about how Facebook agreed to the hate speech rules, right?
Well, you know, The Intercept wrote a devastating piece two weeks ago describing all of the progressive political voices that have been silenced under that hate speech code.
Largely Palestinian rights activists and media, you know, members of the boycott, divest and sanctions movement against Israel, but clearly that was not Facebook's intent, right?
And we know that they over-censor and they're over-broad, so I think highlighting those examples, reminding people that there is a reason that this road is a one-way ratchet towards censorship and people are going to over-censor and at some point the things you care about too will become tomorrow's moral panic, those are the only arguments I found that really resonate with folks in getting together against censorship.
I want to leave time for a question or two, so if you have a question, raise your hand and we'll get you a mic, but just quickly, what are one or two things that each of you are worried about, I'm gonna let you be pessimist for a second, that people aren't thinking about enough right now?
Well, some people are thinking about this, but there's tremendous pressure on platforms to build technical filters to automatically find and suppress content and there is widespread belief that if they can build a self-driving car, then they can build technology that identifies terrorist speech and companies are under tremendous pressure and they wind up agreeing and saying, yes, we're gonna try, we'll build something that does the best job it can and the result is the kinds of stories that you're seeing in this newspaper today about videos documenting atrocities in Syria being taken down, that were being used by human rights advocates because the machines or the people or the machines plus the people flagged them as terrorist speech.
So I think the push for mechanized content removal is one of the most dangerous things.
I agree totally and the one I'd highlight goes hand in hand with that, which is apropos of what we were talking about earlier, that companies, private entities aren't constrained by constitutional rules, Daphne mentioned the First and Fourth Amendment, well, another big concept is due process, right?
And we have the idea in a free society, if someone censors our speech, we get to say, hey, wait a second, right?
Well, good luck getting an answer to, hey, wait a second, when Facebook takes down your profile or a piece of speech and finding out what term of service did I violate?
Wait, no, actually my breastfeeding picture is allowed or no, when I put up that slur, I was describing hate speech that was directed at me this morning, right?
Which is happening all the time.
People of color literally talking about experiences and being called a slur and they are censored.
So I think hand in hand, the kind of algorithmic ratchet, combined with the lack of due process because there's not a built in advocacy arm, all the money and all the resources are going to the censorship piece and none on the due process.
And EFF has done a lot with, like, the Manila Principles and others that are really good.
Come on, some quick questions. We're out of time, but I'm the CEO.
One of the biggest challenges with the open Internet, I think, is the fact that it is open.
You can basically go back to the origin and find where content is originating.
Now, there is something that's out there, of course, the dark web, Tor, where you don't really know where the content is or where it's coming from.
And in a world where the website is encrypted on the host, it's encrypted in transit and then it basically pops out the other side in someone's browser.
Is that potentially an answer? Where at that point, the censorship ability goes down dramatically, the free speech of regulating that essentially becomes much more difficult because we don't even know where it comes from.
Is that a potential answer to some of these questions?
I think it addresses the free speech values problem, but just as a practical matter.
I mean, Daphne mentioned earlier that people use Facebook because it generally isn't overwhelmed with porn and attacks.
And I just wanna be honest that I think for the average Internet user who's using it casually, probably that's gonna create an ecosystem that is less attractive in many regards.
I've been on the dark web, it's a dark place. I wish I'd spent less time there.
So, I mean, sure, if you want anonymity, that's great, but is it an actually useful web that is both free and effective as a medium of communication?
My answer would be no. And accessible to ordinary people. Right. Thank you so much.
Thanks for being here. Thank you, Matt. Sorry, Ashley. We're gonna take our glasses off so we can save what's in common.
Cloudflare Stream makes streaming high-quality video at scale easy and affordable.
A simple drag-and-drop interface allows you to easily upload your video for streaming.
Cloudflare Stream will automatically decide on the best video encoding format for your video files to be streamed on any device or browser.
When you're ready to share your videos, click the link button and select Copy.
A unique URL can now be shared or published in any web browser.
Your videos are delivered across Cloudflare's expansive global network and streamed to your viewers using the Stream Player.
Stream provides embedded code for every video. You can also customize the desired default playback behavior before embedding code to your page.
Once you've copied the embed code, simply add it to your page. The Stream Player is now embedded in your page and your video is ready to be streamed.
That's it.
Cloudflare Stream makes video streaming easy and affordable. Check out the pricing section to get started.