The Reality of U.S. Privacy Law: Does It Exist?
Originally aired on December 27, 2021 @ 1:00 PM - 1:30 PM EST
Best of: Internet Summit 2018
- Eric Goldman - Professor of Law, Santa Clara University School of Law
- Terrell McSweeny - Former Commissioner, FTC
- Moderator: Alissa Starzak - Head of Policy, Cloudflare
English
Internet Summit
Transcript (Beta)
Okay, I'm here with Terrell McSweeny, who's a former FTC commissioner and is now a partner at Covington & Burling, and Eric Goldman, who is a professor of law at Santa Clara and the co-director of the High Tech Law Institute.
And we are here to talk about privacy.
And I think one of the things I heard all morning was that privacy kept coming up; everybody wanted to preempt us.
But the conversations on privacy aren't only happening here, we're seeing them everywhere.
So I'm the head of public policy, and I'm in DC.
And we are hearing privacy come up constantly: a privacy framework, different divisions of the Commerce Department coming up with privacy standards, different entities coming up with their own privacy standards.
So I guess the question that I want to start with is, what's going on in privacy?
Why are we seeing so much activity?
Are we actually going to see something productive in the United States on privacy?
I'll start with Terrell. All right. Thanks. Well, thanks so much for having me.
It's a pleasure to be here. It's my first Cloudflare Internet Summit, so I really appreciate the invitation.
The short answer to the question is we're seeing all this activity on privacy, partly because of the impact of the GDPR, which came into effect in May, partly because of the passage of the California law here in California, and partly because I think people are really coming to terms with the level of our connectivity and its potential implications for them.
So it's having an effect across the board, and so there is definitely a very vibrant conversation happening in Washington right now.
As you point out, it isn't settled by any means.
There's something like 10 bills pending in Congress right now.
There's the NIST efforts, the NTIA efforts. There's associations and companies coming up with their own principles, so it's a really dynamic space.
Eric? Yeah. The only thing I'd add to that is that we're really in the midst of a structural change in our economy, where so much of the value that's being created on the part of companies is coming from the data they know about their customers.
We see the largest companies in the globe that have ever existed in history powering a lot of their economic model on data, and so it's inevitable that when there's that much power and money in some particular asset, everyone's going to be interested in how that asset's being used, and maybe the regulators want to figure out how it should be controlled.
So that raises an interesting question on the regulation side.
Should we be regulating it? What are the benefits of coming up with a privacy policy from the United States standpoint?
Well, I went first last time.
Should I go first again, or should we- Either way. Okay. Well, let's start with benefits.
One obvious benefit is addressing what I think of as the hodgepodge of American privacy law.
Many of you in the room are probably pretty familiar with what I'm talking about.
The more complimentary way to talk about it is a sector-based approach, which used to make a lot of sense when data lived in a sector and didn't really move out of it, and is increasingly making less sense.
So we have alphabet soup regulation at the federal level.
We have 50 different state breach notification laws.
We have 37 or 38 different state laws around student data. We have major reform of privacy law here in California, some other states following suit.
We also have different laws around biometrics and other things in different states.
So that's a hodgepodge. So one benefit would be articulating a more consistent approach, and especially avoiding conflicting approaches, which can be costly.
I think another benefit is that we really need to have a better way to articulate what the US approach is to the rest of the world.
The fact is, the GDPR, which is pretty easy to understand, has been now looked to by major markets around the world as a potential model.
And we've kind of lost the argument that our approach here in the US is a better approach, because it's really hard to explain what it is, and it's hard to push back on the notion that the US is the Wild West of data.
I don't happen to believe it is, as a former FTC commissioner, but it's hard to win that conversation globally.
So I think that's another huge benefit. And a third benefit would be really coming up with a more predictable framework that reduces costs, is more consistent, and reduces conflicts as well.
Yeah, the only thing I'd add is that, especially for the incumbent companies, the benefit of knowing the rules of the road is that once they understand what they're expected to do, they can do it, and it's no big deal.
The benefits for smaller, medium-sized companies may not be as clear, but I think we're going to get to that.
So when we talk about wanting to know the rules of the road, it creates predictability, and it also creates incentives for people to understand what the rules are, and then build off them.
But then there are some costs that will come with that as well. So the benefits of understanding the rules of the road, obviously you mentioned GDPR, we're seeing this pop up around the world of different privacy laws.
So can we actually get to a place where there's a consistent set of rules of the road?
Or are we going to be in a world of lots of different privacy laws around the world?
It all depends on, well, who knows what it depends on, actually: where you're operating, where you're targeting, what markets you're targeting. What does that look like?
Well, I think we want to get at least to a place where we have an interoperable approach.
And one of the things that I think is really crucial about reflecting a principled US approach on these issues, is that we need to be able to articulate a framework that really expresses our values.
Right now, what we're seeing are some pretty major marketplaces really engage in what I think of as privacy overcompensation, data protectionism and localization that makes it harder to move data around.
Now the big companies can deal with some of the impacts of that kind of regulation because they're enormous.
But it has a real consequence, especially for smaller innovators.
And I think it has a consequence for consumers too.
I've spent a lot of my career on consumer protection. And I think what most consumers want is their data to be managed safely and securely, kind of consistent with their expectation.
And then they want to be able to move it around.
They want to be able to use their services or they want to be able to port it around.
They want these controls. And so I think we want to get to a framework where we have that.
I like the way that Terrell described it about trying to aim for interoperability.
The ideal would be that we have a uniform law across all jurisdictions; then there wouldn't be the need to build different systems or to build different rule sets within the same company.
But that's not going to be achievable, I think, for two reasons.
One, the GDPR probably can't be implemented in the U.S.
the way it's structured. I think there are some legal limits on what we can do in the U.S.
that would make it impossible to do a GDPR replacement. And I think that the GDPR does take a one-size-fits-all approach.
As much as we might malign the sectoral approach, there are some benefits to optimizing the privacy rules in particular contexts.
There are particular industry niches where we might want more protection or less protection or we might want different protection.
And trying to come up with a one-size-fits-all approach, which is basically what the GDPR does, probably won't be an optimal outcome.
So the likely scenario is that the U.S.
will have limits on what it could do, similar to the GDPR, because of its framework, and it will probably try to optimize the law for the edge cases in different niches.
And so I think the best we can hope for is interoperability. And if we don't achieve that, then we actually do create some major barriers for transborder operations.
Well, and what you've just described, the idea of different sectoral limitations, different laws in different places, it's incredibly complex from a regulatory environment.
I guess I want to touch on something else that gets into that question, too.
What information are we trying to protect? So the question on privacy, we're having all these debates, everybody cares about privacy, everybody wants to talk about it, but we don't actually always talk about what exactly...
What do people care about from a privacy standpoint?
What is the point? What are we trying to accomplish with it?
Do consumers care? Why is everybody looking at me all the time?
You're the best. I mean, I've gone first on all these questions. Why don't you take this one?
So we really don't have a single answer to that question, and in part, it's because of the semantic weakness of the term privacy.
That means different things to different people.
It means different things in different contexts.
And so when we talk about privacy, usually it's a melange of policy interests and consumer considerations that get all grouped together.
And having to unpack them and peel them apart into different layers then kind of reduces the fun of the conversation at that point.
And then we're like, well, let's talk about this context.
And then all of a sudden, our time is gone.
So the most common thing that people talk about when they talk about privacy is that they want the control over their data.
They want the ability to be able to say, do this and don't do that.
But the reality is that most consumers don't really want to take the time to manage their data.
They don't want to actually invest in controlling it.
They kind of want companies to read their minds, to figure out what they think the deal should have been, and then do that.
And if they change their mind, to give them an option to change it.
And so in that sense, I think what Terrell said was actually pretty insightful about the idea.
What we really want as consumers is we kind of want them to do the right thing.
And the hard part is we don't know how to legislate that.
We don't know how to regulate that.
So we know what we kind of want. We kind of want companies to be smart and to be respectful and to give us control when we want it and to not bother us when we don't.
And I don't know how we're going to get that. All right. So let me just jump in if I can.
Oh, yeah. I know. Please. In case the suits didn't give us away, we're a bunch of lawyers up here.
And so we like to think about things in terms of harms and risks and what the laws say.
And so what we're talking about a little bit here is one of the things that's been centrally challenging about privacy, really across the world, which is where are the harms?
What are the harms? What's our language for talking about the harms?
And what are we trying to avoid happening to people with the data that we're collecting and using?
And some of the harms are pretty clear, right?
There's some harms associated that are economic when data is lost or mishandled in certain ways.
There's some harms that are a little bit harder to understand, that are more like intrusion upon seclusion: turning a camera on in someone's bedroom when they didn't know that was going to happen, or something like that.
That can be a harm. There are emotional harms, especially if data is lost or if it's handled inconsistently with expectations.
So there's some known universes of harms.
Then there's some use-based harms that the FTC has taken action on.
I'm thinking here of like revenge porn, for example, that are pretty extreme cases.
But then there's this intangible concern that I think people increasingly have, which is I don't understand what's happening to my data, and I'm a little bit worried it's being used against me in ways that I don't understand.
How do I navigate those waters? And I think this is the challenging question that we don't really have an answer to.
And I think we're going to agree on this for a second, or maybe disagree, I don't know.
I tend to think that actually trying to get to the answer of that question really requires a broader lens than privacy.
Privacy is super important, please don't misunderstand me. But it's really just one framework.
And in fact, it's too narrow a framework to think through all of the consequences of really powerful technology and uses of data on people and society.
That's why we have other laws and other frameworks. And those include things like equal opportunity laws, civil rights laws, all kinds of different laws that we actually are very familiar with in the brick and mortar world, but we don't really always import into the digital world.
And I think we're at this inflection point where we really need to start thinking through how do we import those values and red lines that we all collectively understand in the brick and mortar world into the digital world.
I do want to add that we do agree on some of that.
The last part, I think I might have lost you a little bit there. But actually for me...
Wait, how did you lose me on that? You're a law professor. The whole idea that we should import the offline laws into the online world, that could be right, it could be wrong, I'd have to know more.
So we're not sure about that. Ah, okay.
Very lawyerly answer. Yeah, exactly. Cautious. That's what lawyers are. Cautious.
But if I had one wish on this front, actually I'd abolish the word privacy from our vernacular altogether.
So in that sense, I think I agree with you, which is let's talk about all the different things we're trying to cover.
Privacy is just such a loose word that actually usually keeps us from understanding what we're talking about actually rather than advancing it.
Okay, so can I just jump in with one thing then to try to clarify what I'm saying about this weird concept I have of analog world laws applying in the digital world?
It's things like civil rights laws, right?
If what someone's concerned about is that their data's being used against them for some discriminatory purpose, not even a purpose, but having a disparate impact, right?
They don't see certain advertisements for credit opportunities or employment opportunities or housing opportunities, then we actually have a whole legal framework about how to protect groups of people and how to protect opportunities and access to opportunities.
We also have consumer protection laws like the Fair Credit Reporting Act and the Equal Credit Opportunity Act, right, that I think apply in this space, and in fact enforcers have applied in this space, but enforcers continue to really have trouble detecting conduct and knowing when to act on it.
So this actually raises some really interesting questions because it seems to me there are all these efforts going on and everything is about privacy, privacy, privacy, privacy.
And what you're saying is that we're couching things in the term privacy that maybe aren't about private data.
Maybe we're okay sharing with Facebook and then we have to figure out is it the consequences of what they're using against us or what do we care about there?
How does that fit with the connection to what's happening in the regulatory space?
So are we misguided, I guess, as we look at these efforts, the Commerce Department's efforts moving forward?
Are there valuable things that can come from that effort, and what do they look like?
Well, you think they're misguided.
I mean, so yes, circling back on more specifically the privacy world for a minute, I think we do see an enormous amount of innovation in privacy frameworks at the moment as we were talking about at the outset of this conversation.
We've got several processes happening at the federal level, we've got innovation at the state level, so there's a ton of discussion about what are the right policies and how do we get the right balance here, essentially the cost-benefit analysis.
There is some consistency, which I see as a glimmer of hope here.
I see a lot of coalescing around some of the core, I think of them as control values or consumer rights, access, correction, potentially deletion, I think an American version of that, so let's put that over here for a second.
I hope more focus on portability and interoperability will come together as well.
I think there's some acknowledgement that there needs to be a kind of central regulator at the federal level, that's very likely the Federal Trade Commission, my former agency, and then there's just a tremendous amount of potential differences as we sort through the frameworks that are currently kind of on offer.
Yeah, from my perspective, I think this goes back to what Terrell talked about with all the different types of harms that might be under this heading or rubric of privacy.
If we aren't specific about which harms we're trying to fix, then it's easy enough to do something about privacy but not actually address the harm that was the underlying motivation.
I also think that the constant semantic conflation of privacy and security is really accelerating this. A lot of times people think they're labeling a privacy initiative, but it's really a cybersecurity initiative. That may be okay in terms of the ultimate outcomes, but it does make it a little bit confusing: are we addressing privacy or not? Are we actually addressing the privacy type of harms or some other kind of thing?
So as we think about addressing harms (we are, again, lawyers up here, that's what you get, sorry), we think about enforcement mechanisms. What are the appropriate enforcement mechanisms, and who are the appropriate enforcers for whatever harms you see?
So if you're looking for redress for harms, how do we do that?
So is it a federal enforcer, is it a state enforcer, is it civil action?
You can imagine a variety of different things that you could do to redress harm.
Yeah, well I think this is gonna be a big source of the debate in Washington and at the state level.
So for my money, I would say, if we think about consistency as a goal, then we think about what we call preemption in law, which is the federal government passes a law, and then that is the law, and all the other laws are preempted by that, replaced by that.
Essentially foreclosing access to one whole branch of government is a really big thing, and I think it ought not be done lightly, ditto access to the courts for plaintiffs, so I think those are gonna be really big debates, and it's going to matter the strength of the framework in order to justify that as the sort of law of the land.
Then when we think about harms, there's a variety of different frameworks out there, there's sort of penalties for failure to comply, right, which is nice and predictable.
There's a whole robust conversation about how we value data: are there models we can look to in trying to understand the value of data that would help us be more predictive about the liability around losing that data, having that data breached, or having a privacy violation? I think that's a very challenging area, because there are several values in play: the illegal value on the dark web, the value a company might place on it, and so on. Everybody's trying to figure out what the value of their data is, and there are a bunch of different models for how to do that, so I think we're still really thinking through a lot of those questions.
Yeah, I think this is really the hardest question that is the biggest lever for us.
If we answer who the enforcement mechanism is, that'll start to tell us what laws they'll be best at enforcing. So in some sense, I almost would rather answer this question before we answer what the substantive rights are under whatever heading we call privacy. We also know that that means there are going to be turf wars among folks, so the battle won't always be about who's the best; it will be about who's most passionate about trying to get their turf defined for themselves.
I do think that a federal agency like the FTC would actually be a good place for enforcement because of its expertise in dealing with consumer-related matters.
I also think that the worst mechanism for enforcement is private rights of action, which we've seen abused in a wide variety of contexts. As everyone knows nowadays, any security breach among companies now instantly leads to lawsuits, and that's only going to accelerate under the new California law. Those lawsuits often don't really redress consumer harm, and they don't improve business practices; they're just about allocating money, and I really think that's not the right way to solve the problem.
The other thing that I think is interesting that was a little bit different than the question asked, but is I think central to the question here, do we want regulation at the federal level exclusively, or do we want federal and state regulation, or do we want what could be effectively only state-level regulation, at least on a general basis?
We already have these sectoral bases for federal law, but we might not have a general-purpose federal law, and I have very strong views about that.
I think state regulation of privacy-related matters is a disaster.
It has worked poorly to date, it's only going to get worse.
The state legislators do not have the expertise to figure out what needs to be done.
They do create variations among states that create thickets for any company trying to figure out what to do, and a lot of times we see a lot of regulatory capture at the state level that we don't see at the federal level.
I know that sounds weird, but as much as Congress is in the pockets of the lobbyists, the state legislators have fewer people pushing back on each other, and so you see weird distortions where the lobbying efforts can be co-opted by very small voices to steer the legislation.
For me, the best outcome of this question is that we have a federal law, a preempt state law, and that the FTC is the enforcement mechanism for it.
I just want to heckle here for a little bit, because one of the things I think you're taught in law school is that the states are the laboratory for innovation, and we are ...
Oh, they're the laboratory for democracy. Oh, even better, even better.
Well, we need some help on that front. We're in a world on privacy where we're really trying to figure it out, so shouldn't the states have a role?
Isn't there a way of rethinking privacy that the states should play a role in?
California may not be the best example here of getting ...
I'm sorry. You don't get to take away the bad examples, because in fact, the story of bad examples come over and over again.
I actually have a 45-minute talk on why states are terrible at manufacturing privacy law. It riffs on this idea that they try to be laboratories of experimentation, except that they don't actually run the tests as scientific experiments. What tends to happen is that once one state passes a law, we have what I call regulatory cascades: other states copy that law before we get the results of the test.
And often, the state experiments taint each other because so many companies are across borders that it's impossible for a state's experiment to be corralled within state without tainting the experience of other states.
So I'm all for the idea of laboratories of experimentation.
When it comes to states and privacy, we've had horrible results.
So I'm heckling here in part so Terrell doesn't have to, because I think she also has very strong views on the state role in this space.
I mean, I guess I do. I think you're making some valid criticisms, I will say that, but I suppose I would come down on the side of innovation happening at all different levels of government, and then I think what the job is, hopefully at the federal level, is to reconcile some of that.
It is going to be a very, very big debate, I think, and whether a framework is strong enough to justify preempting state-level protections that are stronger is a real policy decision that people are going to need to wrestle with.
So I'm going to ask one more question, then I'm going to turn to audience questions.
My question actually goes back to the federal level.
So if we talk about the FTC, if we forget my heckling for a minute and assume we're going to have federal regulation and not state regulation, does the FTC actually have the capacity to do it?
So as you ask the question of who is the appropriate regulator, the FTC is not that big an agency.
Yeah, well, it's my former agency, and I can tell you from experience it's terrific, very strong, and no, it doesn't have adequate resources to do this job.
So what do I mean by that? First of all, it has huge gaps in its authority.
It doesn't have jurisdiction over common carriers and non-profits, so one of the things I think we want here is a consistent system from edge into pipes, et cetera, right?
We want everybody, whether they're analog world holders of data or digital edge leaders to be sort of following the same rules.
So we have to fix the gaps in the authority. I think we have to adequately resource the agency, and one of the things the agency has been doing, which is terrific, is bringing more technologists into its work, but there are not enough of them in that work.
So I think it really needs to stand up almost a bureau of technology to really bring the technical expertise in that it needs to help make the right decisions, and if it's granted rule-making authority, to write the right kind of rules.
Great. So I'm going to open it up to audience questions. Oh, we've got a whole bunch.
Okay, starting right there. So we've discussed privacy in the context of consumers and business, but what about the more pervasive government-and-citizen kind, with the spying that the NSA does, things like that?
How does that mix into all these right-to-privacy ideas? Well, Alissa, that's your world, my friend.
No, I'm turning it right back to you. So I'll take that bait.
So this is actually one of my frustrations about the privacy discussions, because we do get all wrapped up in the fact that Google knows everything about us, or Facebook knows everything, or Cloudflare knows everything about us.
But honestly, we lose sight of the fact that the government is the enemy, and every time we spend our time and energy worrying about the private sector to the exclusion of thinking about how the government has the capacity to abuse us, and is routinely abusing the data it has about its citizens, I feel like we've kind of missed the forest for the trees.
We weren't really talking about that here, because I don't have any conception of how the government is going to put handcuffs on itself, but honestly, we as citizens should be demanding that.
We should be grabbing the pitchforks and telling the government that they need to respect us, and I don't feel like they do that.
All right. I'm all for the government respecting people and their privacy.
Fortunately, I worked on the consumer protection side of the government, not the national security side.
I've also been a proponent of, like, strong encryption and no backdoors and a whole bunch of those things as well.
Here's what I will say about privacy policy and the need for an approach that we can really point to in the U.S.
that's strong, because when I talked about privacy overcompensation in major markets and resulting in data localization and bad innovation policy from my perspective, one of the things that can happen is it gives citizens in those markets a false sense of their privacy rights and controls vis-a-vis their governments.
We see a lot of that privacy policy happening in markets where people also are heavily surveilled by their government, and I think it lulls people into a false sense of what their real privacy is.
Okay. Over there. Do you think the government actually has the capacity to technologically keep up?
I mean, because we as the innovators here move so fast.
I mean, you're asking that question as a fighter plane flies over our heads.
No, but I mean, I'm just curious whether, you know, regulators aside, I mean, the thought of politicians understanding the nuances of Google's collection of search history, for example, seems pretty far-fetched.
And I'm just wondering to what extent you think that government can actually address this in a way that gets ahead of the technology innovation, and to what extent is it for companies like Cloudflare, with their wonderful 1.1.1.1 DNS that gives a tiny bit of our privacy back, to actually invent and distribute the technologies which give consumers some protection?
All right. So I think technology is always going to outpace the law, and it's always going to outpace the government, and there's no exception.
What's happening is sort of a Moore's law in that respect, which is that it's exponentially outpacing at this point.
So one of the questions is, how do we collectively get ahead of some of the risks that have other consequences, not just legal consequences, but reputational consequences that are really harmful to business as well?
So I think it's going to require a collective approach. I think one of the things that's absolutely central in policymaking is that the government remain as tech neutral as possible when it's thinking about regulation, because it has to acknowledge that it will always be lagging very dynamic evolution of tech and those markets.
And then I think the government needs to also be really thinking about competition, because competition can be a huge factor in generating innovation, but also in keeping markets functioning properly.
And so it needs to think when it's making regulation, what is the competitive impact of that?
And is this a regulation that's going to introduce more competition and more innovation into the marketplace or less?
And in privacy, there's a real tension there, because if we're locking down the data and throwing up walls and creating walled gardens, and we're creating real barriers to moving the data around, I don't think we're solving that problem.
And I don't even think we're providing the functionality to people that they actually want here.
Which is perfect timing, unfortunately.
We are at time. And I'm sorry we can't take more questions, but thank you very much.
Both of you. Thank you. Thank you.