Deep Dive Episode 35 – Examining the California Consumer Privacy Act

Regulatory Transparency Project Teleforum

Listen & Download

On June 28, 2018, the California legislature enacted the California Consumer Privacy Act of 2018 (“CCPA”). This legislation follows more than two decades of debate about potential federal privacy regulation, a tumultuous year of high-profile privacy incidents, and the implementation of the GDPR in Europe. It also is the most comprehensive privacy regulation that has been adopted in the United States. The CCPA was enacted in a record-breaking 7 days, has staggering breadth, and will have national and international repercussions. On this call, we will discuss the substance of the CCPA (including recent amendments) and the process that led to its enactment, along with how it is likely to affect future privacy regulation in the United States, with Eric Goldman and Lindsey Tonsager, two experts in privacy law who have followed the CCPA closely.

Featuring:

  • Prof. Eric Goldman, Professor of Law and Co-Director of the High Tech Law Institute, Santa Clara University School of Law
  • Lindsey L. Tonsager, Partner, Covington & Burling
  • [Moderator] Justin (Gus) Hurwitz, Assistant Professor of Law and Co-Director of the Space, Cyber, and Telecom Law Program, University of Nebraska College of Law

Visit our website – RegProject.org – to learn more, view all of our content, and connect with us on social media.

Event Transcript

[Music and Narration]

 

Operator:  This is Free Lunch, the podcast of The Federalist Society's Regulatory Transparency Project. All expressions of opinion on this podcast are those of the speakers.

 

Devon Westhill:  Good afternoon, everyone, and welcome to another episode of The Federalist Society's Free Lunch podcast call for the Regulatory Transparency Project. Join our conversation about where government regulation might be improved by visiting the RTP website RegProject.org, R-E-G Project.org. There you can subscribe to our biweekly newsletter, and also consider following the RTP on Facebook, Twitter, and LinkedIn. My name is Devon Westhill. I'm the Director of the RTP and host of the Free Lunch podcast.

 

      In this podcast episode, we discuss the new California Consumer Privacy Act of 2018, which has the distinction of being the most comprehensive privacy regulation ever adopted in the United States. Our guests today will walk us through the who, what, when, where, and how of the wide-reaching CCPA, and also share their thoughts, perhaps, on its recent amendments.

 

      I'm pleased to welcome our moderator today, Gus Hurwitz. Gus is Assistant Professor of Law at the University of Nebraska College of Law. He's also the Co-Director of the law college's Space, Cyber, and Telecom Law Program. Gus is an expert in telecommunications law and technology, including data and cybersecurity, and was recognized as a Cybersecurity and Data Privacy Trailblazer by the National Law Journal. Gus double dips on the RTP by serving on both our Cyber & Privacy and Emerging Technology working groups. More on those at RegProject.org.

 

      Our two expert CCPA discussants today are Lindsey L. Tonsager and Eric Goldman. Lindsey is a Partner at Covington & Burling in San Francisco. She helps national and multinational clients in a broad range of industries anticipate and effectively evaluate legal and reputational risks under federal and state data privacy and communications laws. She co-chairs the firm's artificial intelligence initiative as well. In addition to assisting clients, she engages strategically with the Federal Trade Commission, the FCC, Congress, and other federal and state regulators on a proactive basis. Lindsey has experience helping clients respond to informal investigations and enforcement actions, including by self-regulatory bodies such as the Digital Advertising Alliance and the Children's Advertising Review Unit. Lindsey also helps clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for the Internet of Things [inaudible 3.05], behavioral advertising, endorsements and testimonials, advertising in social media, the collection of personal information from children and students online, email marketing, telecommunications, and new technologies.

 

      Eric is Professor of Law at Santa Clara University School of Law, where he is also Director of the school's High Tech Law Institute. His research and teaching focus on internet law, intellectual property, and marketing law. Before becoming a full-time law professor, Eric practiced law in the Silicon Valley for eight years, first as a technology transactions attorney at Cooley Godward, LLP, and then as General Counsel of Epinions.com—an internet startup company. Prior to Santa Clara, he was an Assistant Professor at Marquette University Law School in Milwaukee, Wisconsin. He's also taught as an Adjunct Professor at UC Berkeley Law School and the University of San Francisco Law School. Eric blogs on internet law matters at the Technology and Marketing Law Blog and the Tertium Quid Blog at Forbes.

 

      Okay, in just a minute I'll turn the floor over to Gus. Before I do, I'd like to remind everyone on the call that The Federalist Society takes no position on particular legal or public policy initiatives, and therefore all expressions of opinion on Free Lunch Podcast are those of our featured speakers. Also, as usual our speakers will take questions after their remarks, so please be prepared with any questions that you might have prior to the start of our question and answer period.

 

      Now, Gus, Lindsey, and Eric, thank you very much for joining us today as our Free Lunch Podcast guests. Gus, the mic is yours, sir.

 

Justin (Gus) Hurwitz:  Great. Thank you, Devon, and thank you, Lindsey and Eric. It's really great to have both of you joining us for this call. And thanks to everyone in the audience for joining us. This is a big day of discussions and hearings in the D.C. area, and we are glad that Eric and Lindsey are joining us from California today along with Sheryl Sandberg and Jack Dorsey testifying before Congress. So it's an exciting time with all of the tech community issues before us.

 

      The CCPA is a really remarkable piece of legislation. We've been discussing privacy legislation and regulation in the United States for the past 20 years or more. In the United States, we've not had a comprehensive approach to privacy, instead relying on piecemeal and sector-by-sector legislation and regulation to address privacy concerns. And this for many people has been far from satisfying. This includes both individual actors and regulators in the United States, and also our friends around the world. Over the last year, we've seen a number of high-profile incidents from the Equifax data breach to the Facebook Cambridge Analytica kerfuffle, let's call it, in the United States. And in Europe, we've seen the enactment of the GDPR and the GDPR going into force—that's the General Data Protection Regulation in Europe. The California legislation is perhaps domestically most equivalent to the GDPR, so we will get into a little bit about how it relates to both domestic efforts and international efforts to address privacy issues.

 

      This call I think will be a little bit different than some of our teleforum calls because we're going to start with just some background on what the law actually does. This is very recent legislation that has come together very quickly, so we want to just put out there a concise understanding of this is what the California legislation is and does, including up-to-the-minute updates from amendments that were just adopted in the last couple of days. Once we have just that background, we'll get into some discussion about what's good, what's bad about the law, what lessons can we learn from it, and how it's likely to shape things moving forward.

 

      So with that, I'm going to hand things off to Lindsey for some introduction about the law, and then we'll hand things over to Eric for his thoughts on the law as it's evolved and any problems or concerns he may have with it. Lindsey, take it away.

 

Lindsey L. Tonsager:  Great. Thanks, guys. So in case you haven't been living and breathing these issues the last several months, this all started with a ballot initiative in California—because us Californians love our ballot initiatives—that was led by a California real estate mogul. Then, in order to keep the initiative from going on the ballot in November, that turned into an effort at the California Legislature to enact an actual law. And that was adopted in late June. As Gus mentioned, we have had some late-breaking amendments to the CCPA. So last Friday, the legislature enacted several amendments that are awaiting signature by the governor, which seems likely at this point. And we'll get into some of those specific amendments later on in the Q&A.

 

      But to summarize what the law generally requires, all companies doing business in California—so not just tech companies—will need to basically take three major steps, among others. First, it imposes a right to request access to data. So any Californian can request copies of the specific pieces of personal information that the business has collected, along with a variety of other categories of information. Second, the companies have to comply with requests to delete the person's data. Any Californian can request that their personal information be deleted, subject to a number of important exceptions. And then, third, there are limits on the sale of personal data. So if a business sells your data, a consumer can request that the business stop selling that personal information. And then if you're dealing with a minor, the business has to get opt-in consent from that minor, or if the child is under 13, they'll need to get parental consent.

 

      The law perhaps most controversially creates a new private right of action for data breaches. But it's only a subset of data breaches that are going to be covered under the law. They have to involve unencrypted or unredacted personal information, the breach has to be the result of the business failing to maintain reasonable security measures, and the business must have failed to cure the violation within 30 days.

 

      One of the big changes made by the amendments is that the law will actually take effect immediately rather than having a delayed effective date of January 1, 2020. So the law will take effect immediately, but the attorney general can't enforce any of the provisions until six months after the attorney general enacts implementing regulations or July 1, 2020, whichever is earlier. So with that change, businesses might have up to an additional six months to come into compliance.

 

Justin (Gus) Hurwitz:  Thank you, Lindsey. A couple of quick questions to make sure we have the details on the table here. What is the personal information that's covered by the law?

 

Lindsey L. Tonsager:  Yeah, that's been another very heavily reported-on aspect of this bill. So personal information is defined incredibly broadly to include not just things we've historically thought of as personal information, like your name, your credit card information, your email address, but also inferences about your online activities and a very broad definition of biometric information. It's pretty much anything that's related to or that identifies the individual or, interestingly, a household. This is the first law I've seen that expands the definition of personal information to a household rather than just an individual.

 

Justin (Gus) Hurwitz:  And with the compliance with data deletion requests, how broad is the data deletion mandate? Does it apply to backup copies of information for instance?

 

Lindsey L. Tonsager:  Yeah, I think there's still some ambiguity about backups in particular. I think what's most interesting about the deletion right is that it would apply to anything that's covered as personal information. But there are a number of really important exceptions. So, for example, if the information is needed to provide the service to the person, then the deletion right doesn't apply.

 

Justin (Gus) Hurwitz:  Okay. Great. And for anyone who wants to get into more of the weeds and details about the law, both Lindsey and Eric have written substantially and extensively on the law, and we have links to some of their materials on the RTP website for this teleforum call.

 

      Eric, what are your thoughts about the law?

 

Prof. Eric Goldman:  None good. So the question is all the different problems that it creates. Let's start with the most high-level overview. This is a bad law, and it was developed through a bad process. And maybe in fact there's a linkage between the two; you would expect a bad process to produce a bad law. I think that in my circles there are a lot of people who start with the premise that privacy protection is a good thing: we need protection for our private information; more protection must be better; this law, whatever imperfections it might have, still must be a good thing because it's a move in the right direction. And for anyone who feels that way, like it must be a good law because it's enhancing privacy, I encourage them to really take a look at the details and understand exactly what it does and how it does it.

 

      One of the biggest criticisms of the law from the privacy advocates is that it doesn't actually start with the premise that businesses must respect consumer privacy. It requires some disclosures of information. It allows consumers to opt out of data sales and some of the things that Lindsey described. But the burden's still on the consumers in many cases to protect their own interests. So if you think that more privacy is good, this law only partially advances that goal.

 

      And from my perspective, the costs associated with advancing that goal are potentially quite substantial. This is a law that's going to impact the California economy, and I'm going to venture to say many other aspects of the national and global economy, to advance this only so-so privacy interest.

 

      So I guess that's my topline takeaway for all the people who came to this call thinking, well, it's got the word privacy in the law; it must be a good thing. I actually don’t think you're likely to get what you bargained for, and you're going to pay for it. And it might not actually be the deal that you want.

 

      I just want to supplement Lindsey's excellent summary of the law with a couple of other things that she didn't mention. One is that the law has a very funky anti-price discrimination clause, and I say funky because I really don't understand how it will apply, and I don't know how businesses are going to respond to it. The goal was to say that a business can't distinguish among consumers in pricing and other services based on their exercise of their rights under the law. So if you go and you say, "I want to be a normal customer for you, but please don't sell my data," you're not supposed to be charged extra for that. There are a bunch of exceptions to that, and we don't know how those exceptions play out.

 

      But basically every price discrimination scheme turns on something that is likely to be defined as personal information under the very expansive definition that Lindsey mentioned. So I don't really know what forms of price discrimination are still going to be legal after this law in California, and I don't really know what that's going to do to businesses. So I'm just going to circle that as a big question mark here, which might have nothing to do with privacy—might have a lot to do with the way that businesses behave.

 

      The other thing I'd just mention is that the law delegates almost carte blanche rulemaking authority to the attorney general of California. Basically, whatever they want to make a rule on, they're free to do so. And so a lot of the battleground is now going to shift from the legislature to the AG's Office, and we don't really know what the law's going to say until the AG actually finishes its rulemaking procedures. Now, interestingly, in the most recent amendments the law's going to go into effect even if the AG hasn't completed its rulemaking procedures. So it is entirely possible that the text that we see today will be the law without any further clarification from the AG, which would I think really bum a lot of people out. But until we know what the AG does with this law, it's a little bit hard to guess exactly what the real rubber-meets-the-road decision-making implications of this law will be.

 

      The last thing I'm going to mention is, and I'm sure you're going to get to this, Gus, I just want to flag it now, the question about who's covered by this law. There are two different dimensions on which that's an interesting and complicated question. There's the geographic dimension. When are businesses outside of California regulated by this law? And can they even be regulated constitutionally?

 

      But for a moment, let's just ignore that and talk about the fact that the law covers businesses both online and off. And it was designed to cover medium-sized businesses and larger. So the main threshold is supposed to be $25 million in revenue or above. But there are other tests for what businesses are covered, and the one that I find most interesting is the one that says "if you are selling or receiving information from 50,000 or more consumers, households, or devices," and that can cover a lot of much smaller businesses, including possibly pretty small websites. So this law reaches far beyond the Facebooks and Googles of the world, who are the people that everyone wants to shake their fist at, and probably is going to impact much smaller businesses—the local pizzeria, the local yogurt shop, the small website—and treat them as functionally equivalent to Facebook or Google.

 

Justin (Gus) Hurwitz:  Yeah, that last point really bears emphasis: the law doesn't only apply to online entities. It applies more broadly to entities that are collecting this information, so the pizzeria doesn't need to be an online Facebook of pizza. If it's doing business in California, it could be subject to the law, which is, I think, both a fascinating and a troubling issue. We talk a lot in this industry about technology-neutral regulation, and while this is technology neutral, is that a good approach to have taken here?

 

      So I want to start here: Eric emphasizes that the process by which this law was put into place has been really muddled and rushed, and we see through Lindsey's comments as well, especially most recently with these amendments and the ballot initiative in the background here, that there has been some rush on the part of the legislature. Can we talk a bit about why this law now? Why has the California legislature decided that it needs to do something? Is it responding to exigent substantive concerns? Is it responding to external pressures or concerns about other legislative authorities enacting legislation? And I'll do the law professor thing and call on someone to respond to the question. Lindsey, if we could start with you?

 

Lindsey L. Tonsager:  Yeah, sure. So as we mentioned, this was basically an attempt to address the ballot initiative that was pending in California. I know Eric's feelings on the CCPA, but I think he'd agree with me that the ballot initiative was even worse. And so if the legislature hadn't acted by the end of June, then the initiative would've gone on the November ballot. That June deadline was the last date that the sponsor of the ballot initiative could withdraw it.

 

Justin (Gus) Hurwitz:  Eric, do you want to add anything?

 

Prof. Eric Goldman:  Yeah, I do. So in California we have a citizen initiative process, where in theory concerned citizens can gather together to change the law, bypass the legislature with whatever gridlock or distorted incentives the legislature might have, and take control over their own government. And from a citizen activist standpoint, that sounds great, doesn't it? Wouldn't it be wonderful if the citizens could run the government themselves? In practice, this has become an even faster track for special interests to pursue rent seeking or their pet agendas.

 

      So in this particular case, the real estate developer at issue spent $3 million to qualify the law for the ballot. And once it qualified for the ballot with a sufficient number of signatures, it created this very high-stakes poker game—either the California legislature acted within a week or the issue would be put to the public, at which point the rumors were that the tech industry was going to spend $100 million to oppose it; there likely would be some money spent to support it, and nobody knew what was going to happen then.

 

      But the nuclear consequence was that if the initiative passed, it would be extremely difficult to change. It would require higher thresholds for legislative approval in order to get an amendment to the initiative-passed law. So effectively, given the unlikelihood of the legislature being able to clear those much higher thresholds, the idea was either the legislature abdicates responsibility for this, lets this ballot initiative go to the voters with unknown consequences and with no chance of fixing its very obvious problems, or the legislature retains control and passes a law that nobody in the legislature really liked as the least worst option. And so one of the reporters covering the discussions while the bill was pending in the legislature said that the words "Hobson's choice" came up every other word. Basically, everyone knew that they didn't like either option, but if nobody took action, the consequences could be potentially much worse.

 

      So this is the least worst law that we could've gotten in the circumstances we were in. But that exposes a bunch of problems with both the initiative process and maybe the legislative process along the way.

 

Justin (Gus) Hurwitz:  So we've been talking about the need for privacy regulation in the United States for 20-plus years. Does the law get anything right? What real concerns is the law responding to, and what lessons might we learn from it or take from it thinking about privacy issues in the United States? Eric, I'll call on you.

 

Prof. Eric Goldman:  So Gus had warned us, Lindsey and me, that he's going to ask us to say something nice about the law. And I feel terrible, Gus. I'm going to disappoint you here. It is so hard to say anything good about this law. The law is structurally defective because of this cost-benefit problem that I mentioned earlier. And then if you go through section by section or provision by provision, there are very few things that are actually well-constructed, well-thought through, or good policy.

 

      So here's what I can say that is good about the law: it could’ve been worse, as Lindsey pointed out; that if the initiative had passed with the terms that it had, that would've been a worse outcome. So this is the least worst of the two options that we had.

 

      But I'll point out I guess two things that the law does for us. One, it reminds us of the value of the legislative process. There's a bunch of things that actually go right in the legislative process that didn't happen in the initiative process. Among other things, the initiative process doesn't have the kind of multi-stakeholder feedback mechanism that the legislature has. So this law covers the entire economy and creates thousands of what I'll call "edge cases"—situations where the law wasn't really contemplated to apply to that particular niche or that particular industry, and it doesn't really make sense there. And the initiative didn't have any chance for those people to come and speak about how this law was going to end up hurting them or going to end up leading to counterintuitive results. So from that perspective, the law reminds us that the legislative process has some value in having the opportunity to sweep in all these voices from a wide range of constituencies. And the legislature can amend its mistakes, whereas with the initiative process, it's much harder to fix those.

     

      And I guess the last thing I'll say that's good about the law, but it's again a backhanded compliment at best, is that as Gus pointed out, we've had a sectoral approach to privacy. So we've had financial privacy law, health privacy law, and a law protecting kids. And that sectoral approach to privacy can seem really goofy. But it actually has the benefit of customizing each privacy law for the particular unique conditions of that industry and of that particular set of privacy concerns. And this law exposes -- when you try and do a one-size-fits-all law across the entire California economy, it creates thousands of "edge cases," many of which don’t make any sense. And it reminds us, then, that when we try to take that one-size-fits-all approach, actually a lot of things break in the process.

 

Lindsey L. Tonsager:  I'll chime in and answer your question. I actually do think it's helpful in the sense that it started a conversation at a time when I think a conversation definitely needs to be had around what data privacy means in the United States in the current digital environment. And one nice thing coming out of the amendments from Friday is that both Senator Dodd and Senator Hertzberg, who are two influential members of the California legislature when it comes to this act, recognized on the floor that this was the first step at a cleanup but that there's more work to do. So although I agree that there are a number of changes that hopefully still will be made to the CCPA, I do think it's a good step toward coming up with a framework to address these issues going forward.

 

      I also think, though, that the question of "who's next" is looming. So California's always been a leader when it comes to laws in general, but privacy laws in particular. It was the first state to enact a comprehensive Online Privacy Protection Act for online websites and services to require them to post a privacy policy. Delaware and Nevada actually subsequently followed suit.

 

      So it's not surprising that California was the first state to move in this space. But there could be a number of other states coming down the pipe, possibly New York, New Jersey, Colorado, and others. And as those differences start to emerge among how the different states are regulating in this area, I think that could become increasingly difficult for businesses to operate under and will further drive this conversation going forward on whether or not we should have federal privacy legislation or how we should harmonize all those differences.

 

Justin (Gus) Hurwitz:  Yeah, I want to turn to the question of federal legislation in a moment. But first, a follow-up on who goes next at the state level. Lindsey, do you think it could be possible that California, in fact, will go next on the state level, by which I mean is it possible that this legislation is almost placeholder legislation that will be substantially revised in the next legislative session, given the delay in implementation of the law? Or do you think that this is the law that we're going to have in California moving forward?

 

Lindsey L. Tonsager:  Yeah, I think especially given the Senator Dodd and Senator Hertzberg statements, I do suspect that there will be a further conversation in 2019 about additional amendments that are necessary to kind of rationalize this law and bring it into a form that's a little bit more workable to actually comply with. I also think, as Eric mentioned, there's going to be a whole process at the attorney general's office to adopt implementing regulations. So there will be a lot of changes, clarifications, and the like that will come out of that attorney general process.

 

Justin (Gus) Hurwitz:  On the implementation, and I'd like to actually focus more on the enforcement side of things, what consequences are there in the law as it's written for companies that fail to comply with the requirements to reply to data access requests and data deletion requests in particular? And actually, let me flesh that out a little bit. Does the law, as written, envision a great deal of prosecutorial discretion and enforcement discretion on how the law is actually applied? Or does it envision more carte blanche approval of the requirements?

 

Prof. Eric Goldman:  I'm going to defer to Lindsey on that as the first step.

 

Lindsey L. Tonsager:  Okay. So there's two different enforcement mechanisms under the law. There's a private right of action, which as I mentioned only applies to certain data-breach scenarios. So there's not prosecutorial discretion in that sense; you know, the plaintiffs' bar will bring actions when they think that standard has been met. On the attorney general's side, the attorney general has broad enforcement authority to bring actions for any violation of the law. And, look, as with anything, attorney general's offices have resource constraints. The attorney general actually sent a letter to the legislature indicating that there's currently no budget anticipated for implementation of this law, and there might need to be additional money and personnel resources provided for this law to be effectively enforced.

 

      And we've seen with other California laws, including the California Online Privacy Protection Act, that there can be challenges in bringing enforcement actions. There was only one enforcement action that I'm aware of -- against an airline -- and that action ended up being preempted in court under the federal Airline Deregulation Act. So I don't expect that this law takes effect and all of a sudden there's a list a mile long of companies being pursued under the statute. That said, companies will need to take their obligations seriously under the law and implement a number of measures to comply with the data access, data deletion, and other substantive requirements.

 

Prof. Eric Goldman:  Yeah, Gus, if I can add to that. Just a couple thoughts. First, as Lindsey had mentioned at the very beginning, there's a 30-day cure period and that also applies to things the attorney general could enforce. And I don't really know how that's going to play out for the attorney general's office because they might be all geared up to go and bring an enforcement action against someone due to political pressure, publicity, whatever, but they still have to give a 30-day cure period. In theory, if the business comes back and says we've cured it, then all that initial investigatory work is going to be mooted.

 

      So it's not clear to me how the attorney general's office is going to play that. Are they going to do a lot of work up front for the cases that are meritorious on the presumption that they're still going to have enforcement action—do a lot of work upfront with the hope that the businesses will cure in the regular time period, if that's even possible. In many cases it won't be. Or is the attorney general going to do lots of nibbles, and they're going to send requests for information to businesses saying, "We think there might be a problem here. Tell us if there is or not, and you have 30 days to cure any problems and then we're going to come in full force." So the 30-day cure period actually changes the prosecutorial discretion question in ways that I think are going to be interesting, and I don’t have a good guess about which way it's going to cut.

 

      The other thing I'll add, and Lindsey made this point clear, but let's make sure we didn't miss it: a lot will depend on how many resources the attorney general's office devotes to this question. And my read on the situation is that they're kind of trying to figure out how they're going to handle this, and it's not something where there's a whole cadre of people who are clamoring to the attorney general's office to get going. I think that, if anything, the attorney general's office is a little bit surprised by what they've been asked to do. And that might again be a reflection of the fact that this didn't go through the normal legislative process; they didn't have a chance to defend their interests as one of the main stakeholders in the multi-stakeholder conversation. So I think a lot will depend on their attitudes about this law, whether they can put together a cohort of enforcers who are really excited about it, embracing this law. Or if this is going to be viewed as the thing that you get shunted into rather than getting to go work on the more glamorous cases.

 

Justin (Gus) Hurwitz:  The --

 

Lindsey L. Tonsager:  Yeah, can I jump on top of that, actually?

 

Justin (Gus) Hurwitz:  Yes, please.

 

Lindsey L. Tonsager:  Because you raise a really good point about the 30-day cure period. And just to build on that, that idea of a 30-day cure period is not new. It actually also exists in the California Online Privacy Protection Act, and I actually think it's very helpful in the sense that for laws like the CCPA, which have very prescriptive requirements in terms of how you communicate with your customers and kind of what you have to do, that 30-day cure period can be very helpful in terms of avoiding "gotcha" kinds of actions. So the company can comply with the law but avoid kind of "gotcha" cases where there is a technical violation that doesn't actually result in any consumer harm.

 

Prof. Eric Goldman:  And that's a great point, in fact, so let's extend that. One of the things that we might expect from a state enforcement agency is that they would look for those easy-to-find "gotchas" as an entry point into doing a much broader campaign. But if the easy-to-fix "gotcha" is fixable in the 30-day cure period, that might not be the door into a broader enforcement action for this law as it would be for others.

 

      And you're right about the 30-day cure period and the other precedent in California. But, for example, the laws the FTC enforces—I can't think of any off the top of my head that have a 30-day cure period. The FTC, when it wants to come in, just comes in. It doesn't have to worry about whether or not the business is going to fix the problems after it starts.

 

Justin (Gus) Hurwitz:  Yeah, I was going to go straight to the FTC as well. The 30-day cure period is an important constraint, I think. It'll be interesting to see how effective it is at limiting potential abuses of what, as I would characterize it, amounts to prosecutorial discretion. What we've seen at the federal level with the FTC is very frequently the commission uses relatively trivial privacy or security incidents as a way to get its nose under the tent so that it can then say, "Well, okay, you've solved this one small problem. Now we're going to regulate your entire privacy and security regime and require 20 years of ongoing oversight." So it's an interesting mechanism that could have important and interesting lessons or applications at the federal level.

 

      Turning to the federal level, how is the law likely to play out at the federal level? And I'll add to the discussion at this point that the California legislature has also just voted out a state-level net neutrality bill. And there's an interesting comparison between the net neutrality legislation and the privacy legislation insofar as the net neutrality legislation is likely preempted. Or there is certainly going to be an argument, and the FCC has asserted, that it is preempted by the Federal Communications Commission's Open Internet Order adopted in the past year. We don't have a similar preemption argument that I'm aware of at the federal level for the privacy legislation. What are the dynamics of how this is likely to play out or affect discussion at the national stage?

 

Prof. Eric Goldman:  Gus, I'll chime in first if that's okay. So Lindsey talked about the possibility of this law proliferating as a, quote—and I have to put it in quotes—"model" for other states. That process is going to lead to some really terrible outcomes. We're going to take a bad law, and it's just going to get further mucked up in any normal, ordinary legislative processes that might take place in these other states. Remember, because California didn't have an ordinary process, none of the advocates had a chance to push on any version of the law. But in the other states, they will, and the result is going to be, I expect, some fair heterogeneity of the implementations across the nation if anyone tries to start with the California template, in whatever form it takes as it keeps getting amended.

 

      To me, the only logical solution is for Congress to step in and preempt all of the state laws and get the states out of the business of trying to come up with these very comprehensive solutions that ultimately are maybe beyond any individual state's scope, and certainly, if every state's doing it a little bit differently, creating this patchwork of state laws that are going to be a nightmare for businesses to try and comply with.

 

      So to me, the logical outcome is that Congress takes over this issue. And that's a crazy statement because Congress is massively dysfunctional, and this is a huge project for a Congress this dysfunctional to take on. So the two pressure points are, on the one hand, Congress is broken and there's no possible way they could get a good result on something this complicated. And the other point is if they don't step in, the results are going to be, I think, even worse. And so I think that there are actually some good reasons for Congress to move on a law that everyone will say -- or a topic that everyone will say is just beyond Congress's ability today. So believe it or not, I'm actually hoping for Congress to step in here, and I recognize how crazy that sounds.

 

Lindsey L. Tonsager:  Yeah, I agree that there will be a lot of attention shifting to the federal level. We already know that the U.S. Chamber of Commerce has published some proposals for outlining a federal framework for data privacy in the U.S. and a number of companies have been meeting with the administration and influential members of Congress to start laying the groundwork for that effort.

 

Justin (Gus) Hurwitz:  Are there any clear conflicts at this point or likely conflicts in how the California law will be implemented between current federal privacy regulations, activity at the FTC or FCC for instance? And it might be too early to have a definite answer to that question.

 

Lindsey L. Tonsager:  Yeah, I think it's too early, especially given that so much is likely to change before the law actually takes effect.

 

Prof. Eric Goldman:  I would only add that the law does try to establish some hierarchies of the different overlapping laws and tries to carve off certain topics that are regulated under other laws and defer to those other laws. But the way that the particular drafting was done was not really great. So, for example, it says "this law will step back in abeyance if there's any conflict with these other laws." And conflict is an ambiguous term. There can be implicit conflicts without them being expressed, and I don't know how a court or other decision maker will interpret these implicit conflicts.

 

      So how this law fits into the broader firmament of privacy regulation is actually, I think, a really complex, interesting question that I expect is going to be tested, effectively, on a sector-by-sector basis because the other laws are sectoral. So we're going to see friction points in the healthcare industry, and they've already tried to amend the law. Or we'll see friction points in the financial industry, and we've already seen them paying attention to that as well.

 

      So I expect that we're going to see those friction points emerge on kind of a rolling basis as each industry works through its own catharsis.

 

Lindsey L. Tonsager:  And some good news on that front is the Friday amendments did fix some of those ambiguities. So the conflict-with-other-laws language has been deleted in some sections of the statute.

 

Prof. Eric Goldman:  Right, but there's still the catchall, and the catchall I think is the problem. It reminds us that we don’t really know how this law sits and plays nicely with other laws.

 

Justin (Gus) Hurwitz:  And it seems entirely likely then, and correct me if I'm wrong about this, that we could end up in a situation where healthcare providers are subject to HIPAA and federal privacy regulations, but California-based pizzerias are subject to stricter privacy rules in the State of California.

 

Prof. Eric Goldman:  Yes. Although, I think that HIPAA has a fair amount of teeth in it. But I would frame it a little bit differently and say it's likely that we're going to end up with California-related businesses, not necessarily just California-based businesses but others that might be touched by the law even if they're not in California, that have to comply with both HIPAA and with the California law, and figuring out which piece of data is covered by the federal law or the state law or not regulated at all is going to be, I think, a high-priced legal question.

 

Justin (Gus) Hurwitz:  So good news for law students out there, this is going to continue to be a booming industry with a lot of uncertainty and a lot of employment opportunities for the foreseeable future. A good plug for Eric's program and my program at the University of Nebraska. Keep it in mind.

 

      So I understand that we are all globalists now. So I'm going to ask about the GDPR. How does the law -- first, how does it compare to the GDPR?

 

Lindsey L. Tonsager:  Yeah, I can jump in on this one. I think that there's actually been some misinformation out there that the CCPA is just extending the protection of the GDPR to Californians. And that, unfortunately, isn't true. So, for example, one significant difference I've already mentioned, the California law—if you can believe it—actually defines personal information more broadly than the GDPR, which itself has a very broad definition, including this concept of covering data that's related to a household as opposed to just an individual.

 

      The CCPA is also much more prescriptive in terms of how companies are required to communicate with consumers under the law. So one example is that in some cases companies would be required to set up a toll-free telephone number to answer some of these requests. And for a lot of companies these days, that's not the best way to communicate with their consumers. Their consumers might be communicating through push notifications or instant messaging programs or text messages. That might be a more effective way to communicate with them. And so requiring a company, especially a medium-sized business—the local pizza joint—to set up a toll-free telephone number to respond to their consumers is not necessarily all that beneficial to anyone.

 

Prof. Eric Goldman:  Gus, if I can add to Lindsey's remarks, which I agree with 100 percent. I think that there was a lot of skepticism that the GDPR was the right model for the U.S. here, at least maybe not among the privacy advocates but among others beyond the privacy advocates, saying, "That just seems too European, the way that Europe solved its privacy problems. That's not the right approach for us." And so the CCPA, trying to say something nice about it, takes a different approach. And so for those of us who were skeptical about the GDPR as the solution for the U.S., you might welcome the idea that the CCPA's based on a different structural framework than the GDPR.

     

      But I think we got the worst of both worlds in that circumstance. And obviously, we still have many U.S. companies that are complying with the GDPR, and now those who are governed by this law, either based in California or otherwise subject to its reach, also have to comply with this other framework. And the fact that they're not overlapping and that they cover different things actually increases substantially the compliance costs. So it would've been ideal, from the standpoint of a business that had already complied with the GDPR, if it could just roll out the same solution for the CCPA. But that won't work. They're going to have to basically take a whole new cut at compliance to look at the specific parameters of the CCPA and then make incremental variations to reflect how it actually differs from the GDPR.

 

      So I think we got the worst of both worlds on this. We got an alternative to the GDPR, but the consequence of not standardizing is that now we have to do duplicative work, and likely we won't get any value as consumers from that duplicative work.

 

Justin (Gus) Hurwitz:  And certainly it also has the effect of increasing entry barriers for U.S. firms that are compliant with one regime but have not entered the European market, and that now need to become compliant with the other in order to expand into global markets. So, as we frequently see with these regulations, they create entry barriers.

 

Prof. Eric Goldman:  Yeah, that's a great point. So, right, the entry barriers go in both directions. If you had to comply with the GDPR, you've already overcome that, but now you've got to pay extra just to get into the California market. Or if you're already in the California market but then you want to expand into Europe, you've got to overcome that extra cost in order to do that. So like anything, they do act as a barrier to expanding into new markets.

 

Justin (Gus) Hurwitz:  So the GDPR has a number of core principles in it, and the California legislation isn't written in the same way. And I'm wondering about a couple of the specific core GDPR principles: requiring firms to collect data only for specific purposes; imposing data minimization requirements, so that they don't collect or use data that they don't necessarily need; and also imposing liability on data controllers—that is, the company that collects the data and has the customer relationship—for any problems that occur with data processors, the subcontractors who work with the data. Does the California legislation have any similar concepts in it? If either of you can answer this, I expect it would be Lindsey.

 

Lindsey L. Tonsager:  Sure. I'll chime in. So for many of those provisions, the CCPA doesn't go into them specifically. There are some interesting provisions, though, with respect to kind of controllers and processors—the CCPA doesn't use that terminology; that's a very European construct—but there are some provisions where businesses are kind of expected to make sure that the protections flow down through their service providers. So, for example, deletion requests. You might need to pass those on to the other entities that you partner with in terms of sharing or selling the personal information.

     

      So, like we said before, there is still quite a bit of daylight between the GDPR and the CCPA.

 

Prof. Eric Goldman:  Yeah, and I'll just add to that, actually. So the law does describe a business, which is like a controller, and a service provider, which is like a processor, though they aren't identical. And it actually has a third category, what it calls a third party, which is really meant to describe a data buyer. So someone who's acquiring data from the business. And yet, the three definitions put together—business, service provider, and third party—I think don't describe the universe of all the people who might be touching the data that's governed by this law. So it actually is one of the situations where they do a pie chart, but then they didn't define all the wedges of the pie, or the definitions might be overlapping but leave some gaps nonetheless. And so actually, again, I think we get the worst of all worlds. We don't get the GDPR solutions, which were not the way we think about things, but at least I think people figured out how to fit everyone into a proper bucket. Now we have to do the whole bucketing approach again with definitions that may actually be imperfect.

 

Justin (Gus) Hurwitz:  Okay, Devon, I want to hand things back to you for a moment. I have one or two more questions, but we're coming up on the 50-minute mark, I see, so I want to make sure we open the floor for any questions.

 

Devon Westhill:  Yeah, that makes sense. Thanks, Gus. Let's see if the audience has any questions. In just a moment, everyone's going to hear a prompt indicating that the floor mode's been turned on. After that, to request the floor, you can enter star and then pound on your telephone keypad.

 

      So the floor mode's on now. If you'd like to request the floor, enter star and then pound on your telephone keypad. When we get to your request, you're going to hear a prompt and then you can ask your question. We're going to answer all of the questions in the order in which they're received. Again, if you'd like to ask a question, enter star and then pound on your telephone keypad. I see a question here. Why don’t we go to the first question?

 

Caller 1:  Hi. Good morning. I enjoyed the presentation. What is the argument, if any, for preemption of the California law by the Dormant Commerce Clause? It seems to me, when we've got servers or other businesses outside of California, an impediment to interstate commerce that for California you have special rules based on whether the electrons flow into California. It may be impracticable to decide I just don't want to do business in California. And I heard the point that there's no express preemption argument that we can think of, but what about the Dormant Commerce Clause? Does it present an avenue of attack? Thank you.

 

Prof. Eric Goldman:  Yeah, I'm going to take the first cut at this if that's okay. And it's a question that's been on my mind and I think on the minds of some of the other opponents of the CCPA. And there are a number of points of attack under Dormant Commerce Clause. One would be if this law does propagate to other states in an inconsistent way, that's going to make the Dormant Commerce Clause issue much sharper. It's going to create an easier path to resolution. But let's assume that doesn't happen and this is a one-off that doesn't conflict with any other state's laws. The law has a number of ways in which it reaches outside of California. Let me give you one example that's expressly addressed in the statute.

 

      A California resident who's travelling to another state is still treated as a California resident if the business can figure it out. And it doesn't really resolve how the business is going to know that they're now dealing with a California resident who's in another state on travel. So to the extent that the business is in -- let's pick a state: Georgia -- and a California resident happens to swing by and check out their business on vacation, the law seems to try and reach that. And that should be squarely outside of California's purview.

 

      Lindsey mentioned that the law applies to things like household and households might not involve only California residents. There could be multiple residents or multiple people located in a household that come from different states or are located in a different state, but include one person who is a California resident. So by covering households, it actually picks up a larger universe of regulated entities that might be outside of California.

 

      To me, I think the best argument on Dormant Commerce Clause grounds is that the law regulates businesses that have $25 million in annual revenue or that have these 50,000 consumers' information. But it doesn't make clear if that means all $25 million must come from California residents or all 50,000 must be California residents. If that isn't the case, and the law doesn't make it clear either way, then it could mean you make $25 million of revenue, of which $24 million plus comes from Georgia residents and $1 comes from a California resident. Per the express terms of the law, that seems like it would be governed, but it's clearly not permissible that you could have $1 raised from a California resident and the rest of your business from somewhere else and still be regulated by California law.

 

      And so not only is there a Dormant Commerce Clause problem with that, there might also be a personal jurisdiction limit as well on the ability of California to actually reach out and try and govern these non-California based entities. So there's a lot of really interesting geographic questions embedded in the law. I've only touched on a few of the ones that are the ones that jump out at me. Whether those translate into legitimate constitutional challenge to the law is something I think bright minds are thinking about.

 

Lindsey L. Tonsager:  Yeah, kind of unrelated to the Dormant Commerce Clause question, but related to the question about constitutional validity, the amendments added some additional language to help protect journalists' First Amendment rights. But as drafted, they're really focused on journalism. And of course other people and entities have First Amendment rights as well. So I think it's an interesting question whether or not the law goes far enough to protect First Amendment rights.

 

Justin (Gus) Hurwitz:  And I'd just add that I thank the California legislature for having written half of my cyber law exam for the semester.

 

Devon Westhill:  They've made it easy for you, Gus.

 

Justin (Gus) Hurwitz:  Yes.

 

Devon Westhill:  Thank you for the question, and thank you for the responses. Gus, we're about four minutes out from a hard stop that we have at 12 o'clock Eastern Time. I don't see other questions queued up, and I think it probably does make sense to start wrapping up at this point, although I think we could go on for quite some time, and you mentioned that you have some other questions. But maybe we should leave those for another time and just get some final remarks from Lindsey and Eric, and also from you, Gus, if you have any.

 

Justin (Gus) Hurwitz:  Yes, absolutely. And I'll just turn things over to Lindsey and Eric for their final remarks by queuing up what would've been my last question. Will this law, I hate to use the term, quote, "break the internet?" But feel free to respond to that or not as you offer us some closing thoughts. And let's start with Eric.

 

Prof. Eric Goldman:  Yeah, so the law does contribute to a breakage of the internet, just like the GDPR did. It creates these geographic-based privacy distinctions that force businesses to choose, if the laws work as stated, whether they're going to deal with people in particular geographies or not. And we're seeing a widespread fragmentation of the internet into smaller geographic-based internets that are all looking a little bit different because local geographic-based laws are causing businesses to handle each geography's population differently. So in that sense, the law will contribute to breaking the internet.

 

      But the law also breaks the California economy, and that's, I think, one of the most important takeaways from this law: this is not an anti-Google or anti-Facebook law. Google and Facebook are fine with this law because they can afford to comply. This law breaks the economic models of so many smaller businesses who are going to spend a lot of money on compliance for not a lot of value for consumers. And so in that respect, I think this law does a lot of violence to economic assumptions that we all have made. I call this law the privacy bomb because it's like dropping a bomb on the California economy. Or in the blog post I'm working on now, I just show the dumpster fire meme because I think that's the best way to think about this law.

 

Justin (Gus) Hurwitz:  Okay, Lindsey. Take it away.

 

Lindsey L. Tonsager:  Yeah, I'll just add to that. I think I agree. It's a fair question, but it's important to remember that the law isn't just about the internet companies. It is about the brick-and-mortar retailers and businesses as well. And although a lot of companies have already set up procedures to respond to data access requests, data deletion requests, and the like, there will be a lot of companies in California, particularly those local, medium-sized companies whose business isn't selling data, that will now have to set up entirely new procedures to figure out what exact information they've been storing over the years, where they're storing it, how they can technically delete it from their systems, and how to ingest all of these new customer requests. And that's going to be a very time-intensive and expensive process for them.

 

Justin (Gus) Hurwitz:  Okay. And I will conclude by just thanking both Eric and Lindsey. This has been a wonderful and fascinating conversation. I learned a lot. I am certain it's not going to be the last of these conversations. We've been talking about privacy issues in the United States for 20-plus years in the internet era, and it feels like this is, in many ways, a turning or starting point to a new phase of those conversations. So I look forward to seeing what happens next. Thank you, and Devon, back to you.

 

Devon Westhill:  Well, I second your thanks to Eric and Lindsey, Gus. But thank you also, Gus, for running a wonderful call here. I also want to thank our audience for joining us. But with that, on behalf of The Federalist Society's Regulatory Transparency Project, we'll say so long until next time.

 

Operator:  On behalf of The Federalist Society's Regulatory Transparency Project, thanks for tuning in to Free Lunch. As always, you can subscribe on iTunes and Google Play to get new episodes of Free Lunch when they're published. Also, visit our website at RegProject.org. That's R-E-G project.org. There, we regularly upload content in addition to our podcasts, such as short videos and papers. And you can join the discussion by sharing your story of how regulation has personally affected you. Until next time, remember, there's no such thing as a free lunch.

 

Operator:  This has been a FedSoc audio production.