Within 7 hours of the EU’s General Data Protection Regulation (GDPR) coming into effect, Austrian privacy activist Max Schrems and his non-profit None of Your Business (NOYB) lodged four complaints against Google and Facebook, seeking damages of $8.8 billion. Among its 173 recitals, the GDPR empowers litigants through enumerated rights of representation, judicial remedy, and compensation to create non-profit organizations, lodge complaints, and collect fees on behalf of users. Violations are punishable by fines of up to 4 percent of annual revenue. The EU claims that the sweeping regulation empowers consumers to control their data, but seasoned observers note that the law more likely strengthens existing players, those which can afford the $1 million+ compliance costs for new staff and software required by the GDPR. In this Teleforum, privacy and regulatory experts discuss the economic, legal, and geopolitical consequences of the new law and what it could mean for class action litigation.
Sunny Seon Kang, International Consumer Counsel, Electronic Privacy Information Center (EPIC)
Roslyn Layton, Visiting Scholar, American Enterprise Institute
Adam Thierer, Senior Research Fellow, Mercatus Center, George Mason University
Teleforum calls are open to all dues-paying members of the Federalist Society. To become a member, sign up here. As a member, you should receive email announcements of upcoming Teleforum calls which contain the conference call phone number. If you are not receiving those email announcements, please contact us at 202-822-8138.
Operator: Welcome to The Federalist Society's Practice Group Podcast. The following podcast, hosted by The Federalist Society's Telecommunications & Electronic Media Practice Group and Regulatory Transparency Project, was recorded on Monday, July 9, 2018 during a live teleforum conference call held exclusively for Federalist Society members.
Wesley Hodges: Welcome to The Federalist Society's teleforum conference call. This afternoon, our topic is the GDPR and the future of internet privacy, and it is hosted by the Telecommunications & Electronic Media Practice Group and the Regulatory Transparency Project at The Federalist Society. My name is Wesley Hodges, and I'm the Associate Director of Practice Groups at The Federalist Society.
As always, please note that all expressions of opinion are those of the experts on today's call.
Today, we are very happy to have with us a very accomplished panel of experts. Starting with our first panelist today is Roslyn Layton, who is a Visiting Scholar at the American Enterprise Institute and a Visiting Researcher at Aalborg University in Denmark. Our second panelist today is Sunny Kang, who is International Consumer Counsel at the Electronic Privacy Information Center. And our third panelist today is Adam Thierer, who is a Senior Research Fellow at the Mercatus Center at George Mason University.
After our speakers give their remarks today, we will move to an audience Q&A, so please keep in mind what questions you have, either on the subject or for the individual panelists today. Thank you all very much for speaking with us. Roslyn, I believe the floor is yours to begin.
Roslyn Layton: Thank you, Wes, and good afternoon, everyone. I'm delighted to kick off our call today. I've been a member of The Federalist Society for a number of years, and I get so much value from all of the teleforums. And so it's wonderful to be able to participate in one and contribute to the knowledge. So thank you all for joining us today. And I'm delighted to have the two additional speakers.
I would like to share with you my perspective on the GDPR, which comes from my work as an academic studying the impact of the GDPR. But also, prior to working in my particular field, I had a career in digital marketing, and, in fact, it's what brought me to live in the European Union over a decade ago. I actually worked in the field, and I have been witnessing the rollout of the various privacy regulations over time. And so, to set up our call, I want to talk about the policy from academic, legal, geopolitical, and economic aspects.
So one of the things I think is most interesting is if you take an evidence-based policy perspective. Almost a decade ago, the EU started rolling out the Cookie Law, the ePrivacy Directive, and what's interesting is that we've had a period of time to measure the impact of the policy since it went into place. Ideally, you would see people having more trust online, maybe shopping online more or using online banking. But it's interesting that with a decade of privacy regulation, the sense of trust reported by European consumers has increased only about 10 percent. As of 2017, the European Union reports only 22 percent of Europeans shop outside their own country.
And so this is not the best news, because there is a goal of making a digital single market. As you know, in the United States we have a common language and currency, and I'm sure there are barriers to selling across states, but it's a world of difference when you can create your website and everyone with the same language and currency is able to access it. And, of course, a large market. But that's still elusive in the EU.
More interesting as well is that only about 20 percent of European companies are highly digitized. Now, of course, in a country like Denmark, where I live, everything is digital. We haven't used checks in 25 years. But that's quite the exception. Most European countries are still quite analog, if you will. Most small and medium-sized businesses don't sell online. So that is definitely a concern, particularly because since the economic crisis in 2008, a lot of European countries are still suffering high unemployment and very slow economic growth, and the whole idea of a digital single market was to get more competitiveness in the EU, more growth in the EU, and that hasn't happened.
We're also seeing very high costs to comply with European regulations, sometimes $1 million or more for large firms, requirements to hire new employees, and so on. And that can be cost-prohibitive for small to medium firms.
So the other thing to note is that it's probably not reported as much in the American media, but the EU is going through a crisis of sorts. You've all probably heard about Brexit. But there are a number of European countries that want to leave the EU or have political parties advocating for that. It's going on in Poland, in Hungary, the Czech Republic, Italy, and so on. There is actually quite a high rate of Eurosceptic parties. So they're very upset; they don't believe that these policies are working; they don't have a sense of what the European project was about.
So, in many respects, a lot of political scientists see what's going on in terms of a geopolitical scheme in the sense that the European countries have not been able to produce their own powerhouses in terms of the internet. No European company has appeared on the top 20 list for internet companies since 2013. So there's, essentially -- you can't really name a European app that you use outside of Angry Birds or Spotify. So they're not really European powerhouses.
So the response is a kind of, "If we can't compete with the U.S., let's regulate them." And similarly, the European Union has tried this before. In the introduction of the 3G GSM standard for mobile phones, the Europeans adopted a standard and wanted to make that the global standard for mobile. Apple was not going to make CDMA phones, which was the prevailing network we had in the United States at that time. So this was a kind of ploy to make everybody adopt the European standards. Well, the United States said, "Well, we're not going to take the 3G standard. Let's just jump to 4G." And that is really what got our mobile economy going. Verizon started investing in 4G, and the other operators followed.
So it's something that we've seen happen from the EU before. Geopolitical aspects do come into policy. This has also been underscored by who's called the "Father of the GDPR," Jan Philipp Albrecht, who said that he wants this to be something good for European companies.
The other thing to note is that you can look at the GDPR as a tariff on American commerce. We do have a trade deficit with Europe, but the digital sector accounts for maybe a $70 billion surplus, and taking that into account, we have a much smaller trade deficit with the EU because of the goods and services that we export to the EU. So many observers believe that the GDPR is really a form of trade policy, or a trade tariff, to make using an American website or service less attractive. I keep a log every single day of the websites I can no longer access from the EU. For example, Williams-Sonoma. All of their content is completely blocked. I can't shop on Williams-Sonoma. I can't even check recipes. The Los Angeles Times, a whole range of newspapers. These are companies that just said it's not worth it to us to operate in the EU. We're not going to show our content. We won't sell our goods.
So then the last area—I'm working on a paper for The Federalist Society on this topic—is that if you read the legislation and its 173 recitals, there are three specific enumerated rights in the GDPR, starting with the guaranteed right of representation.
So you can select a nonprofit to represent you and collect money for you. There's a right of compensation and a right to judicial remedy.
For anybody who's familiar with the class action lawsuit culture, it looks like that was really the idea. It was engineered by Max Schrems, who is noted for his lawsuits against Facebook, which brought down the Safe Harbor framework.
And so today, he has a nonprofit called None of Your Business. His nonprofit has filed $8.8 billion worth of complaints against American companies, Google and Facebook. And two of the leading framers of the GDPR sit on his board. They are people working in government, they are serving on his nonprofit, and they are working together to design legislation to go after American companies.
In a press conference about a month ago, Jan Philipp Albrecht said, essentially, that the policy is to go after American firms. They're not going to go after European firms or small- to medium-sized firms. So this has kind of been engineered to hammer American firms because, in the views of these particular individuals, or this sort of movement, antitrust is not sufficient, or the rules on the books weren't enforced, or whatever, so they're now going to professionalize the class action lawsuit culture. To date, it has not been so popular in the EU to do that because they recognized, as many Federalist Society scholars have, that one of the challenges with class action lawsuits is that the attorneys frequently get a disproportionate share of the winnings versus the consumers who are actually party to the lawsuit. Not always the case, but that can be an issue.
Another issue is that, very frequently, an individual's particular concerns can be subordinated to those of the entire group.
So in any case, just the last point, I think, which bears mention, is that across the EU today there are 62 individual regulatory agencies for data protection and privacy. And all of these agencies have been empowered with some 35 new responsibilities. They have not necessarily received the training or the funding to carry out those responsibilities. So that's a big question. And so the concern is, does enforcing privacy rights overshadow other fundamental rights? Do we get a kind of skewed world where privacy becomes everything, and there's not a way to balance the other concerns?
There are many concerns—and not that these concerns are necessarily mutually exclusive—but the entire legal culture will be consumed with GDPR complaints. I can say that in just one month in the little country of Denmark, 5 million people, we've had 5,000 complaints. And they're not necessarily about Google and Facebook; they're about government agencies, they're about anybody, from "I'm upset with my mobile operator. I'm going to file a complaint about them," to maybe a legitimate issue for a person whose data has been violated, or what have you. So there's now this explosion, and there's even software, and startups, helping you automate your complaints, so you can say, "Here is my email ID. Please find every single company where I'm registered and file a GDPR request with all of them."
So it's, I would say, a whole new world here. On top of that, it's expected that 75,000 new data protection officers will be hired to fulfill the requirements of the GDPR, such as having a privacy officer on staff, and so on. So, in any case, there are a number of unanswered questions and unintended consequences, and, just coming back to the whole point: if we want to do this, we should have a way to measure that it works. And we haven't seen in a decade that the policies to date have improved trust, have improved competitiveness, have made things necessarily better. So it really remains to be seen that this is a step in the right direction. Wes, back to you.
Wesley Hodges: Thank you very much, Roslyn. Sunny, I believe you are our next panelist.
Sunny Seon Kang: Thank you very much. My name is Sunny Kang and I'm International Consumer Counsel at EPIC, and I really appreciate the opportunity today to continue this important dialogue on clarifying and implementing the GDPR.
So the GDPR is a landmark movement to modernize and harmonize data protection law globally. It pushes forward many integral user rights, such as algorithmic transparency and the right to object to processing, in order to promote accountability in tech companies that hold massive amounts of personal data. And it does so by expanding the scope of European data protection law to create a level playing field between businesses based in and out of the EU, globally maximizing the implementation of the important policy goals in the GDPR.
For EPIC, there are two overarching policy goals for the GDPR and the future of privacy regulation in the U.S. First, American companies with a global user base should extend these stronger privacy standards to everyone regardless of where they live, because if not, there would be a double standard where American users would have second-class privacy rights compared to European users. This is particularly true when a company like Facebook has publicly committed to complying with the GDPR worldwide; the FTC should use its enforcement powers to hold the company to this promise. These GDPR compliance claims, we now know, should not be treated as PR puff, because the consequences of weak regulatory oversight were made painfully clear this year by the recent fallouts of Cambridge Analytica, Equifax, and the increased incidents of data breaches and financial fraud for American consumers.
Secondly, there is public dissatisfaction with the current data protection regime in the U.S., and there could be no better impetus for reform than right now. The U.S. needs to adopt its own comprehensive data privacy legislation that enshrines the rights-and-responsibilities model of the GDPR and to create an independent data privacy authority with the competence to bring impactful enforcement actions.
Now, I'll discuss these points in three parts. First, I'll compare the GDPR with the current legal landscape in the U.S., which will bring into sharper focus why this reform is necessary here for the protection of consumers and for ensuring responsible innovation itself. Then, I'm going to discuss data minimization and algorithmic transparency as specific rights and responsibilities in the GDPR that need to be adopted in the U.S. And then I'll conclude that privacy is beneficial and crucial to innovation and competition.
So first let's discuss the GDPR as a comparative model for the U.S. The GDPR, understandably, is a big piece of legislation, and the intent is to keep companies vigilant about what data they're collecting, how the data is being processed and held, and why. It sets up procedures for accountability. And the data protection authorities in each member state now have enhanced regulatory powers to impose responsibilities on companies that collect personal data. And this supervisory mechanism ensures that the law has teeth and that the rights enshrined within it are actionable against the responsible parties.
At present, the U.S. has no general data protection legislation that's comparable to the GDPR for both the public and private sectors. The sectoral laws that we currently have leave out a significant portion of the U.S. economy and society, such as social media platforms. And this patchwork system is incomplete. It has been outpaced by emerging technologies, such as IoT and behavioral advertising, which don't neatly fit into the current regulatory landscape that we have.
As a result, U.S. consumers have significantly narrower control over their personal data. So now the question is how did we get here, and how do we progress from here? And the first thing to recall is that the U.S. and the EU do fundamentally share common privacy traditions and roots. And this is reflected in the OECD Guidelines for Transborder Data Flows, to which the U.S. was a party; the U.S. Privacy Act of 1974, which applies to governmental data collection; the consumer protection provisions of the FCRA; and the Code of Fair Information Practices developed in the U.S. However, these fail to extend to the commercial sector and gave way to sectoral rules and self-regulation in the U.S. And this led to several problems that consumers face today, which we need to address.
First, U.S. privacy laws stagnated, and there are no technology-neutral standards to guide innovation while preserving privacy. And institutionally, there is no federal privacy agency with the competence or the expertise to address these emerging challenges and new, unforeseen uses of technology. The FTC has the responsibility to gap-fill sectoral legislation with its wide authority to prosecute deceptive and unfair commercial practices, but we argue that it has failed in this role. The inadequacies of FTC oversight can be summarized as the failure to enforce its binding legal judgments against companies; the lack of continual oversight over violators to prevent recidivism; and the resulting weak deterrent effect on companies like Facebook.
So the biggest example that I keep going back to is how the FTC failed to enforce its 2011 consent order against Facebook. When the Cambridge Analytica data leak happened, it happened on the FTC's watch. And secondly, the FTC continually fails to meaningfully review or scrutinize technology mergers, like Facebook's acquisition of WhatsApp, to ensure that the acquiring companies don't roll back privacy promises already made to consumers. And this bad enforcement in the U.S. is also exacerbated by the self-regulation of social media platforms, which creates a critical lack of transparency and accountability in data processing by internet platforms.
So, for example, Facebook and Google are constantly withholding the extent of the data they collect and disclose to third-party advertisers and app developers until they actually have to respond to a breach. And this information asymmetry and big-data culture has put U.S. consumers in the dark about the serious privacy externalities that they bear. And these lapses created a breeding ground for irresponsible data practices and business models built on blindsiding consumers into the uncharted collection and use of their personal data.
So this indulgence for personal information that exists in the U.S., and is enabled by a lack of standards, would continue to erode user privacy if companies were to apply the GDPR with a narrow territorial scope, benefiting only Europeans. So a dual standard is not only unfeasible, but it's manifestly unfair to the billions of non-EU individuals who are deserving of equal data privacy rights.
So I'm going to move on now, and let's discuss data minimization: why the notice-and-choice model for privacy is not enough, and why we need to minimize the collection of sensitive data to begin with. To comply with the GDPR, companies can't just carry on business as usual by merely updating their privacy policies and having users consent to them. They must critically review and amend their business models and develop privacy-enhancing techniques that comport with the letter and the spirit of the GDPR. This means minimizing the collection of personal data and finding better ways to engineer for privacy, as required by Article 25 of the GDPR. The monopoly power of Facebook and Google was solidified through the unregulated collection and use of personal data: collecting far more data than is necessary to serve a function and not placing enforceable limits on the purposes and retention periods of data. And tech companies have been deriving all the commercial benefits of data while externalizing all of the privacy costs onto individuals. And this isn't only unethical, but it threatens the individual rights to privacy and autonomy, and the integrity of our democratic processes against illicit interference through data misuse.
So laws need to encourage the development of innovative services that rely on less data and have better safeguards against breaches. Bigger data doesn't necessarily mean better utility, and data that isn't collected can't be breached. So the burden should be on the companies to minimize this collection, and on regulatory agencies to enforce this, and not on the consumer to read through thousands of pages of privacy policies and fine print to protect their interests. And that is also impossible due to the information asymmetries that exist.
Finally, this asymmetry has another fatal consequence for privacy and the rule of law. Individuals don't know, first, when they are subject to automated decision-making by companies and governmental agencies on employment, housing, credit opportunities, and immigration, or whether the algorithms deployed to make these decisions were accurate, fair, or even about the right data subjects.
So many organizations are increasingly delegating decision-making about individuals' livelihoods, rights, and opportunities to proprietary and opaque algorithms that they don't fully understand or cannot justify. The result is that personal data is collected and inputted into these systems, and the output contains unchecked biases and errors. So algorithmic transparency is critical to ensuring accountability for the rationale of a specific decision impacting a subject's life and opportunities, and the GDPR ensures this with Article 13 on the right to be informed of data processing, Article 15 on the access rights of the data subject, and Article 22 on automated decision-making and profiling, and it's also bolstered by many other recitals that address profiling. And EPIC also recently advised the U.K. Information Commissioner's Office that Data Protection Impact Assessments, known as PIAs in America, mandated by the GDPR, require companies to examine the logic of their algorithms and assess potential impacts on individual rights and liberties.
And the U.S. needs a similar law; it's definitely overdue. It needs this to protect against algorithmic discrimination through the right to examine the design, implementation, and consequences of automated processing. This would provide checkpoints for transparency and accuracy at each processing stage to improve data governance and data quality, and the opportunity to correct hidden biases. And people should definitely have the right to invoke remedies and obtain redress from adverse decisions made by algorithms without human review. And once again, it is the responsibility of data processors to justify the provability of their own analytic systems and to address potential and actual harms.
I'm going to conclude by saying that privacy protections are necessary to align innovation with democratic institutions and fundamental rights. America definitely has the tools to step off the sidelines of digital privacy, and there are compelling forces for convergence and harmonization, and also great potential and demand for the U.S. to adopt tougher privacy laws. Notably, there are foundational privacy principles inherent in the U.S. system that strikingly coincide with the GDPR, due to the common roots that we have through the OECD. And the mandate now is to redevelop them with comprehensive legislation and bring data protection regulations up to the global standards set by the GDPR. Thanks. Back to you, Wesley.
Wesley Hodges: Thank you very much, Sunny. Adam, I believe you're here to round us out.
Adam Thierer: Well, thank you, and I'll be brief. I've just got a few points I want to make, building on some of what we've already heard. The first high-level point I'd like to make is that the GDPR is no free lunch. Compliance is very costly. All regulation entails tradeoffs, and no matter how well-intentioned rules are, we always have to pay attention to what those tradeoffs entail. It's already been estimated that U.S. firms have spent around $7.8 billion on compliance with the GDPR rules. There are punitive fines that can range from €20 million to 4 percent of a firm's global revenue. That's extraordinary money when you think about it in the global context.
The vagueness of the language of the regulations is such that there's considerable regulatory uncertainty surrounding what the law means and entails, in terms of what, quote unquote, "compliance" even means. Even the EU member states aren't quite sure what compliance means. And a Reuters survey in May revealed that 17 of the 24 regulatory bodies polled said they were, quote, "unprepared for GDPR's requirements."
The second point I want to make, building on that, is that the GDPR will ultimately hurt competition and innovation and probably favor big players over small ones. Google, Facebook, and other tech giants are already significantly beefing up their compliance departments. One EU official, who's been in charge of a lot of GDPR issues, was quoted by the Wall Street Journal as saying of Google, Facebook, and others that, when she visited them, she worried that, quote, "they have the money, an army of lawyers, an army of technicians and so on to deal with the law," and that it really intimidated her that they were beefing up like that. But it really shouldn't be surprising, because they have to, of course.
But the problem is that, in the meantime, there are smaller firms, and even mid-size firms, that can't bear that same burden, and they're either exiting the market entirely or they're dumping data that could be used to provide better services or more tailored types of applications to their consumers or future consumers.
There was a PricewaterhouseCoopers survey that found that 88 percent of companies surveyed had already spent more than $1 million on GDPR preparations, and 40 percent had spent more than $10 million. You have to think that for a new startup, that's real money, and that it's going to be really hard for them to pay those sorts of compliance costs and to break into the EU if they're not already there.
There's also a lot of concern about Google and Facebook, obviously, over there in Europe, but the reality is that this is only going to make them bigger. The first day after the GDPR went into effect, there were already reports of how an increasing share of overall ad spending was starting to go to Google, after it already had a pretty significant share.
So, in essence, with the GDPR the EU is sort of surrendering on the idea of serious competition going forward against some of the very giants that it most wants to see regulated. The law could actually end up having the unintended consequence of benefiting the very companies that the EU has long been going after on antitrust grounds, because the smaller innovators are going to suffer under this law when the big players are the only ones who can afford to pay the pound of flesh to comply.
A third point, I think, from a consumer perspective, is that the GDPR is likely to raise costs to consumers or diminish choice and quality in other ways. Consumers obviously care about privacy, but consumers also care about choice, convenience, and low-cost services. And the modern data-driven economy has given consumers access to an absolutely unparalleled cornucopia of information and services. And what's most remarkable about that story is how much of that content and how many of those services are offered to the public at no charge at all. That's a real consumer benefit. But if you take all of the data out of the data economy, you won't have much of an economy left to regulate. Many organizations are already passing along these costs to consumers, erecting new paywalls, or forcing users to view more intrusive pop-ups or ads. Users have been inundated already with various types of notices and confused by a lot of them. Some websites, including major newspaper sites, have blacked out entirely in the EU or created some sort of EU-only web experience, a stripped-down experience for European consumers relative to others. Or they're just charging more money to European consumers. So that shows that the GDPR has real costs for consumers.
I'd also add that the GDPR potentially hurts the global flow of information and worsens the problem of data localization. The rules only allow data to move between jurisdictions that offer an adequate level of protection under the regulations. And this raises questions about what happens in a world of cloud computing. What happens when you try to have one set of rules that governs the EU, but everyone's data is up there in a cloud far away? Is this just another step, therefore, towards a so-called "bordered internet," with more walled gardens and local control, that really starts to atomize the internet and the wonderful experience that we've enjoyed over the past couple of decades? It seems to be the case. It also raises the question that I think Roslyn teed up, which is: is this just really a roundabout way for the EU to impose a sort of tariff on data-based activity, to further go after American-based firms or America more generally?
I think there's also an important point here, which is that the GDPR doesn't necessarily solve some of the bigger and more serious privacy problems related to government access to data. We have had the EU Data Retention Directive, about third parties retaining data for law enforcement, for many years. It was passed after some terrorist attacks, and member states have various types of conflicting policies regarding government access to data, which are, in some cases, quite intrusive. And in many cases, this is what consumers most care about, but it really isn't helped much here by any of this.
And then I'll just make a final point, which is that it's not entirely clear to me that the GDPR, or any sort of legislation like it, really does much to move the needle in terms of true privacy protection. It's easy to believe that heavy-handed, top-down regulatory regimes are well-intentioned and would somehow curb supposed privacy abuses. But the reality is that what gives consumers real privacy protection is ultimately more and better choices. We want competitive options and privacy innovations that offer alternatives to what we have and give those of us who want more privacy protection that option, while giving others the kind of experience they might prefer, which could be very different. We should not have a one-size-fits-all, top-down approach that makes this decision on behalf of all of us.
Unfortunately, the world we get and the choices that we need aren't going to appear in a world where the GDPR is applied extraterritorially and Brussels is making this decision for the entire planet. That's only going to raise regulatory compliance costs, punish small innovators and new entrants, and probably only end up benefiting the largest current incumbent companies, which Europe and others feel already have too much power. But they're certainly not going to have any less power in a world where the GDPR rules. So I'll just wrap up on that.
Roslyn Layton: So I want to thank my fellow panelists for some excellent remarks. I always learn so much listening to Adam, and I'm really very pleased to have Sunny on the call today to get her particular view. I know that EPIC is also on the board of Max Schrems' nonprofit [NOYB], so I really want to understand why she felt it was important, and I think she did bring up some important points that I want to respond to.
I think, as many of our listeners will know, regulation is never neutral. There will always be a political angle. And it's certainly unfortunate that in the Facebook case there was a consent decree that wasn't honored, but one can also ask: was the lack of enforcement politically motivated because it was during the Obama era and he was called the Silicon Valley President? So there wasn't enforcement against Silicon Valley. I'm not saying I believe that, but that's one critique.
You can look at the European Union, which has 62 individual privacy and data protection authorities, and even with that many, there's a belief that privacy protections were lacking or that data protection was not enforced. So the notion that creating yet another regulatory agency will fix the problem has not been proven at all in the European Union.
The approach taken by the United States has, I think, many virtues, but one is that it's low cost in the sense that efforts focus where harm actually exists, not where we pretend or theorize it could emerge. We actually look where it does exist. Now, sector-specific regulation does reflect the areas we know from the get-go are very important: children, financial data, and other sensitive information.
The other aspect from the American perspective is that we have a natural-rights view of privacy, and that's important in so many ways to the founding of our country: we believe that people are born with inalienable rights, and those rights don't request anything of others. A natural right doesn't require that a company behave in a certain way. The government is not forcing different entities to do different things. And that's important for the freedom of choice, the freedom to be able to start a business, but also the freedom of a consumer to choose a particular business.
So I think it also bears mention that, even in the midst of all of the blowback against the tech industry, there are people, in my own family, who use Facebook. They love it so much they will never stop. It's used by members of my family of every age range because that's how they like to share. They couldn't care less about Cambridge Analytica. Now, that doesn't mean they're stupid. But they value the service they get, and it's all in one place, all the things that they want. And they also understand the tradeoffs. They don't feel that they're being violated, and they certainly don't think that another independent agency will make their lives better.
If you look at the Eurobarometer, which is the official statistical survey of the European Union, Europeans themselves were not asking for GDPR-type regulations. In fact, they had much more trust in their own national governments' data protection rules. They thought those were much closer to home. They're distrustful of what the EU is trying to do, and they actually believed there were many things they could do themselves. And, probably for me, what's really stunning is that the framers of the GDPR ignored the academic studies of their own European research institutes, which found that improving privacy was largely a function of consumer education and privacy-enhancing technologies.
So we could double our privacy experience by educating consumers. This can be done at schools; it can be done by nonprofit organizations. Basically, what the Eurobarometer showed was that people were not doing basic things to protect their own privacy. They could be more aware, take additional precautions about what they share, and basically act in a more responsible way. That doesn't require any kind of regulation, but it gives an immediate benefit to the user.
The second part is a thesis, underscored by a wonderful book you should read, Privacy on the Ground by Bamberger and Mulligan, that where privacy rules are ambiguous, companies do more to innovate in privacy. Now, I'm delighted that Sunny will talk about data minimization; that's great. But that's just one of a hundred different kinds of privacy-enhancing technologies that a firm could decide to use. There's no one technology that's best. Different companies may have better and different ways to deliver privacy, and if you really want them to compete on delivering a better experience, the government shouldn't mandate what it should be, because that essentially freezes in place the government's preference, not necessarily the consumer's preference.
So just in closing, what I would like to say is that I would call out the leadership of Kennedy and Klobuchar, the two senators who have made a draft bill, and one of the things they have in the bill is an innovation safe harbor, so that they won't criminalize a company that's attempting to experiment or innovate in a privacy-enhancing technology. Today, under the GDPR, if I, as an academic, make a mistake with a spreadsheet and happen to mishandle some survey data, I can be prosecuted. This kind of regime doesn't differentiate at all. What's valuable about having an innovation safe harbor is that companies need a safe place to innovate without fearing retribution from a privacy regulator for trying to do the right thing, trying to make the systems better.
And finally, we know very well from our regulatory history that regulators are not necessarily better situated to know what consumers themselves desire. It's quite impressive to see 173 recitals with 45 regulations on business and 17 or 18 enumerated rights, but if you actually look at surveys of Europeans, the value that they put on the different rights is completely out of whack with what the European politicians thought was helpful. So, for example, many people thought that data portability was going to be the silver bullet: everyone would just port their Facebook data to a Facebook clone. But it turned out some of the survey respondents preferred the right to be forgotten.
So across the 17 different rights, they're all considered equally important from the politicians' perspective, but users value them very differently. One may be ten times more valuable to them than any of the others, for example. So this notion that a privacy activist or a politician would somehow know better than any one person which particular thing to value is completely out of whack with what we see from the Eurobarometer data, from the various studies, from European universities trying to make an economic analysis of the GDPR, what you would pay for it outright, and so on. So sadly, all of that has been ignored in the European process. At the very least, I hope that American policymakers will look at the actual data and not just go on a populist, or what's really an elitist-driven, version of what privacy is supposed to be.
Sunny Seon Kang: Can I respond to that, Wesley?
Wesley Hodges: Of course, Sunny. Please go ahead.
Sunny Seon Kang: So I'm going to respond to a couple of points made by both Adam and Roslyn. I'm going to start with Adam's point on the GDPR and competition. In America, there's a big misconception that the GDPR will shut down small and medium enterprises that don't have enough resources for compliance. But GDPR compliance is an ongoing process. The fact that you're not compliant right away doesn't mean that you'd be shut down on May 26, 2018, the day after the GDPR went into effect. It's more about ongoing accountability and interaction with the data protection agencies, being transparent about what data is being collected and whether it has a lawful basis.
Secondly, the GDPR will not make privacy into a tradeoff. It's not about paying more for privacy; it's actually about reducing the power imbalances that exist between consumers and companies, where consumers don't know whether the choices they're making are shaped by deceptive design. So, for example, I'm going to go to Roslyn's point here. She mentioned that consumers still use Facebook, and it's their choice to use it despite all of these scandals. But my point is that it's not about consumers' addiction and whether they're choosing to use it because they want to; it's that they're being deceived by social media platforms that aren't being transparent and that aren't giving them a robust right to access and control the data processing that is happening. So there's an information asymmetry, which the GDPR starts to address; without it, consumers won't know whether their data has been breached or whether their rights have been infringed.
And my final point is that Roslyn mentioned that even if the GDPR had been in place, it would not necessarily have deterred Cambridge Analytica, and I disagree with this because, firstly, the GDPR has a great deterrent effect through its enforcement penalties. So at the least it would've had an ex-ante deterrent effect. And secondly, if the consumers using Facebook had been more aware of the kinds of operations Facebook was deploying, and if they had had the right to access their data, to rectify it, to restrict certain processing, then at the very least the impact of this breach would've been minimized.
So even if this would've happened anyway, I think that having the GDPR in place would've provided better ex-ante procedures to ensure that the impact was minimized, and also better redress mechanisms for the ex-post impact of such a breach.
Wesley Hodges: Thank you, Sunny. Adam, do you have any response for our panelists?
Adam Thierer: Well, only briefly, to say that Sunny mentioned that we shouldn't expect all the firms to go under right away, and that's certainly not the case I was trying to make. I was rather making the case that regulation no doubt has many benefits, but it has many costs as well. And no matter how well-intentioned the GDPR may be, the consequences, if they arrive in the form of direct compliance costs, will raise barriers to entry over time and, therefore, discourage or diminish the amount of innovation that we should all hope to see in the future if we hope to have real, credible alternatives to current market players.
And if it's already the case that U.S. firms are spending somewhere on the order of $7.8 billion in estimated compliance costs, with the law in effect only a couple of months, and given the other numbers I cited in terms of just how many lawyers and consultants and others need to be hired to do these things, it's just plain common sense. There's just no way that certain small-size firms or startups are going to be able to absorb those costs. And that will have real-world ramifications, not just in terms of choices, not just in terms of prices, but in terms of our real privacy options on the ground.
Roslyn Layton: Wes, can I make one last point?
Wesley Hodges: Sure. We'll move to audience questions in just a moment, but please, Roslyn, go ahead.
Roslyn Layton: Okay. I just want to share one other data point that I think is helpful for the audience. This is actually something that was brought up in the complaint by the None of Your Business nonprofit against Facebook: there's a view that there's an information asymmetry, that people can't choose, or that they're in a trap. They can't not use Facebook because it's essential. But what's interesting, if you look at the survey by Hill Holliday of Generation Z, those born in 1994 and after, the digital natives: they're estimated to be 40 percent of U.S. consumers by 2020, and 90 percent of them use a social media platform. But the survey found that in the last quarter, more than half of them switched off social media for an extended period, and one-third cancelled a social media account. And the number one reason for either switching off or cancelling was time wasting, cited twice as often as concerns about privacy.
So that is an extremely savvy group of millions of consumers who have no problem switching or turning off their social media for various reasons. So I think that definitely flies in the face of the idea that people are entrapped in this particular platform that they're using.
The other thing to note is that FedSoc actually has a wonderful teleforum that goes into detail on the Cambridge Analytica situation. It has some excellent scholars who really do a very thorough job of looking at all the angles. It's well worth listening to. But also note that Facebook suffered an $80 billion loss of market value on account of the dissatisfaction. That's 48 times the level of the maximum GDPR fine. So consumers and the market can exact discipline, maybe even at a higher rate than the regulators. And that happened pretty instantly. Facebook has made a lot of changes in their board and their practices to demonstrate to the public, to their users, to their vendors that they'll make a change. And good for them. But also, I think Adam's point is very real: we know from the history of regulation that it's the big companies that win, typically, because they're in the best position to comply with the regulator's preferences. And then over time, regulators and companies form a sort of interdependent relationship to keep each other going. And that's not great for market entry and for small firms that want to do it differently.
Wesley Hodges: Thank you, Rosalyn. Sunny or Adam, do you have any additional comments before we move to audience questions?
Sunny Seon Kang: Sure. I'll just say one thing. I've been hearing a lot of paradoxical statements about whether the GDPR is too detailed and too burdensome for companies to comply with, or whether it's unclear, but I haven't actually heard a lot of examples of which aspects of the GDPR are considered unclear. The GDPR has a lot of recitals and a lot of supplementary guidance from the EDPB and the Article 29 Working Party because it is comprehensive. It's meant to ensure compliance with detailed guidance. And I guess I'm a little bit worried that I'm hearing paradoxical arguments about whether it's too burdensome or whether it's not clear. But either way, my point is that the GDPR establishes clear, baseline obligations for companies to comply with. And if they're not sure, there are a lot of provisions that allow interactions and consultations with the member state DPAs to ensure that accountability is checked. Thanks.
Wesley Hodges: Thank you. Adam, do you have anything to add?
Adam Thierer: Well, we could go back and forth like this all day, but I'd rather get to the questions. I'll just say that clarity would be the last word I think anybody would use to describe what's in the GDPR. And maybe that's a good way to transition to questions.
Wesley Hodges: Okay. Looks like we do have one question in the queue. Let's go ahead and move to our first caller.
Caller 1: Thank you all for that wonderful panel. That was very educational. Let me get your thoughts on coming at this from perhaps a very different angle, one that I think you all likely may have addressed and, in fact, may already have been thought of but simply dismissed as impracticable. On regulatory regimes, I'm of the same ilk that more regulation is not competition enhancing, and more importantly, the idea that Brussels is back doing the same thing it always does is quite worrisome, especially after finding out that Boris Johnson resigned today.
But here's my question. Is there perhaps a market-based answer to the question of protecting privacy? And I understand that with technology we don't know what goes on behind the curtain: "ignore the man behind the curtain," sort of the Wizard of Oz thing. But at the same time, are there various components of each regulatory regime that might be extracted, the rest slashed and burned, and the offer made: "Hey, if you want to play and you want absolute privacy, it's going to cost you 'X.' If you don't mind us sharing some of your data, it's going to cost you 'Y,'" and so on? You all can see how that would play out. Your thoughts?
Roslyn Layton: So, I'm Roslyn. I'm going to jump in. There's actually a method in the policy world where, from a regulatory perspective, you could try that. It's called randomized controlled trials for regulation. So instead of adopting 173 recitals, you'd use a test-and-learn methodology in the real world. You would say, "Okay, I'm going to use an A/B split test. I will try a component of different things, see how it goes, and be informed by actual evidence: does it work? Do I get the effect I'm looking for?" If you were actually responsible about your regulation, you would say, "Let me find the best mix of bundles," and you'd do it in a scientific way. You'd test it in the marketplace before applying the whole thing en masse without really knowing what's actually working or not.
So that might be one way to get around your particular… In a perfect world, I would do it that way. In the world of politics, as we know, that's not how things get done. There are windows of opportunity. Politicians want to deliver something within a particular timeframe. You also have the collective action problem: Mancur Olson described how a very small group has a much greater ability to get the particular regulation it wants because it's much more organized than the diffuse majority. That's exactly what happened in the EU. You have 17 languages, 24 different currencies, and so on. So only those who are technologically savvy, who speak English, who can organize are the ones who get what they want from the regulation. But it's not an authentic reflection of what the majority wants, because if you look at the Eurobarometer, people overall expressed a desire for a much lighter hand, if you will: not delivering everything to Brussels and having these data protection authorities able to nix what the local authorities said.
So that's one way. I think that Adam might have a suggestion from a market perspective about how you might achieve this.
Caller 1: Before he interjects, let me add what I forgot earlier, which is kind of the way lawyers look at these things. It's classically the regulatory environment: "Oh, lord, we don't need that. It doesn't really benefit you. It doesn't give you a remedy. It simply builds an institution that does exactly what you said." So what I would add as a rejoinder to what I offered earlier is the potential for some significant private rights of action to enforce those rights of privacy. And it could be codified at any level. Again, of course, as you point out quite well, you've got the select few, and the deep-state program keeps running through my mind: yes, a lot of this is politically motivated. But at the same time, consider a market-based proposition coupled with significant private rights of action. Of course, you're going to hear, "Oh, lord, the floodgates of litigation will open." And I will put this to you: I would rather have the floodgates of litigation than Brussels and some federal agency coercively compelling all of the ills and threats that you've pointed out. So let me qualify what I've requested in that way, and I guess, Adam, if you'd like, please chime in.
Adam Thierer: Well, just briefly, I'd say that the problem with the kind of approach that Europe's taken is you're probably going to get both. You're probably going to get the worst of both worlds, as opposed to just one or the other. I agree it may make for a more interesting experiment to see which sort of governance mechanism works better: strictly enforced privacy torts and class actions versus a comprehensive regulatory regime. But it sounds like we're going down the path where we're going to get the worst of both worlds.
I personally worry that in the United States, in particular, if that's the model we were to choose, we do have a pretty healthy trial bar that's very eager to go out and sue at the first sign of trouble, and they have. We have seen a lot of this activity. So I guess from that perspective, I'd be a little bit concerned about going down that path.
Roslyn Layton: The other thing that's important to note is that there's a great paper on privacy overreach by Jamil Jaffer and Gus Hurwitz on the FedSoc website that goes into this. It makes a great example of caller ID. When caller ID was first proposed, regulators said, "Oh, no way. This is going to violate the rights of the callers." Well, today, nobody questions that caller ID is a wonderful innovation, right?
Caller 1: Right.
Roslyn Layton: And the number one complaint at the FTC today is robocalls. Regulation has inhibited dealing with robocalls because we have these no-blocking provisions in telephony. If we had allowed the marketplace to work, we'd have all kinds of blocking technology to use on our phones, or the telephone operators would be blocking what they knew were malicious calls. Now they have a safe harbor to do that where they didn't before; it's been one of the new innovations brought by FCC Chairman Ajit Pai. But at any rate, what we thought with the initial caller ID debate was that we had to protect the privacy of the caller. We now realize it's the receiving end whose privacy should be protected.
So there's nothing empirical or objective showing that the GDPR is the best way to do privacy. It's certainly not the only way. There's no authority that says it's the best way. I mean, of course those screaming loudly will say it is, and the EU will stand up and do that. But they have their politicians the same way that we do. There's nothing objective about it.
Caller 1: May I ask the next question that always arises in my mind, which is this: large, as I would characterize them, overreaching efforts of some centralized authority to coerce ABC, XYZ have a tendency to squeeze out the small guy completely. Is there any pushback that you all perceive in European circles, or at any other level of analysis, that could influence Brussels to throttle back on what I can only characterize as its massive growth into, and intrusion into, the lives of these ancient civilizations, because Brussels thinks it's entitled to it? Is there any evidence of any pushback or any mechanisms? Do any of you have an opinion on that?
Roslyn Layton: Well, there's the rise of the Eurosceptic parties; Brexit's the biggest example. Not that that's going smoothly, or that it's a great solution, or what have you. But you have Eurosceptic parties in almost every European country that are trying to show their displeasure. But where one would think the EU might get the message and say, "Well, let's reform," or, "Let's not go down that direction," frequently they double down on what doesn't work.
I mean, I think what goes on in California today is a great example. You've had maybe two decades of progressive policies which have been driving the state into the ground. And yet, [inaudible 59.22] the establishment continues to do the same thing. So more than 100,000 people leave the state every year because they're tired of it. Who knows what we'll see.
But politics is a dynamic process. It remains to be seen. My hope is that the U.S. can actually do something better, similar to how we bested the EU in the mobile space. Instead of copying what the EU did, we figured out a better way to deliver mobile service. We delivered 4G and we've been the world leader in that; we have the biggest economy in that area. And now we're moving on to 5G, where we have China to compete with. But that, I think, is the answer: not to copy what they do, but to figure out a better way to do it, and we have to do that through science and innovation.
Caller 1: Yes, ma'am. Thank you.
Sunny Seon Kang: I just wanted to jump in on that, actually. On your point about pushing back on Brussels for expanding the extraterritorial scope of the GDPR, it's actually the big companies themselves who are publicly committing to comply with the GDPR. Facebook, after all of the hearings that happened, is the one that voluntarily said, "We will expand the protections of the GDPR to our U.S. users," and once they committed to it, it becomes a commitment that they have to actually honor. And just as the FTC has the authority to bring enforcement actions against companies that say they are complying with the Privacy Shield but aren't (recently, there was an enforcement action on that), the same should go for the GDPR. If companies are saying that users will get these additional rights enshrined in the GDPR, which aren't necessarily present in the U.S. regulatory landscape, then they should actually be held to that promise.
And I also just wanted to make a second point about how I don't think the GDPR's extraterritorial scope was meant for Brussels to try to claw back control of U.S. companies. I think it was more about maximizing the policy goals of the GDPR, because Article 3 doesn't actually mandate that U.S. companies without any presence in the EU be subject to it. But companies are complying anyway because they recognize that, because data transfers are so ubiquitous, there needs to be harmonious coexistence between different legal systems, and actually reaching the ceiling of data protection is better than a lowest common denominator that's a race to the bottom.
Caller 1: Thank you very much for your answer.
Wesley Hodges: Thank you very much, caller, for your question. It looks like our queue is now empty and we're at the top of the hour. So I would very much like to thank all of our panelists for their wonderful remarks and for the back-and-forth conversation. I'd like everyone to know that this call has been recorded and will be uploaded as a podcast in the near future on The Federalist Society's website, and you can consult a full roster of our teleforum calls if you visit us at fedsoc.org.
Well, everyone, on behalf of The Federalist Society, I want to thank our experts for the benefit of their valuable time and expertise today. We welcome all listener feedback by email at email@example.com. Thank you all for joining us. This call is now adjourned.
Operator: Thank you for listening. We hope you enjoyed this practice group podcast. For materials related to this podcast and other Federalist Society multimedia, please visit The Federalist Society's website at fedsoc.org/multimedia.