Since assuming the gavel of the Senate Judiciary Committee’s Subcommittee on Competition Policy, Antitrust, and Consumer Rights, Senator Amy Klobuchar has made it her personal mission to pass a variety of antitrust reform bills targeting “big tech.” At the top of the list is S. 2992, the American Innovation and Choice Online Act (AICOA), which ostensibly seeks to prevent the largest tech companies, and one retail company, from discriminating against the third-party vendors that use their respective platforms to offer wares to the public. Following AICOA’s introduction last October, the Senate Judiciary Committee—without the benefit of a legislative hearing—voted to approve AICOA in January subject to a Manager’s Amendment. Despite this procedural victory, it was widely acknowledged that AICOA needed a lot of work before it could be brought to the Senate floor for a potential vote.
To this end, Senator Klobuchar recently circulated a third draft of the bill, which she hopes will clear the 60-vote threshold (again without the benefit of a legislative hearing). But the third time is not the charm. Poor legislative drafting and bad economic logic plagued earlier versions of the bill, and Senator Klobuchar's changes do nothing to address these problems. Instead, the alterations are little more than window dressing. As such, if enacted, AICOA will likely do more harm than good.
A few examples demonstrate the point:
First, the revised bill continues to use the problematic standard of "would materially harm competition." Among other concerns, the phrase "would materially harm competition" is ambiguous. What exactly constitutes "material"? Anything more than a de minimis amount? Similarly, it is odd that the bill deliberately uses the word "would" to evaluate ongoing, rather than future, conduct. If some ongoing conduct "would" violate a statute, then a firm could be in violation without any showing of actual harm. Basic due process, at least in the United States, requires an affirmative determination that conduct does or does not violate a statute before the government can mete out punishment.
Second, the number of affirmative defenses contained in previous drafts was extremely limited, and little has changed in the new draft. AICOA continues to provide only narrowly tailored affirmative defenses. Although the defenses purport to protect consumer privacy and security, in reality they simply create a perverse disincentive for firms to protect their customers' privacy and data security aggressively. Indeed, covered platforms are limited to conduct that is "reasonably tailored and reasonably necessary, such that the conduct could not be achieved through materially less discriminatory means," but many of the mechanisms that platforms use to protect their customers are by nature extensive (and, in some cases, even duplicative) precisely in order to anticipate all potential harms and to maintain a secure environment for customers. In today's highly dynamic digital markets, any statute that allows the government to challenge, and a generalist court to define, what steps are "reasonably tailored and reasonably necessary" will hang as a Sword of Damocles over the industry and is thus a recipe for serious data security and privacy problems.
Third, the sponsors of AICOA continue to refuse to consider the benefits of the activities the law seeks to eliminate. For instance, the bill effectively bans a covered platform from helping consumers navigate a bewildering number of products based on reviews, returns, shipping speeds, and so forth. Economic research shows that such a ban, implicit or explicit, will make consumers worse off. Yet when regulators evaluate whether a covered platform's conduct "would materially harm competition," there is no mechanism for them to consider whether the alleged harms are outweighed by important benefits (i.e., a rule-of-reason analysis). Instead, it is straight to the gallows.
Fourth, earlier drafts of the bill imposed a massive penalty of 15% of total revenues over the period of the alleged infraction. Thus, for a company such as Amazon (the central target of AICOA), with about $1 billion in sales per day, the risks were enormous, especially considering the company's operating margin is only 6%. A successful complaint about a few hundred sales of a $3 adapter could cost a company like Amazon billions of dollars in penalties. In an apparent effort to address this issue, the current draft reduces the penalty from 15% of total revenues to 10% of total revenues for the duration of the conduct. But moving from one arbitrarily chosen and large penalty to another arbitrarily chosen and large penalty, neither of which bears any systematic relationship to harm, is hardly an improvement.
A combination of vague legal standards, outsized penalties, limited affirmative defenses, and what is expected to be a highly partisan regulatory review will likely spell the near end of third-party sellers on large platforms. Indeed, if AICOA ostensibly aims to minimize discrimination against third-party sellers on digital platforms, then the only guaranteed safe harbor is (ironically) to exclude third-party sellers altogether, which is the ultimate form of discrimination. When a single complaint filed by one of thousands of third-party sellers can lead to a penalty of 10% of revenue, a rational company would choose to eliminate the possibility of a complaint altogether. In fact, Amazon has already warned its third-party partners that they may be out of luck if AICOA becomes law.
The current version of AICOA also contains a couple of new twists. Perhaps the most glaring is the proposed Section 2(b), which directs the Federal Trade Commission, with the concurrence of the Department of Justice, to promulgate regulations within six months of enactment “to define the term data for the purpose of implementing and enforcing this Act” pursuant to the Administrative Procedure Act (APA). Such a direction is rather odd given that the bill specifically defines “data” in Section 2(a)(7). Moreover, unlike many similar rulemaking instructions commonly found in other regulatory statutes, AICOA does not require the FTC to revisit its findings after a period of time. Given the dynamism of digital markets, failure to include such a common review provision will essentially lock in the current FTC’s definition of “data” in perpetuity.
And speaking of the FTC, it is no secret that its leadership would like to institute a rulemaking to define and regulate "unfair methods of competition" (UMC) under Section 5 of the FTC Act. Although Congress did grant the FTC narrow rulemaking authority under the Magnuson-Moss Act to deal with "unfair or deceptive acts or practices," the Commission's authority to engage in UMC rulemaking (which would ostensibly be governed by the APA) is far more questionable. Thus, although Section 2(b) makes clear that any rulemaking authority is limited to interpretation of AICOA, the likelihood that the FTC would use this additional APA rulemaking authority for administrative mischief is high.
Senator Klobuchar has a well-earned reputation for tenacity, and recent press reports indicate that she has pressured Senate Majority Leader Charles Schumer to bring AICOA to the Senate floor for a vote sometime later this month. Still, other recent press reports indicate that many Senators continue to have strong reservations about AICOA and wish to avoid addressing this controversial piece of legislation so close to the mid-term elections. These Senators are right to be wary. AICOA is not an analytically serious piece of legislation.
In these fragile economic times, Congress should not rush to pass more ill-conceived legislation in the cynical hope of scoring political points. American consumers deserve a more thoughtful approach from their elected representatives.
Note from the Editor: The Federalist Society takes no positions on particular legal and public policy matters. Any expressions of opinion are those of the author. To join the debate, please email us at email@example.com.