In today’s politics, agreement is a rare beast. So it’s notable when it shows up. And recently, a budding consensus has emerged over labor policy. On the left and right, policymakers have converged on the idea that labor law has failed. These policymakers say that even though labor unions enjoy historic levels of popularity, they represent a historically small percentage of workers. So clearly, union organizing is too hard. The law throws up too many roadblocks between people’s desire for a union and their actually getting one. And that means labor law needs to change.

But this line of thinking misses labor law’s point. Yes, labor law does promote collective bargaining, and thus unions. But it promotes those things only as a means to an end. Its lodestar has never been union membership, but rather labor peace. And measured by that standard, it has been wildly successful: it has given us the most peaceful labor market in living memory. So before rushing to change it, policymakers should stop to consider how, through a century of trial and error, labor law finally achieved its original goal.

Labor Peace and the Wagner Act

At its core, labor law is about strikes. It was born in the early 20th century, when labor relations were far more tumultuous than they are today. The period had seen a rash of widespread and often violent work stoppages, many of which spilled over state borders. For example, in 1894, the Pullman railroad strike shut down rail traffic across large swaths of the country. Striking workers clashed with strikebreakers and even federal troops, resulting in millions of dollars in property damage and more than thirty deaths. Similar strikes hit the coal industry in 1902, the steel industry in 1919, the railroad industry (again) in 1922, and the textile industry in 1934. Labor disputes were common, widespread, and economically devastating.

That kind of strife loomed in the minds of lawmakers when they passed the 1935 Wagner Act. The Wagner Act was justified as a way to protect interstate commerce. It aimed to keep the channels of commerce open by preventing industrial conflict. And it proposed to do that mainly by promoting collective bargaining. Its theory was that industrial strife stemmed from two sources: employee dissatisfaction and employer intransigence. Employees suffered from low wages, poor working conditions, and unresponsive managers. Worse, they had no peaceful way to express their discontent. If they wanted their employers to listen, they had few choices but to walk off the job. The Wagner Act aimed to give them another option. It gave them a legal right to designate a bargaining agent—i.e., to join a union. It then required the employer to bargain with that union. And by forcing the parties to bargain, it aimed to funnel disputes into more peaceful channels.

To be sure, lawmakers didn’t think bargaining would solve all disputes. The Wagner Act didn’t require parties to agree on any particular term—or to agree at all. It only required them to bargain in good faith. And it recognized, at least tacitly, that even good-faith bargaining would sometimes fail. So besides giving employees a right to unionize, it also gave them the right to strike. That right was new: until then, many strikes had been illegal under state and even federal law. But still, lawmakers thought that legal protection wouldn’t necessarily lead to more strife. Strikes would be a backstop, rather than a replacement, for peaceful negotiation.

Judged on those terms, the Wagner Act failed. On one hand, the law did boost union membership: in the five years after it was passed, union density more than doubled, rising from 8.5% to 18.2%. But on the other hand, it didn't usher in a new era of labor peace. To the contrary, it triggered a strike wave. In 1933, there were 753 reported strikes involving 297,000 workers. Over the next six years, those figures rose to an average of 2,541 strikes involving about 1.18 million workers. The next five years saw even greater surges, rising to an average of 3,514 strikes involving 1.5 million workers. And in 1946, nearly one out of every ten workers walked off the job.

The causes of these strikes were complex. Some were caused by pent-up demands. Union leaders had put off calls for wage and benefit increases during World War II, but with the war winding down, they were no longer willing to wait. At the same time, millions of American soldiers were being discharged and returning to work. This surge in available workers scrambled the labor market. Tensions were high, and some strikes were perhaps inevitable.

Still, greater union densities almost surely contributed to the problem. By 1946, 34.5% of nonagricultural workers belonged to a union. And those unions were key to organizing strikes. Indeed, many strikes were possible only with sophisticated coordination. Some were “sympathy” strikes, called by one union to support the demands of another. Others were “jurisdictional” disputes: fights between two or more unions over a single group of workers. And still others were secondary boycotts: efforts to exert maximum pressure on a single employer by coordinating boycotts up and down the supply chain. All of these tactics spread labor disputes beyond the immediate employer-union fight. They could shut down not just one employer, or even one group of employers, but whole economic segments. And standing in the center of the spreading chaos was a growing and powerful labor movement.

A New Strategy: The Taft-Hartley Act

This chaos sparked a political backlash. In 1946, Republicans campaigned heavily against labor unions, riding the slogan, “Had enough?” And as it turned out, many Americans had. Voters handed the GOP a 55-seat gain in the House and its first majority since 1930. Republicans then leveraged their newfound power to push through labor-law reform: the 1947 Taft-Hartley Act.

Taft-Hartley reshaped labor law almost entirely to unions’ detriment. To start, it banned the “closed shop,” an arrangement that made union membership a condition of getting a job. It also allowed states to pass “right to work” laws, which let employees refuse to join or pay dues to a union, even one that represented them. And maybe most important, it banned some of unions’ most powerful economic weapons, including certain kinds of secondary boycotts.

Combined, these changes helped trigger a decades-long decline in union membership. Membership peaked in 1952, when unions represented about 35% of nonagricultural workers. (Strikes also peaked that year, with 470 major work stoppages involving 2.7 million workers.) But from there, membership fell steadily. By the early 1980s, union density had fallen to about 21%. By the 1990s, it had plunged into the teens. And by the 2020s, it hit just 10%, less than one-third of its peak.

Even that number doesn’t tell the full story. The overall figure is propped up by public-sector union membership, which grew over the same period and operates under a different legal regime. In the private sector, where Taft-Hartley had its biggest impact, union membership fell to about six percent. So today, unions represent a smaller slice of the private-sector workforce than they did when the Wagner Act was passed.

But at the same time, labor markets steadily became more peaceful. In 1960, there were 270 major work stoppages. By 1980, that number fell to 187. Ten years later, there were only 44 work stoppages. That number fell to 39 in 2000 and 11 in 2010. By 2020, there were only 8 major work stoppages—less than two percent of the 1952 peak. So just as unions started their long decline, so did industrial strife. The two dried up in tandem.

Labor Peace without Labor Unions?

Today, unions bemoan Taft-Hartley’s effects on organizing. They argue that right-to-work laws effectively legalize “free riding.” They also lament that Taft-Hartley allowed employers to campaign vocally against unionization. And they mourn the loss of some of their best economic weapons. They blame these changes for declining unionization and reduced rates of collective bargaining. In short, they say, Taft-Hartley broke labor unions.

To be sure, Taft-Hartley probably caused some of the decline. While correlation is not causation, it’s hard to ignore the temporal link. Unions started disappearing shortly after the law was passed, and its reforms had the predictable effect of slowing union organizing. But by the same token, it’s just as hard to ignore the link between declining union densities and work stoppages. The story of the Wagner Act was rising unionism alongside surging industrial strife. And the story of the Taft-Hartley Act has been a decline in both.

Therein lies the irony. The Wagner Act was passed to promote labor peace. It aimed to keep commerce flowing by promoting collective bargaining, and thus unionism. Taft-Hartley reversed one part of that policy: it helped make unionism, and thus collective bargaining, less common. But by doing so, it finally achieved labor law’s original goal. The labor market today is more peaceful than at any time in the last century. And that peace is owed in large part to the relative scarcity of unions.

That lesson is worth keeping in mind in contemporary debates. Today, voices on both sides of the aisle laud the benefits of unionism. They speak of unions as vehicles of workplace democracy—a productive way for workers to express their collective discontent. But unions have not always funneled discontent through peaceful channels: when given too much power, they have disrupted the avenues of commerce. And while that kind of disruption may seem remote today, its remoteness is a credit to labor law’s success. As Ruth Bader Ginsburg once wrote, albeit in a different context, discarding labor law because we have labor peace would be like “throwing away your umbrella in a rainstorm because you are not getting wet.”

Note from the Editor: The Federalist Society takes no positions on particular legal and public policy matters. Any expressions of opinion are those of the author. We welcome responses to the views presented here. To join the debate, please email us at [email protected].