How the United States lost the faith of its citizens—and what it can do to win them back
For years, the residents of Oxford, Massachusetts, seethed with anger at the company that controlled the local water supply. The company, locals complained, charged inflated prices and provided terrible service. But unless the town’s residents wanted to get by without running water, they had to pay up, again and again.
The people of Oxford resolved to buy the company out. At a town meeting in the local high-school auditorium, an overwhelming majority of residents voted to raise the millions of dollars that would be required for the purchase. It took years, but in May 2014, the deal was nearly done: One last vote stood between the small town and its long-awaited goal.
The company, however, was not going down without a fight. It mounted a campaign against the buyout. On the day of the crucial vote, the high-school auditorium swelled to capacity. Locals who had toiled on the issue for years noticed many newcomers—residents who hadn’t showed up to previous town meetings about the buyout. When the vote was called, the measure failed—the company, called Aquarion, would remain the town’s water supplier. Supporters of the buyout mounted a last-ditch effort to take a second vote, but before it could be organized, a lobbyist for Aquarion pulled a fire alarm. The building had to be evacuated, and the meeting adjourned. Aquarion retains control of Oxford’s water system to this day.
“It was a violation of the sanctity of our local government by big money,” Jen Caissie, a former chairman of the board of selectmen in Oxford, told me. “Their messiah is their bottom line, not the health of the local community. And I say that as a Republican, someone who is in favor of local business.”
A New England town meeting would seem to be one of the oldest and purest expressions of the American style of government. Yet even in this bastion of deliberation and direct democracy, a nasty suspicion had taken hold: that the levers of power are not controlled by the people.
It’s a suspicion stoked by the fact that, across a range of issues, public policy does not reflect the preferences of the majority of Americans. If it did, the country would look radically different: Marijuana would be legal and campaign contributions more tightly regulated; paid parental leave would be the law of the land and public colleges free; the minimum wage would be higher and gun control much stricter; abortions would be more accessible in the early stages of pregnancy and illegal in the third trimester.
Gilens and Page tested those theories by tracking how well the preferences of various groups predicted the way that Congress and the executive branch would act on 1,779 policy issues over a span of two decades. The results were shocking. Economic elites and narrow interest groups were very influential: They succeeded in getting their favored policies adopted about half of the time, and in stopping legislation to which they were opposed nearly all of the time. Mass-based interest groups, meanwhile, had little effect on public policy. As for the views of ordinary citizens, they had virtually no independent effect at all. “When the preferences of economic elites and the stands of organized interest groups are controlled for, the preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy,” Gilens and Page wrote.
To some degree, of course, the unresponsiveness of America’s political system is by design. The United States was founded as a republic, not a democracy. As Alexander Hamilton and James Madison made clear in the Federalist Papers, the essence of this republic would consist—their emphasis—“IN THE TOTAL EXCLUSION OF THE PEOPLE, IN THEIR COLLECTIVE CAPACITY, from any share” in the government. Instead, popular views would be translated into public policy through the election of representatives “whose wisdom may,” in Madison’s words, “best discern the true interest of their country.” That this radically curtailed the degree to which the people could directly influence the government was no accident.
Only over the course of the 19th century did a set of entrepreneurial thinkers begin to dress an ideologically self-conscious republic up in the unaccustomed robes of a democracy. Throughout America, the old social hierarchies were being upended by rapid industrialization, mass immigration, westward expansion, and civil war. Egalitarian sentiment was rising. The idea that the people should rule came to seem appealing and even natural. The same institutions that had once been designed to exclude the people from government were now commended for facilitating government “of the people, by the people, for the people.”
That basis is now crumbling, and the people have taken notice. In no small part that’s because the long era during which average Americans grew more wealthy has come to a sputtering stop. People who are asked how well they are doing economically frequently compare their own standard of living with that of their parents. Until recently, this comparison was heartening. At the age of 30, more than nine in 10 Americans born in 1940 were earning more than their parents had at the same stage of their lives. But according to eye-popping research by the economist Raj Chetty and his co-authors, many Millennials do not share in this age-old American experience of improving fortunes. Among Americans born in the early 1980s, only half earn more than their parents did at a similar age.
Americans have never loved their politicians or thought of Washington as a repository of moral virtue. But so long as the system worked for them—so long as they were wealthier than their parents had been and could expect that their kids would be better off than they were—people trusted that politicians were ultimately on their side. Not anymore.
As a result, average voters feel more alienated from traditional political institutions than perhaps ever before. When they look at decisions made by politicians, they don’t see their preferences reflected in them. For good reason, they are growing as disenchanted with democracy as the people of Oxford, Massachusetts, did.
The politician who best intuited this discontent—and most loudly promised to remedy it—is Donald Trump. The claim that he would channel the voice of the people to combat a corrupt and unresponsive elite was at the very core of his candidacy. “I am your voice,” Trump promised as he accepted his party’s nomination at the Republican National Convention. “Today, we are not merely transferring power from one administration to another or from one party to another,” he proclaimed in his inaugural address, “but we are transferring power from Washington, D.C., and giving it back to you, the people.”
It would be easy to draw the wrong lesson from this: If the American electorate can be duped by a figure like Trump, it can’t be trusted with whatever power it does retain. To avoid further damage to the rule of law and the rights of the most-vulnerable Americans, traditional elites should appropriate even more power for themselves. But that response plays into the populist narrative: The political class dislikes Trump because he threatens to take its power away. It also refuses to recognize that the people have a point.
At the height of the Mexican-American War, Nicholas Trist traveled to Mexico and negotiated the Treaty of Guadalupe Hidalgo, which ended the hostilities between the two nations and helped delineate America’s southern border. Two decades later, the U.S. government still hadn’t paid him for his services. Too old and weak to travel to Washington to collect the money himself, Trist hired a prominent lawyer by the name of Linus Child to act on his behalf, promising him 25 percent of his recovered earnings.
Congress finally appropriated the money to settle its debt. But now it was Trist who refused to pay up, even after his lawyer sued for his share. Though the contract between Trist and Child hardly seems untoward by today’s standards, the Supreme Court refused to uphold it out of fear that it might provide a legal basis for the activities of lobbyists:
If any of the great corporations of the country were to hire adventurers who make market of themselves in this way, to procure the passage of a general law with a view to the promotion of their private interests, the moral sense of every right-minded man would instinctively denounce the employer and employed as steeped in corruption.
Extreme as this case may appear, it was far from idiosyncratic. In her book Corruption in America, the legal scholar Zephyr Teachout notes that the institutions of the United States were explicitly designed to counter the myriad ways in which people might seek to sway political decisions for their own personal gain. Many forms of lobbying were banned throughout the 19th century. In Georgia, the state constitution at one time read that “lobbying is declared to be a crime.” In California, it was a felony.
All of this began to change in the early 1970s. Determined to fight rising wages and stricter labor and environmental standards, which would bring higher costs, CEOs of companies like General Electric and General Motors banded together to expand their power on Capitol Hill. At first, their activities were mostly defensive: The goal was to stop legislation that might harm their interests. But as the political influence of big corporations grew, and their profits soared, a new class of professional lobbyists managed to convince the nation’s CEOs that, in the words of Lee Drutman, the author of the 2015 book The Business of America Is Lobbying, their activity “was not just about keeping the government far away—it could also be about drawing government close.”
Today, corporations wield immense power in Washington: “For every dollar spent on lobbying by labor unions and public-interest groups,” Drutman shows, “large corporations and their associations now spend $34. Of the 100 organizations that spend the most on lobbying, 95 consistently represent business.”
A model schedule for freshman members of Congress prepared a few years ago by the Democratic Congressional Campaign Committee instructs them to spend about four hours every day cold-calling donors for cash. The party encourages so many phone calls because the phone calls work. Total spending on American elections has grown to unprecedented levels. From 2000 to 2012, reported federal campaign spending doubled. It’s no surprise, then, that a majority of Americans now believe Congress to be corrupt, according to a 2015 Gallup poll. As former Representative Steve Israel memorably put it to HBO’s John Oliver, the hours he had spent raising money had been “a form of torture—and the real victims of this torture have become the American people, because they believe that they don’t have a voice in this system.”
The problem goes even deeper than that. In America’s imagined past, members of Congress had a strong sense of place. Democrats might have risen through the ranks of local trade unions or schoolhouses. Republicans might have been local business or community leaders. Members of both parties lived lives intertwined with those of their constituents. But spend some time reading the biographies of your representatives in Congress, and you’ll notice, as I did, that by the time they reach office, many politicians have already been socialized into a cultural, educational, and financial elite that sets them apart from average Americans. While some representatives do have strong roots in their district, for many others the connection is tenuous at best. Even for members who were born and raised in the part of the country they represent, that place is often no longer their true home. Educated at expensive colleges, likely on the coasts, they spend their 20s and 30s in the nation’s great metropolitan centers.
After stints in law, business, or finance, or on Capitol Hill, they move to the hinterlands out of political ambition. Once they retire from Congress, even if they retain some kind of home in their district, few make it the center of their lives: They seem much more likely than their predecessors to pursue lucrative opportunities in cities such as New York, San Francisco, and, of course, Washington. By just about every metric—from life experience to education to net worth—these politicians are thoroughly disconnected from the rest of the population.
The massive influence that money wields in Washington is hardly a secret. But another, equally important development has largely gone ignored: More and more issues have simply been taken out of democratic contestation.
In many policy areas, the job of legislating has been supplanted by so-called independent agencies such as the Federal Communications Commission, the Securities and Exchange Commission, the Environmental Protection Agency, and the Consumer Financial Protection Bureau. Once they are founded by Congress, these organizations can formulate policy on their own. In fact, they are free from legislative oversight to a remarkable degree, even though they are often charged with settling issues that are not just technically complicated but politically controversial.
The range of crucial issues that these agencies have taken on testifies to their importance. From banning the use of the insecticide DDT to ensuring the quality of drinking water, for example, the EPA has been a key player in fights about environmental policy for almost 50 years; more recently, it has also made itself central to the American response to climate change, regulating pollutants and proposing limits on carbon-dioxide emissions from new power plants.
While independent agencies occasionally generate big headlines, they often wield their real power in more obscure policy areas. They are now responsible for the vast majority of new federal regulations. A 2008 article in the California Law Review noted that, during the previous year, Congress had enacted 138 public laws. In the same year, federal agencies had finalized 2,926 rules. Such rules run the gamut from technical stipulations that affect only a few specialized businesses to substantial reforms that have a direct impact on the lives of millions. In October 2017, for example, the Consumer Financial Protection Bureau passed a rule that would require providers of payday loans to determine whether customers would actually be able to pay them back—potentially saving millions of people from exploitative fees, but also making it more difficult for them to access cash in an emergency.
Most of these treaties and agreements offer real benefits or help us confront urgent challenges. Whatever your view of their merit, however, there is no denying that they curtail the power of Congress in ways that also disempower American voters. Trade treaties, for example, can include obscure provisions about “investor–state dispute settlements,” which give international arbitration courts the right to award huge sums of money to corporations if they are harmed by labor or environmental standards—potentially making it riskier for Congress to pass such measures.
This same tension between popular sovereignty and good governance is also evident in the debates over the power of the nine unelected justices of the Supreme Court. Since the early 1950s, the Supreme Court has ended legal segregation in schools and universities. It has ended and then reintroduced the death penalty. It has legalized abortion. It has limited censorship on television and the radio. It has decriminalized homosexuality and allowed same-sex marriage. It has struck down campaign-finance regulations and gun-control measures. It has determined whether millions of people get health insurance and whether millions of undocumented immigrants need to live in fear of being deported.
Take Citizens United. By overturning legislation that restricted campaign spending by corporations and other private groups, the Supreme Court issued a decision that was unpopular at the time and has remained unpopular since. (In a 2015 poll by Bloomberg, 78 percent of respondents disapproved of the ruling.) It also massively amplified the voice of moneyed interest groups, making it easier for the economic elite to override the preferences of the population for years to come.
Donald Trump is the first president in the history of the United States to have served in no public capacity before entering the White House. He belittles experts, seems to lack the most basic grasp of public policy, and loves to indulge the worst whims of his supporters. In all things, personal and political, Plato’s disdainful description of the “democratic man” fits the 45th president like a glove: Given to “false and braggart words and opinions,” he considers “insolence ‘good breeding,’ license ‘liberty,’ prodigality ‘magnificence,’ and shamelessness ‘manly spirit.’ ”
In the years since, many scholars have built this case: The political scientist Larry Bartels painstakingly demonstrated just how irrational ordinary voters are; the political philosopher Jason Brennan turned the premise that irrational or partisan voters are terrible decision makers into a book titled Against Democracy; and Parag Khanna, an inveterate defender of globalization, argued for a technocracy in which many decisions are made by “committees of accountable experts.” Writing near the end of the 2016 primary season, when Trump’s ascent to the Republican nomination already looked unstoppable, Andrew Sullivan offered the most forceful distillation of this line of antidemocratic laments: “Democracies end when they are too democratic,” the headline of his essay announced. “And right now, America is a breeding ground for tyranny.”
The antidemocratic view gets at something real. What makes our political system uniquely legitimate, at least when it functions well, is that it manages to deliver on two key values at once: liberalism (the rule of law) and democracy (the rule of the people). With liberalism now under concerted attack from the Trump administration, which has declared war on independent institutions such as the FBI and has used the president’s pulpit to bully ethnic and religious minorities, it’s perhaps understandable that many thinkers are willing to give up a modicum of democracy to protect the rule of law and the country’s most vulnerable groups.
The easy alternative is to lean in the other direction, to call for as much direct democracy as possible. The origins of the people’s displacement, the thinking goes, lie in a cynical power grab by financial and political elites. Large corporations and the superrich advocated independent central banks and business-friendly trade treaties to score big windfalls. Politicians, academics, and journalists favor a technocratic mode of governance because they think they know what’s best and don’t want the people to meddle. All of this selfishness is effectively cloaked in a pro-market ideology propagated by think tanks and research outfits that are funded by rich donors. Since the roots of the current situation are straightforwardly sinister, the solutions to it are equally simple: The people need to reclaim their power—and abolish technocratic institutions.
The far right puts more emphasis on nationalism, but otherwise agrees with this basic analysis. In the inaugural issue of the journal American Affairs, the self-styled intellectual home of the Trump movement, its founder Julius Krein decried “the existence of a transpartisan elite,” which sustains a pernicious “managerial consensus.” Steve Bannon, the former White House chief strategist, said his chief political objective was to return power to the people and advocated for the “deconstruction of the administrative state.”
Mair and Crouch, Krein and Bannon are right to recognize that the people have less and less hold over the political system, an insight that can point the way to genuine reforms that would make our political system both more democratic and better functioning. One of the reasons well-intentioned politicians are so easily swayed by lobbyists, for example, is that their staffs lack the skills and experience to draft legislation or to understand highly complex policy issues. This could be addressed by boosting the woefully inadequate funding of Congress: If representatives and senators were able to attract—and retain—more knowledgeable and experienced staffers, they might be less tempted to let K Street lobbyists write their bills for them.
Real change will also require an ambitious reform of campaign finance. Because of Citizens United, this is going to be extremely difficult. But the Supreme Court has had a change of heart in the past. As evidence that the current system threatens American democracy keeps piling up, the Court might finally recognize that stricter limits on campaign spending are desperately needed.
For all that the enemies of technocracy get right, though, their view is ultimately as simplistic as the antidemocratic one. The world we now inhabit is extremely complex. We need to monitor hurricanes and inspect power plants, reduce global carbon emissions and contain the spread of nuclear weapons, regulate banks and enforce consumer-safety standards. All of these tasks require a tremendous amount of expertise and a great degree of coordination. It’s unrealistic to think that ordinary voters or even their representatives in Congress might become experts in what makes for a safe power plant, or that the world could find an effective response to climate change without entering cumbersome international agreements. If we simply abolish technocratic institutions, the future for most Americans will look more rather than less dangerous, and less rather than more affluent.
It is true that to recover its citizens’ loyalty, our democracy needs to curb the power of unelected elites who seek only to pad their influence and line their pockets. But it is also true that to protect its citizens’ lives and promote their prosperity, our democracy needs institutions that are, by their nature, deeply elitist. This, to my mind, is the great dilemma that the United States—and other democracies around the world—will have to resolve if they wish to survive in the coming decades.
We need neither to abolish all technocratic institutions nor merely to preserve the ones that exist. We need to build a new set of political institutions that are both more responsive to the views and interests of ordinary people, and better able to solve the immense problems that our society will face in the decades to come.
Writing about the dawn of democracy in his native Italy, the great novelist Giuseppe Tomasi di Lampedusa has Tancredi, a young aristocrat, recognize that he will have to let go of some of his most cherished habits to rescue what is most valuable in the old order: “If everything is to stay the same,” Tancredi says, “everything has to change.” The United States is now at an inflection point of its own. If we rigidly hold on to the status quo, we will lose what is most valuable in the world we know, and find ourselves cast as bit players in the fading age of liberal democracy. Only by embarking on bold and imaginative reform can we recover a democracy worthy of the name.
This article is adapted from Yascha Mounk’s new book, The People vs. Democracy: Why Our Freedom Is in Danger and How to Save It.