How American Democracy Fell So Far Behind

https://portside.org/2023-10-09/how-american-democracy-fell-so-far-behind
Portside Date: October 9, 2023
Author: Steven Levitsky and Daniel Ziblatt
Source: The Atlantic

In the spring of 1814, 25 years after the ratification of America’s Constitution, a group of 112 Norwegian men—civil servants, lawyers, military officials, business leaders, theologians, and even a sailor—gathered in Eidsvoll, a rural village 40 miles north of Oslo. For five weeks, while meeting at the manor home of the businessman Carsten Anker, the men debated and drafted what is today the world’s second-oldest written constitution.

Like America’s Founders, Norway’s independence leaders were in a precarious situation. Norway had been part of Denmark for more than 400 years, but after Denmark’s defeat in the Napoleonic Wars, the victorious powers, led by Great Britain, decided to transfer the territory to Sweden. This triggered a wave of nationalism in Norway. Unwilling to be traded away “like a herd of cattle,” as one observer at the time put it, Norwegians asserted their independence and elected the constitutional assembly that met at Eidsvoll.

Inspired by the ideals of the Enlightenment and the promise of self-government, Norway’s founders viewed the American experience as a path to follow. A few decades earlier, the Americans had done what the Norwegians now aspired to do: become independent from a foreign power. The Norwegian press had spread news of the American experiment, casting George Washington and Benjamin Franklin as heroes. Although the press didn’t always get the story right (it described the American president as a “monarch,” reported that Washington had been “appointed dictator of the United States for four years,” and referred to the vice president as a “viceroy”), many of the men at Eidsvoll were quite familiar with the workings of the American system. Christian Magnus Falsen, a prominent independence advocate who took a leading role in the constitution-writing process, even christened his son George Benjamin, after Washington and Franklin. Falsen was deeply influenced by Madison and Jefferson, too, later declaring that parts of the Norwegian constitution were based “nearly exclusively” on the American example.

Despite its flaws, the U.S. Constitution was a pioneering document. America became the first large nation to rule itself without a monarchy and instead fill its most important political offices via regular elections. Over the next century, the American Constitution served as a model for republican and democratic-minded reformers across the world.

The United States no longer seems like a good model today. Since 2016, America has experienced what political scientists call “democratic backsliding.” The country has seen a surge in political violence; threats against election workers; efforts to make voting harder; and a campaign by the then-president to overturn the results of an election—hallmarks of a democracy in distress. Organizations that track the health of democracies around the world have captured this problem in numerical terms. Freedom House’s Global Freedom Index gives countries a score from 0 to 100 each year, with 100 indicating the most democratic. In 2015, the United States received a score of 90, roughly in line with countries such as Canada, France, Germany, and Japan. But since then, America’s score has declined steadily, reaching 83 in 2021. Not only was that score lower than that of every established democracy in Western Europe; it was lower than the scores of new or historically troubled democracies such as Argentina, the Czech Republic, Lithuania, and Taiwan.

The causes of America’s crisis are not simply a strongman and his cultlike following. They are more endemic than that. Over the past two centuries, America has undergone massive economic and demographic change—industrializing and becoming much larger, more urban, and more diverse. Yet our political institutions have largely remained frozen in place. Today, American democracy is living with the destabilizing consequences of this disjuncture.

Indeed, the problem lies in something many of us venerate: the U.S. Constitution. America’s founding document, designed in a pre-democratic era in part to protect against “tyranny of the majority,” has generated the opposite problem: Electoral majorities often cannot win power, and when they win, they often cannot govern. Alone among presidential democracies, the United States allows a candidate who loses the popular vote to become president. The U.S. Senate, which gives each state equal representation regardless of population and thus dramatically overrepresents low-population states, is also frequently controlled by a party that has lost the national popular vote. And because of the Senate’s filibuster rules, majorities are routinely blocked from passing normal legislation. Finally, because the Supreme Court’s composition is determined by the president and Senate, which have often not represented electoral majorities in the 21st century, the Court has grown more and more divorced from majority public opinion. Not only does the Constitution deliver outsize advantages to partisan minorities; it has also begun to endanger American democracy. With the Republican Party’s transformation into an extremist and antidemocratic force under Donald Trump, the Constitution now protects and empowers an authoritarian minority.

America was once the standard-bearer for democratic constitutions. Today, however, it is more vulnerable to minority rule than any other established democracy. Far from being a pioneer, America has become a democratic laggard. How did this come to pass?

Consider what happened in Norway. Though inspired by the American experience, Norway’s founding 1814 constitution was hardly revolutionary. The country remained a hereditary monarchy, and kings retained the right to appoint cabinets and veto legislation. Members of Parliament were indirectly elected by regional electoral colleges, and voting was limited to men who met certain property requirements. Urban elites also gained a powerful built-in advantage in Parliament. Norway was overwhelmingly rural in 1814: About 90 percent of the electorate lived in the countryside. Because many peasants owned land and could therefore vote, wealthy urbanites feared being overwhelmed by the peasant majority. So the constitution established a fixed 2-to-1 ratio of rural to urban seats in Parliament—a ratio that dramatically overrepresented cities, because rural residents actually outnumbered urban residents by 10 to 1 (cities received a third of the seats with only about a tenth of the population). This was the so-called Peasant Clause. Majority rule was further diluted by bicameralism: Norway adopted an upper legislative chamber elected not by the people but by the lower house of Parliament.

Like America’s 1789 Constitution, then, Norway’s 1814 constitution included a range of undemocratic features. In fact, early 19th-century Norway was considerably less democratic than the United States was.

Over the next two centuries, however, Norway underwent a series of far-reaching democratic reforms—all under its original constitution. Parliamentary sovereignty was established in the late 19th century, and Norway became a genuine constitutional monarchy. A 1905 constitutional reform eliminated regional electoral colleges and established direct elections for Parliament. Property restrictions on voting were abolished in 1898, and universal (male and female) suffrage came in 1913.

After 1913, Norway was a democracy. However, one major counter-majoritarian institution remained: the Peasant Clause. By the mid-20th century, urbanization had reversed the nature of the malapportionment caused by the Peasant Clause. With half of the population now living in cities, a fixed 2-to-1 rural-to-urban seat ratio had come to overrepresent rural voters. Like the U.S. Senate, then, the Peasant Clause threatened majority rule by inflating the political power of sparsely populated areas—to the benefit of conservative parties. Unlike in the United States, however, the major political parties negotiated a constitutional reform that eliminated the Peasant Clause in 1952. Norway took additional steps toward majority rule when it reduced the voting age to 18 in 1978 and eliminated its upper chamber of Parliament in 2009.

But Norway didn’t stop democratizing. As Norwegian society and global norms changed in the late 20th and early 21st centuries, constitutional and democratic rights were expanded in new ways. Indigenous minorities, for example, gained new protections in a 1988 constitutional amendment. A 1992 constitutional amendment guaranteed Norwegians the right to a healthy environment. In 2012, the constitution was amended once again, this time to abolish Norway’s official religion and guarantee equal rights to “all religious and philosophical communities.” And in 2014, Norway adopted a set of sweeping constitutional human- and social-rights protections, including the right of children to “respect for their human dignity,” the right to education, and the right to subsistence (through work or, for those who could not support themselves, government assistance). In total, Norway’s constitution was amended 316 times from 1814 to 2014.

Two centuries of reform transformed Norway into one of the most democratic countries on Earth. On Freedom House’s Global Freedom Index, most established democracies received a score above 90 in 2022. A handful of countries, including Canada, Denmark, New Zealand, and Uruguay, received a score above 95. Only three countries received a perfect score of 100: Finland, Sweden, and Norway. Freedom House scores countries on 25 separate dimensions of democracy. Norway got a perfect score on all of them.

Norway’s story of transformation is particularly impressive, but its general trajectory is not unusual. Other European political systems started in an equally undemocratic place, with numerous institutions designed to thwart popular majorities. Most of them, like Norway, were ruled by monarchies. With few exceptions, only men with property could vote. Voting was usually indirect: Citizens voted not for candidates but for local “notables”—civil servants, priests, pastors, landowners, or factory owners—who in turn selected members of parliament. And in Latin America, where founding leaders took the U.S. Constitution as a model after gaining independence in the early 19th century, all presidents were indirectly elected, via electoral colleges or legislatures, prior to 1840.

In addition, early electoral systems were skewed to favor wealthy landowners. Cities—home to Europe’s growing working classes—were often massively underrepresented in parliament compared with rural districts. In Britain’s notorious “rotten boroughs,” a few dozen voters sometimes had their very own representative.

Most countries also had extensive legislative checks on popular majorities, including undemocratic bodies with the power to veto legislation. In Britain, the House of Lords, an unelected body composed of hereditary peers and appointees, had the authority to block most legislation. Canada likewise created an appointed Senate at Confederation in 1867. Most 19th-century European political systems possessed similar upper chambers, composed of hereditary members and appointees from the crown and the Church.

Parliaments everywhere thus offered excessive protection to minority interests. An extreme example was Poland’s 18th-century Parliament, in which each deputy in the 200-member body possessed individual veto power over any bill. The political philosopher Jean-Jacques Rousseau regarded Poland’s liberum veto (Latin for “I freely forbid”) as, in the words of one legal analyst, a “tyranny of the minority of one.” The system’s defenders characterized it as a “privilege of our liberty.” But it brought political life to a grinding halt. From 1720 to 1764, more than half of Poland’s parliamentary sessions were shut down by individual vetoes before any decisions were made. Unable to conduct government business or raise public funds for defense, Poland fell prey to military interventions by neighboring Russia, Prussia, and Austria, whose armies dismembered its territory, literally erasing Poland from the map for more than a century. (The dysfunctionality of the liberum veto was not lost on America’s Founders, including Alexander Hamilton, who cited Poland as an example of the “poison” of “giv[ing] a minority a negative upon the majority.”)

Although other countries steered clear of the liberum veto, states across Europe lacked rules to halt parliamentary debate, allowing small minorities to routinely scuttle legislative majorities. This filibuster-like behavior grew so widespread in Europe that the German legal theorist Georg Jellinek warned in 1904 that “parliamentary obstruction is no longer a mere intermezzo in the history of this or that parliament. It has become an international phenomenon which, in threatening manner, calls in question the whole future of parliamentary government.”

Across the West, then, early political systems placed elections and parliaments beyond the reach of popular majorities, ensuring not just minority rights but outright minority rule. In that world of monarchies and aristocracies, America’s founding Constitution, even with its counter-majoritarian features, stood out as comparatively democratic.

Over the course of the 20th century, however, most of the countries that are now considered established democracies dismantled their most egregiously counter-majoritarian institutions and took steps to empower majorities. They did away with suffrage restrictions. Universal male suffrage took durable hold in France under the Third Republic in the 1870s. New Zealand, Australia, and Finland were pioneers of female enfranchisement in the late 19th and early 20th centuries. By 1920, virtually all adult men and women could vote in most of Western Europe, Australia, and New Zealand.

Indirect elections also disappeared. By the late 19th century, France and the Netherlands had eliminated the powerful local councils that had previously selected members of parliament; Norway, Prussia, and Sweden did the same in the early 20th century. France experimented with an electoral college for a single presidential election in the late 1950s, but then dropped it. Electoral colleges gradually disappeared across Latin America. Colombia eliminated its electoral college in 1910; Chile did so in 1925; Paraguay in 1943. Brazil adopted an electoral college in 1964 under military rule but replaced it with direct presidential elections in 1988. Argentina, the last country in Latin America with indirect presidential elections, dropped its electoral college in 1994.

Most European democracies also reformed their electoral systems—the rules that govern how votes are translated into representation. Countries across continental Europe and Scandinavia abandoned first-past-the-post election systems when they democratized at the turn of the 20th century. Beginning in Belgium in 1899, Finland in 1906, and Sweden in 1907, and then diffusing across Europe, coalitions of parties from across the spectrum pushed successfully for proportional representation with multimember districts (meaning multiple members of parliament are elected from a single district) to bring parties’ share of the seats in parliament more closely in line with their share of the popular vote. Under these new rules, parties that won, say, 40 percent of the vote could expect to win about 40 percent of the seats, which, as the political scientist Arend Lijphart has shown, helps ensure that electoral majorities translate into governing majorities. By World War II, nearly all continental European democracies used some variant of proportional representation, and today 80 percent of democracies with populations above 1 million do so.

Undemocratic upper chambers were tamed or eliminated, beginning in the early 20th century with Britain’s House of Lords. Britain experienced a major political upheaval in 1906 when the Liberal Party won a landslide election, displacing the Conservatives (or Tories), who had governed for more than a decade. The new Liberal-led government launched ambitious new social policies, which were to be paid for with progressive taxes on inherited and landed wealth. Outnumbered by more than two to one in Parliament, the Conservatives panicked. The House of Lords, which was dominated by conservative-leaning hereditary peers, came to the Tories’ rescue. Inserting itself directly into politics, the unelected upper chamber vetoed the Liberal government’s all-important tax bill of 1909.

By convention, the House of Lords could veto some legislation, but not tax bills (fights over taxation had sparked the English Civil War in the 1640s). The House of Lords nonetheless voted down the ambitious budget bill, breaking all precedent.

The Lords justified this unusual move by claiming that their chamber was a “watchdog of the constitution.” The Liberal chancellor of the Exchequer, David Lloyd George, the main author of the budget bill, dismissed this, calling the House of Lords a plutocratic body—“not a watchdog” but rather the “poodle” of the Conservative Party leader. In a speech to a roaring crowd in London’s East End, the sharp-tongued Lloyd George ridiculed the aristocrats who inherited their seats in the House of Lords as “500 ordinary men, accidentally chosen from among the ranks of the unemployed,” and asked why they should be able to “override the deliberate judgment of millions.”

Facing a constitutional crisis, the Liberals drew up the Parliament Act, which would strip the House of Lords of its ability to veto any legislation at all. If the House of Lords lost its veto, its Conservative members warned, political apocalypse would follow. It wasn’t just taxes they feared. They worried about other items on the Liberal-led majority’s agenda, including plans to grant Catholic Ireland greater autonomy, which Conservatives viewed as a fundamental affront to their traditional (Protestant) vision of British national identity.

Ultimately, the bill passed not only the House of Commons but also the House of Lords. It took some hardball. The Lords were persuaded after the Liberal government, with the King’s support, threatened to swamp the House of Lords by appointing hundreds of new Liberal peers to the body if it did not relent. With the bill’s passage, the House of Lords lost the ability to block laws passed by the elected House of Commons (although it could delay them). One of Britain’s most powerful counter-majoritarian institutions had been substantially weakened. And rather than trigger a crisis, the reform paved the way for the construction of a fuller, more inclusive democracy over the course of the 20th century.

Several other emerging democracies abolished their aristocratic upper chambers outright after World War II. New Zealand eliminated its House of Lords–like Legislative Council in 1950. Denmark abolished its 19th-century upper chamber in 1953 via referendum. Sweden followed suit in 1970. By the early 21st century, two-thirds of the world’s parliaments were unicameral. The result was not—as defenders of upper chambers frequently warned—political chaos and dysfunction. New Zealand, Denmark, and Sweden went on to become three of the most stable and democratic countries in the world.

Another way to democratize historically undemocratic upper chambers is to make them more representative. This was the path taken by Germany and Austria. In Germany, following World War II, West Germans wrote a new democratic constitution under the watchful eye of American occupying forces. One of the main tasks facing Germany’s constitutional designers was to revamp the country’s 19th-century second chamber, which historically was composed mostly of appointed civil servants. They considered several options. Despite the outsize role played by American occupying forces, they rejected the U.S. Senate model of equal representation for federal states. Instead, representation in the Bundesrat would be based roughly on states’ populations. So Germany’s second chamber remained in place but was made more representative. Today, the smallest German states each send three representatives to the Bundesrat, while more populous states send four, five, or six, in rough proportion to their populations. With this structure, Germany’s postwar framers combined principles of federalism and democracy.

Most 20th-century democracies also took steps to limit minority obstruction within legislatures, establishing a procedure—known as “cloture”—to allow simple majorities to end parliamentary debate. The term cloture originated during the early days of the French Third Republic. In the 1870s, the provisional government of Adolphe Thiers faced daunting challenges. France had just lost a war to Prussia, and the new Republican government had to contend with the revolutionary Paris Commune on the left and forces seeking to restore the monarchy on the right. The government needed to show it could legislate effectively. However, the National Assembly was renowned for its marathon debates and inaction on pressing issues. Pushed by Thiers, the assembly created a cloture motion through which a parliamentary majority could vote to rein in an otherwise endless debate.

Britain carried out similar reforms. In 1881, Liberal Prime Minister William Gladstone pushed through a “cloture rule” that allowed a majority of members of Parliament to end debate so that Parliament could move to a vote. The Australian Parliament adopted a comparable cloture rule in 1905. In Canada, opposition minorities in Parliament had filibustered several important bills, including the Naval Aid Bill introduced by the Conservative Prime Minister Robert Borden in 1912. The bill aimed to respond to the rise of German sea power by bolstering Canada’s navy but was filibustered by the opposition Liberals for five months. The debate took a physical toll on the prime minister, who developed such severe boils that he was forced to take the floor with his “neck swathed in bandages.” The ordeal led the government to push through a cloture rule—allowing a simple majority to end debate—in April 1913.

The trend of eliminating filibusters and other supermajority rules has continued in recent decades. For much of the 20th century, Finland’s Parliament had a delaying rule under which a one-third minority could vote to defer legislation until after the next election. The rule was abolished in 1992. Denmark still has a rule under which a one-third parliamentary minority may call a public referendum on nonfinancial legislation; the legislation is blocked only if a majority of voters, representing at least 30 percent of the entire electorate (a high bar, given typical turnout), rejects it. However, this rule has not been used since 1963. Iceland’s Parliament (known as the Althingi) long had an old-fashioned talking filibuster. The secretary-general of the Althingi, Helgi Bernódusson, described it as “deeply rooted in the Icelandic political culture.” Efforts to curb the filibuster were met with considerable resistance because they were viewed as threatening the “freedom of speech” of members of Parliament. In 2016, Bernódusson declared, “There are no indications at present that it will be possible to curb filibustering … The Althingi is stuck in the filibuster rut.” Three years later, however, after a record-breaking 150-hour filibuster on a European Union energy law, the Parliament curbed the filibuster through new limits on speeches and rebuttals.

Amid this broad pattern of reform is one area in which many democracies moved in a more counter-majoritarian direction in the 20th century: judicial review. Prior to World War II, judicial review existed in only a few countries outside the United States. But since 1945, most democracies have adopted some form of it. In some countries, including Austria, Germany, Italy, Portugal, and Spain, new constitutional courts were created as “guardians” of the constitution. In other countries, including Brazil, Denmark, India, Israel, and Japan, existing supreme courts were given this guardian role. One recent study of 31 established democracies found that 26 of them now possess some type of judicial review.

Judicial review can be a source of what we call “intergenerational counter-majoritarianism”—when judges appointed decades ago routinely strike down legislation backed by present-day majorities. Democracies across the world have attenuated this problem by replacing life tenure with either term limits or a mandatory retirement age for high-court justices. For example, Canada adopted a mandatory retirement age of 75 for Supreme Court justices in 1927. The law was a response to two aging justices who refused to retire, including one who became inactive in court deliberations and another whom Prime Minister William Lyon Mackenzie King described in his diary as “senile.”

Similarly, Australia established a retirement age of 70 for High Court justices in 1977, after the 46-year tenure of Justice Edward McTiernan came to an inglorious end. McTiernan had been appointed to the court in 1930, and by the 1970s, the octogenarian’s voice was often “difficult for counsel to understand.” In 1976, McTiernan broke his hip swatting a cricket with a rolled-up newspaper at the Windsor Hotel in Melbourne. In an apparent effort to nudge him into retirement, the chief justice refused to build a wheelchair ramp in the High Court building, citing costs. McTiernan retired, and when Parliament took up the issue of establishing a retirement age, there was little opposition. Members of Parliament argued that a retirement age would help “contemporize the courts” by bringing in judges who were “closer to the people” and held “current day sets of values.”

Every democracy that has introduced judicial review since 1945 has also introduced either a retirement age or term limits for high-court judges, thereby limiting the problem of long-tenured judges binding future generations.

In sum, the 20th century ushered in the modern democratic era—an age in which many of the institutional fetters on popular majorities that were designed by pre-democratic monarchies and aristocracies were dismantled. Democracies all over the world abolished or weakened their most egregiously counter-majoritarian institutions. Conservative defenders of these institutions anxiously warned of impending instability, chaos, or tyranny. But that has rarely ensued since World War II. Indeed, countries such as Canada, Denmark, Finland, France, Germany, New Zealand, Norway, Sweden, and the U.K. were both more stable and more democratic at the close of the 20th century than they were at the beginning. Eliminating counter-majoritarianism helped give rise to modern democracy.

America also took important steps toward majority rule in the 20th century. The Nineteenth Amendment (ratified in 1920) extended voting rights to women, and the 1924 Snyder Act extended citizenship and voting rights to Native Americans—although it was not until the 1965 Voting Rights Act that the United States met minimal standards for universal suffrage.

America also (partially) democratized its upper chamber. The U.S. Senate, which has been provocatively described as “an American House of Lords,” was indirectly elected prior to 1913. The Constitution endowed state legislatures, not voters, with the authority to select their states’ U.S. senators. Thus, the 1913 ratification of the Seventeenth Amendment, which mandated the direct popular election of senators, was also an important democratizing step.

Legislative elections became much fairer in the 1960s. Prior to this, rural election districts across the country contained far fewer people than urban and suburban ones. For example, Alabama’s Lowndes County, with slightly more than 15,000 people, had the same number of state senators as Jefferson County, which had more than 600,000 residents. The result was massive rural overrepresentation in legislatures. In 1960, rural counties contained 23 percent of the U.S. population but controlled 52 percent of the seats in state legislatures. In both state legislatures and Congress, rural minorities frequently governed urban majorities. In 1956, when the Virginia state legislature voted to close public schools rather than integrate them in the wake of the 1954 Brown v. Board of Education ruling, the 21 state senators who voted for closure represented fewer people than the 17 senators who voted for integration.

From 1962 to 1964, however, a series of Supreme Court rulings ensured that electoral majorities were represented in Congress and state legislatures. Establishing the principle of “one person, one vote,” the court rulings required all U.S. legislative districts to be roughly equal in population. Almost overnight, artificial rural majorities were wiped out in 17 states. The equalization of voting power was a major step toward ensuring a semblance of majority rule in the House of Representatives and state legislatures.

A final spurt of constitutional reforms came in the 1960s and early ’70s. The Twenty-Third Amendment (ratified in 1961) gave Washington, D.C., residents the right to vote in presidential elections; the Twenty-Fourth Amendment (1964) finally prohibited poll taxes; and the Twenty-Sixth Amendment (1971) lowered the voting age from 21 to 18.

But America’s 20th-century reforms did not go as far as in other democracies. For example, whereas every other presidential democracy in the world did away with indirect elections during the 20th century, in America the Electoral College remains intact.

America also retained its first-past-the-post electoral system, even though it creates situations of minority rule, especially in state legislatures. The United States, Canada, and the U.K. are the only rich Western democracies not to have adopted more proportional election rules in the 20th century.

The country’s heavily malapportioned Senate also remains intact. The principle of “one person, one vote” was never applied to the U.S. Senate, so low-population states like Wyoming continue to elect as many senators as populous states like California. As a result, states representing a mere 20 percent of American voters can elect a Senate majority. America’s state-level “rotten boroughs” persist.

America also maintained a minority veto within the Senate. As in the parliaments of France, Britain, and Canada, the absence of any cloture rule led to a marked increase in obstructionist tactics beginning in the late 19th century. And as in Canada, the filibuster problem took on added urgency in the face of German naval threats in the run-up to World War I. But Canada, like France and Britain, put in place a majoritarian 50 percent cloture rule, while the U.S. Senate adopted a nearly insurmountable super-majoritarian 67-vote cloture rule. The threshold was lowered to three-fifths in 1975, but it remains highly counter-majoritarian. America thus entered the 21st century with a “60-vote Senate.”

Finally, unlike every other established democracy, America did not introduce term limits or mandatory retirement ages for Supreme Court justices. Today, on the Supreme Court, the justices effectively serve for life. It’s an entirely different story at the state level. Of the 50 U.S. states, 46 imposed term limits on state-supreme-court justices during the 19th or 20th century. Three others adopted mandatory retirement ages. Only Rhode Island maintains lifetime tenure for its supreme-court justices. But among national democracies, America, like Rhode Island, stands alone.

The United States, once a democratic innovator, now lags behind. The persistence of our pre-democratic institutions as other democracies have dismantled theirs has made America a uniquely counter-majoritarian democracy at the dawn of the 21st century. Among established democracies, America is now the only presidential democracy that still elects its president indirectly, through an electoral college; one of the few whose upper chamber so severely overrepresents sparsely populated regions; one of the few in which a legislative minority can routinely veto ordinary legislation; and the only one that grants its supreme-court justices lifetime tenure.

One reason America has become such an outlier is that, among the world’s democracies, the U.S. Constitution is the hardest to change. In Norway, a constitutional amendment requires a supermajority of two-thirds support in two successive elected Parliaments, but the country has no equivalent to America’s extraordinarily difficult state-level ratification process. According to the constitutional scholars Tom Ginsburg and James Melton, the relative flexibility of the constitution allows Norwegians to “update the formal text in ways that keep it modern.” Americans are not so fortunate.

Of the 31 democracies examined by the political theorist Donald Lutz in his comparative study of constitutional-amendment processes, the United States stands at the top of his Index of Difficulty, exceeding the next-highest-scoring countries (Australia and Switzerland) by a wide margin. Not only do constitutional amendments require the approval of two-thirds majorities in both the House and the Senate; they must be ratified by three-quarters of the states. For this reason, the United States has one of the lowest rates of constitutional change in the world. According to the U.S. Senate, 11,848 attempts have been made to amend the U.S. Constitution. But only 27 of them have been successful. America’s Constitution has been amended only 12 times since Reconstruction, most recently in 1992—more than three decades ago.

This has important consequences. Consider the fate of the Electoral College. No other provision of the U.S. Constitution has been the target of so many reform initiatives. By one count, there have been more than 700 attempts to abolish or reform the Electoral College over the past 225 years. The most serious push during the 20th century came in the 1960s and ’70s, a period that saw three “close call” presidential elections (1960, 1968, and 1976), in which the winner of the popular vote very nearly lost the Electoral College.

In 1966, Senator Birch Bayh of Indiana, the chair of the Senate Judiciary Committee’s Subcommittee on Constitutional Amendments, proposed a constitutional amendment to replace the Electoral College with direct presidential elections. Americans were on board. A 1966 Gallup poll found 63 percent support for abolishing the Electoral College. That year, the U.S. Chamber of Commerce polled its members and found them nine to one in favor of the reform. In 1967, the prestigious American Bar Association added its endorsement, calling the Electoral College “archaic, undemocratic, complex, ambiguous, indirect, and dangerous.”

Bayh’s proposal was given a boost by the 1968 election, in which George Wallace’s strong third-party performance nearly threw the race into the House of Representatives. A shift of just 78,000 votes in Illinois and Missouri would have cost Nixon his Electoral College majority and left the outcome to the House, where Democrats held a majority. The result frightened leaders of both parties, who began to rally behind Bayh’s proposal.

By 1969, the movement to abolish the Electoral College “seemed unstoppable.” Newly elected President Richard Nixon backed the initiative. So did Democratic Senate Majority Leader Mike Mansfield, Republican Minority Leader Everett Dirksen, House Minority Leader Gerald Ford, and key legislators such as Walter Mondale, Howard Baker, and George H. W. Bush. Constitutional reform was backed by business (the Chamber of Commerce) and labor (the AFL-CIO), as well as the American Bar Association and the League of Women Voters.

In September 1969, the House of Representatives passed the proposal to abolish the Electoral College 338–70—far more than the two-thirds necessary to amend the Constitution. As the proposal moved to the Senate, a Gallup poll showed that 81 percent of Americans supported the reform. A New York Times survey of state legislators found that 30 state legislatures were ready to pass the amendment, six others were undecided, and six were slightly opposed (38 states would be needed for ratification). Abolition seemed well within reach.

But the U.S. Senate killed the reform. As so often in the past, opposition came from the South and from sparsely populated states. Senator James Allen of Alabama declared, “The Electoral College is one of the South’s few remaining political safeguards. Let’s keep it.” The longtime segregationist Senator Strom Thurmond promised to filibuster the bill, and Senate Judiciary Committee Chair James Eastland, another segregationist, “slow-walked it through the Judiciary Committee,” delaying it by nearly a year. When a cloture vote was finally held on September 17, 1970, 54 senators voted to end debate—a majority, but well short of the two-thirds needed to end the filibuster. When a second cloture vote was held 12 days later, 53 senators voted for it. The bill died before it ever came up for a vote.

Bayh reintroduced his Electoral College reform bill in 1971, 1973, 1975, and 1977. In 1977, following yet another “close call” election, the proposal got some traction. The new president, Jimmy Carter, backed the initiative, and a Gallup poll found that 75 percent of Americans supported it. But the bill was delayed and then, once again, filibustered in the Senate. When a cloture vote was finally held in 1979, it garnered only 51 votes. Afterward, The New York Times reported that supporters of Electoral College reform “conceded privately that they stood little chance of reviving the issue unless a president was elected with a minority of the popular vote or the nation came disturbingly close to such a result.” As it turned out, reform supporters were wildly overoptimistic. Two presidents have been elected with a minority of the popular vote during the early 21st century, and yet the Electoral College still stands.

Our excessively counter-majoritarian Constitution is not just a historical curiosity. It is a source of minority rule. The Constitution has always overrepresented sparsely populated territories, favoring rural minorities, but because both major parties had urban and rural wings throughout most of American history, this rural bias had only limited partisan consequences. This changed in the 21st century. For the first time, one party (the Republicans) is based primarily in small towns and rural areas while the other party (the Democrats) is based largely in urban areas. That means that our institutions now systematically privilege the Republicans. The Republican Party won the popular vote in only one presidential election from 1992 to 2020—a span of nearly three decades. But thanks to the Electoral College, Republicans occupied the presidency for nearly half of that time.

In the U.S. Senate, Republican senators not once represented a majority of Americans from 2000 to 2022, but they nevertheless controlled the Senate for half of this period. As often as not during the 21st century, then, the party with fewer votes has controlled the Senate.

In 2016, the Democrats won the national popular vote for the presidency and the Senate, but the Republicans nonetheless won control of both institutions. A president who lost the popular vote and senators who represented a minority of Americans then proceeded to fill three Supreme Court seats, giving the Court a manufactured 6–3 conservative majority. This is minority rule.

What makes the situation so dangerous is that this privileged partisan minority has abandoned its commitment to democratic rules of the game. In other words, the Constitution is protecting and empowering an authoritarian partisan minority.

But that Constitution appears nearly impossible to reform.

To escape this predicament, we must begin to think differently about the country’s founding document and about constitutional change itself—more like the Founders thought about it and more like Norwegians think about their own constitution today. In 1787, just after the Philadelphia Convention, George Washington wrote, “The warmest friends and best supporters the Constitution has, do not contend that it is free from imperfections; but found them unavoidable.” He went on to write that the American people

can, as they will have the advantage of experience on their side, decide with as much propriety on the alterations and amendments which are necessary as ourselves. I do not think we are more inspired, have more wisdom, or possess more virtue, than those who will come after us.

Born of compromise and improvisation, the U.S. Constitution is not a sacred text. It is a living embodiment of the nation. Throughout our history, from the passage of the Bill of Rights to the expansion of suffrage to the civil-rights reforms of the 1960s, Americans have worked to make our system more democratic. But that work has stalled over the last half century. It is essential to reawaken the dormant American tradition of democratic constitutional change. Doing so will enable America to realize its unfinished promise of building a democracy for all—and perhaps be a model for the world.

This article was adapted from Steven Levitsky and Daniel Ziblatt’s book, Tyranny of the Minority: Why American Democracy Reached the Breaking Point.



Steven Levitsky is the David Rockefeller Professor of Latin American Studies and Government at Harvard University.

Daniel Ziblatt is the Eaton Professor of the Science of Government at Harvard University.


