The forgotten majority: how norms inform the practice of democracy

An election canvasser holds a sign encouraging people to register to vote in March 2018.



We have a lot of norms about democracy. They’re not all consistent.

Even before the current allegations against Brett Kavanaugh became public, his nomination to the Supreme Court represented a brewing crisis of American democracy. Really, this potential crisis was evident before Kavanaugh was nominated: the emergence of minority rule in the United States. Michelle Goldberg tackled this brilliantly in her debut column for the New York Times. After Anthony Kennedy announced his retirement, critics noted that this would mean a rightward-shifting Court despite a leftward-shifting public (at least on some issues).


There’s some debate about how exactly we should think about the Senate, where Democrats have continued to hold a minority of seats despite winning more votes in recent elections. And the Electoral College is a perpetual target of popular ire. Nate Silver points out that the Democrats will need to win a substantial margin in the national vote in November in order to win a majority of House seats, and that there’s a chance they could win more votes and still be the minority party in the chamber.


In other words, votes don’t translate neatly into political power, and the geographic differences in the Democratic and Republican voting bases (combined with some redistricting practices) mean that these distortions tend to benefit one side over the other.


I don’t know when this specific problem will hit the breaking point, where the outcomes are so numerically and consistently distorted that we have a real crisis of democracy and legitimacy. But despite protests and the rise of a “resistance,” people accepted the process by which Donald Trump became president — even though that process involved losing the popular vote by 3 million votes in November 2016.


Vox’s Ezra Klein tweeted about the Electoral College last week, noting that it was initially designed as an institution that might prevent a demagogic, underqualified leader and yet had ultimately been the institution that allowed Trump to win the presidency. This, concluded Klein, “shows how far we’ve strayed.”


I hate to pick too much on Twitter language. But rather than thinking about "straying" from the original design of the system, the history of the Electoral College shows us two sets of evolving norms about democracy. It's hard to imagine the Electoral College going rogue and selecting, say, John Kasich, in 2016, and even harder to imagine that going over well.


Indeed, while there have been occasional faithless electors and several elections decided by some form of “corrupt bargain,” the Electoral College has never independently selected a president.


Furthermore, the idea that electors would be chosen by the state-level popular vote took hold very quickly, and half a century after the founding, mass democracy had taken root and looked very different from what the clunky system set up in Article II might have anticipated.


The central role of electoral democracy hasn’t moved in a single direction at every point in American history, but it’s been a pretty strong current. Parties built mass democracy in the 19th century, and the reaction against these parties in the Progressive era sought to put even more power in the hands of ordinary voters.


In contemporary politics, ballot initiatives are part of important policy decisions in some states. Political parties have opened up their processes in response to pressure to eliminate elite checks on the preferences of primary voters.


As my research has demonstrated, elections have gained symbolic importance as political legitimacy declines and the partisan chasm expands. This has only grown more true in the past few years, as commentators (including the president) cite the 2016 election as a reason for one thing or another, or seek out explanations for its surprise outcome.


Special elections and primaries, too, have drawn attention from election analysts looking for clues about the attitudes of the electorate. The emphasis on elections isn’t merely symbolic; the permanent campaign is a real thing.


But the other aspect that my work shows is that the turn in popular attention toward elections hasn’t necessarily been accompanied by a lot of concern about real popular majorities.


Trump has perhaps been the most extreme in his touting of an election victory that turned on narrow victories in a few states. But he’s not totally unique in this regard. George W. Bush talked about “the reason I was elected” after the 2000 election, too. Bill Clinton claimed an election mandate in 1993 despite falling far short of a popular vote majority. Reagan’s 1980 “mandate” was just barely over half of the popular vote. Richard Nixon talked about a “silent majority,” but it was so silent in 1968 that it was only about 43 percent.


In sum, the norms about how democracy works have evolved unevenly. Elections are important, and, increasingly, popular participation is a requirement for legitimacy. But we’ve proven far less demanding when it comes to the actual results of these votes, allowing important narratives of the popular will to develop around plurality victories, narrow wins, and actual popular vote losses.


The challenges this poses are about more than just incongruity in how we view elections. It's also possible that the prominent use of election-based claims by elites — "I'm doing what I was elected to do" — has obscured the fact that political majorities are not self-executing.


You can have broad public support for a candidate, a policy, or a point of view, but it doesn’t matter if that support isn’t mobilized. Even without the high levels of distortion between seats and votes that a winner-take-all system can produce, elections don’t automatically translate public preferences into policy change. Pretending that they do is a form of populist demagoguery.


A third factor is determining who really counts as part of the electorate. Perhaps more than the other two changing norms about democracy, this one has both moved forward and regressed. This is especially true in the realm of voting rights for African Americans, which have been extended in law and restricted in practice over and over again.


While Americans value voting rights overall, some discrepancies are evident. This Pew survey shows that 79 percent of black respondents favor the option “everything possible should be done to make it easier to vote,” over “citizens should prove they want to vote by registering ahead of time.”


A majority of white respondents chose that option as well, though only 54 percent. In 2016, a majority of Americans supported voter ID laws, though there were partisan differences. And after the 2016 election, the surge of stories about the "white working class" — despite the diversity of this segment of the electorate overall — made it clear how far we have to go when it comes to weighing all citizens equally in our thinking about elections and the popular will.


It’s hard to know what form a potential crisis of democracy will take. Maybe there won’t be one. Maybe it’s already here. But as we think about how norms inform the practice of democracy, the problem isn’t just that people across party lines have very different views about those norms. It’s that even widely held norms aren’t always consistent or compatible.


Our system seems to be able to withstand a lot of internal contradiction, between belief and practice as well as different sets of beliefs. At the same time, our history suggests that sometimes a situation can seem stable and legitimate until it doesn’t.