This is Chapter 9 in a blog series. If you’re new to the series, visit the series home page for the full table of contents.

Notes key: Type 1 - fun notes. Fun facts, extra thoughts, or further explanation. Type 2 - less fun notes. Sources and citations.

Part 4: Politics, in 3D

“Knowledge of human nature is the beginning and end of political education.” – Henry Adams

Chapter 9: Political Disney World

I grew up in Newton, Massachusetts in the 80s and 90s. Newton back then was a pretty diverse place—a 90,000-person suburb with a wide range of ethnic, religious, and socioeconomic backgrounds. To live in Newton, there were only two requirements: you had to be a Red Sox fan and you had to be a Democrat. I was both, so things were chill.

When I was six, my second-grade classroom voted on the 1988 presidential election by circling either “Michael Dukakis” or “George Bush” on a little sheet of paper, folding it, and placing it into a shoebox on the teacher’s desk. It was the first time I had been sentient for a big political event. Later that day, the teacher revealed the results:

Dukakis 20, Bush 1

Duh. Dukakis was the nice good guy candidate and Bush was the bad guy candidate. I still don’t know who the one sick fuck was who voted for the bad guy, but other than that, the results made sense. Pretty boring.

Then the actual election happened and—somehow—Bush won.

I was floored. What kind of medieval shit did my country just pull? How could so many people have gotten it so obviously wrong?

I assumed when I was older and understood the world better, it would make more sense.

But I got older, and the storyline stayed the same. There was the Obviously Good Party, who cared about poor people and black people and flowers and smiles—and the Obviously Bad Party, who were all some version of these two men, teaching their sons about offshore bank accounts.

And every election, the vote would split very near 50/50. I figured there really just were a lot of bad people in my country. Shame.

Then I went to college. It was 2000. Bush-Gore year. While everyone I grew up with was obviously rooting as hard as possible for Gore to win, it began to dawn on me that I had made a very strange group of new friends in college. Some of them were rooting for Gore, but they hated certain things about his beliefs. Others disliked both candidates. And some of them were fervently rooting for Bush, even though they had previously seemed like reasonable people.

I knew exactly where I stood, of course, and made my opinion clear. When I explained that I was unquestionably voting for Gore, instead of giving me a high five, my friends asked me why. I had all kinds of explanations, but when they’d push me to talk in specifics, I’d run into a problem.

I didn’t really know the specifics.

I knew Gore was the better choice, just like I knew the Democratic Party was the better party—but when pressed about my underlying reasons for liking any specific policy of Gore’s, I’d end up in an uncomfortable place.

Gore will be much better for the poor.

Why?

Because he won’t cut taxes for the rich as much and there will be more money for social programs.

Which social programs are you talking about? What about them do you think has worked well? Why are you so sure increased government spending on those programs is the best way to help the poor? And why are you so confident that tax cuts for the rich don’t end up positively affecting poor people?

Um well Gore will be better for the environment. 

How so?

He talks about it more and seems to care about it more.

Right but what kinds of policies do you hope he’ll put in place that Bush won’t? And do you think government regulations or incentives will accomplish more than a market solution like a carbon tax?

Well fucking shit. When continually pressed, my underlying reasoning for my positions would always seem to boil down to some combination of, “Because that’s what seems intuitive to me based on what everyone I know has always said” and “Because the Democrats are the good guys”.

Being challenged by people who didn’t agree with me made me realize I didn’t actually know anything—I just strongly believed a bunch of things.

I didn’t know anything because I hadn’t ever needed to know anything to feel like I had all the answers, and I hadn’t ever been interested enough in the workings of government to put in the serious effort to truly understand it. All I knew was how to articulate the beliefs I assumed were right, in a pretty surface way.

I had always thought of myself as a well-educated thinker, an independent thinker, and a thinker whose opinions were based on evidence and facts—but freshman year, I was smacked over the head with the truth about myself. When it came to politics, at least, I wasn’t really a thinker at all.

___________

If I had to describe politics in modern societies, I’d say it’s—how should I put this—it’s a fucking nightmare. It’s just awful, for basically everyone. It makes us angry. It makes us anxious. It makes us hateful. It makes us our worst selves.

But why?

Politics is just the domain of how people live and work and make decisions together, which on its face seems like a fascinating puzzle—a joint project each society works on together, for all of their benefit. Sure, it’s contentious and involves competition and disagreement, but there are a lot of worlds like that that aren’t a fucking nightmare and don’t consistently bring out our worst selves: science, sports, tech, entrepreneurship, and the arts, to name a few. What is it about politics that makes it so much more miserable than all of those other vibrant centers of human development?

Let’s pull out our tools and discuss.

Politics, in 2D

I don’t know about other countries, but the entire U.S. talks about politics as if it’s one-dimensional:

In this chapter, let’s try looking at politics in 2D instead. We spent Chapter 7 talking about the Thinking Ladder. What would a Political Ladder look like?

The Probably Time For a Refresher Blue Box

We started this series by defining what I see as two fundamental elements of the human psyche.

I call them “minds,” but really, they just represent two states a person (or group of people) can be in. When the Primitive Mind is in control in our minds, we’re often not being our best selves, not making very good choices, and not especially self-aware about what we’re doing or why. When the Higher Mind is in control, we’re being more of a grown-up. It’s not binary though—it’s more of a tug-of-war between the two states. The tug-of-war ebbs and flows in each of us and often, we’re somewhere in the middle.

The Psych Spectrum is our way of visualizing the state of this tug-of-war. When the Higher Mind has a strong presence in our heads, we’re higher up on the spectrum. When the Higher Mind’s voice gets lost in the fog of a riled up Primitive Mind, we sink lower down on the spectrum.

I find that when I’m thinking about any What of life—what we do, what we say, what we think—things make a lot more sense once I bring the Psych Spectrum into my thought process. The ladder is our way of doing this visually. If we simplify any What of life so we can represent its possibilities on a one-dimensional, horizontal spectrum, we can then slap the Psych Spectrum onto it as a vertical y-axis. The resulting square forces us to add another dimension to our thinking and reconsider the What of life alongside the question, “but how is the Psych Spectrum affecting what’s happening here?”

I call the square a ladder because thinking of it in terms of rungs focuses the discussion in on the Psych Spectrum, which is the skill I want us to gain in this series.

To define the rungs of any ladder, we need to start by asking ourselves how the Higher and Primitive Minds “do” that part of life. This defines the y-axis’s two extremes. Each person, in thinking about their own psyche, might define it a little differently. When it comes to our intellectual lives, I see the Higher Mind as motivated to seek truth (because that’s the rational thing to want) and the Primitive Mind as motivated to confirm what it already believes (because that was the best way for a human to survive 50,000 years ago). My specific definition of each rung of the resulting Thinking Ladder is derived from those two definitions (I call the Thinking Ladder’s y-axis the “How You Think” axis for clarity—but it’s really just “the Psych Spectrum, as it applies to thinking”):

So what would a Political Ladder look like?

It’s a little more complicated than the Thinking Ladder. A huge element of politics is “political thinking,” and for that element, we could just use the Thinking Ladder. But politics also involves action. To bring this element in, we should also ask ourselves: What would the Higher Mind and Primitive Mind’s political goals be?

Everyone can take their own crack at this, but here's mine: The political goal of the grown-up, rational, universal-thinking Higher Mind is to build a more perfect nation. And the political goal of the ancient, survival-obsessed, Power-Games-playing Primitive Mind is political triumph against the bad guys, whoever they may be. A discussion of politics should incorporate both political thinking and political activism:

Using two political ladders in this post would be terribly cumbersome, and also pretty redundant. I’m sure there are instances when someone is simultaneously at different y-axis positions when it comes to their political thinking and activism… but you know people—most of them will be in a similar place on both. So to simplify, we’ll combine these into a single Political Ladder.

With our view of politics now in 2D,1 we can return to our question: Why is politics such a nightmare?

The answer: The Political Ladder is bottom-heavy.

Bottom-Heavy Topics

Religion, like most things human, exists all up and down the Psych Spectrum. At the top, you’ll find people who think about religion as a set of cultural traditions, as a basis for community, as a moral framework, even as an enticing set of possibilities for the unknown. Every single major religious text has high-minded ideas in it, and every single major religion includes millions of high-minded members—those whose religious adherence isn’t mutually exclusive with, but right in line with, top-rung intellectual and moral thought. Religion, when done the Higher Mind’s way, is a lovely thing.

And in each case, as you work your way down the Psych Spectrum, high-minded conceptions of religious culture, community, and philosophy morph into complete and utter zealotry, tribalism, delusion, and depravity, as they’re transferred from the Higher Mind’s domain into the clutches of the Primitive Mind.

What makes religion a major cause of some of the largest, most intense Echo Chambers isn’t that religious thinking spans the Psych Spectrum—most topics do—it’s that the distribution is bottom-heavy. For every deeply religious person thinking about religion from the high rungs, there are even more people down below. Some reasons why:

  • Religion involves beliefs about death, sex, morality, and almost every other topic the Primitive Mind cares about. Beliefs about eternal life, in particular, match up perfectly with the core end goal of animal genes.
  • Religion is faith-based, and at least with its conceptions of what happens after death, inherently untestable—i.e. unfalsifiable.
  • Religion is a topic that identities like to attach themselves to. People don’t follow Christianity or believe in Christianity or live by the philosophies of Christianity—people are Christians.
  • Religion lends itself perfectly to a tribal, Us/Them worldview. Not only are you an X, but other people are a Y, and if Y religion is true, it would mean your religion is not true.
  • Most religions are based on books written long ago, by people whose Higher Minds had much less access to knowledge and advanced moral wisdom than we do today.

So it makes sense that religion would rile up our Primitive Minds and damn religion to eternal Psych Spectrum bottom-heaviness.

And if I had grown up in a religious Echo Chamber—and if I were surrounded by religious dogmatists in my life today—and if my country were currently being torn apart by religion—then I might have decided to write a big post series about religion. Instead, I wrote one about politics.

Like religion, politics is a pro at igniting our primitive fires.

The Primitive Mind mistakes politics, like it does religion, for a life-and-death situation. This makes sense, because in the ancient world where the Primitive Mind still thinks it lives, politics was a life-or-death game. For almost everyone who lived before the Enlightenment, and still for many people in today’s world, being on the losing end of the game of politics put you in grave danger at the hands of your enemies. And being on the winning side meant having the power to vanquish those enemies. If politics went wrong, nothing else mattered—you were fucked.

It’s not that today’s politics no longer deals with critical life factors like freedom, safety, fairness, and resources—it’s that today, in a country like the U.S., the stakes in each of those games are far lower than they were in ancient times. Modern politics is about whether taxes should be higher or lower—not about which people should have food during a period of low resources and which should starve to death. It’s about where the line should be drawn when certain rights butt up against other rights—not about which people will be slaves and which will be masters. Politics today is an argument about whether the criminal justice system is applied consistently—not about which citizens the written law itself will and won’t apply to. It’s about the way police do their job and police accountability—not about which citizens should be protected by the government during a genocide and which should be the subject of government genocide. It’s not that modern liberal politics doesn’t have life-or-death consequences for some people—it’s that today, those cases are the exception, not the rule.

But our Primitive Minds are hardwired to see politics the old-fashioned way, regardless of how the world has changed. That many people will read the above paragraph and think, “politics is still all of those things, just in better disguise,” is reflective of how bad we are at thinking reasonably about politics.

And politics doesn’t just rile up one part of your ancient mind—like religion, politics is a one-stop-shop for nearly every concept that lights the Primitive Mind’s fires:

The Primitive Mind is obsessed with the concept of power hierarchies—and politics is literally the allotment to some humans of power they’re allowed to use against the rest of the population.

The Primitive Mind is obsessed with binary moral divisions—and politics, like religion, is a prime arena for the fiercest disputes over what’s righteous and depraved, fair and unfair, pure and toxic, good and evil.

The Primitive Mind is deeply concerned with defending your identity—and political alignment, like religious affiliation, consistently forms a piece of people’s core identity.

Politics sometimes even overlaps with the world of religion itself, in the continual dispute over how political laws interact with religious laws.

And of course, there’s the way politics lends itself beautifully to tribalism, the Primitive Mind’s favorite game. The Primitive Mind sees the whole world through a Power Games lens, and it’s always looking for ways to divide its surroundings into Us people and Them people—it just needs a vehicle. And politics provides a perfect one.

This all adds up to politics being a bottom-heavy thing for us. But don’t just take my word for it—

The Some Actual Science Agrees That We Suck at Politics Blue Box

We’re still learning about this, but there’s some interesting research that helps explain why politics so often takes place on the lower rungs of the ladder.

A 2016 study published in the journal Scientific Reports presented people with “arguments that contradicted their strongly held political and nonpolitical views.” The results were pretty stark: people were much less likely to have their minds changed when it came to their political beliefs.1

In other words, political thinking was taking place in Unfalsifiable Land, while other thinking was not.

Even more interesting is that while conducting this study, the scientists used an fMRI scanner to measure participants’ brain activity, revealing that people actually processed challenges to their political beliefs with different parts of their brains than they used to process nonpolitical contradictions.2

In particular, they found that having nonpolitical beliefs challenged lit up regions of the brain like the orbitofrontal cortex that are involved in decision-making. Having political beliefs challenged, on the other hand, generated less activity in those areas and more activity in the Default Mode Network—a group of brain regions associated with creating a sense of self and with disengagement from the external world. The scans also showed that having a political belief challenged caused more activity in the insula and the amygdala—emotional, fight-or-flight parts of the brain—than having a nonpolitical belief challenged.

So when the participants had one of their political views challenged, they were more likely to withdraw from the external world and go into the internally focused parts of their brains that deal with their identity, as well as the parts of their brains that deal with danger, fear, and other primal emotions. And while doing their thinking this way, their minds were far less likely to change.

This is just one of dozens of studies I came across in my research that examine the relationship between political beliefs and the likelihood of changing one’s mind—and the findings seem to be pretty consistent.

The study above examined people who identified with the American Left, but of course, the phenomenon spans the political spectrum. Another study found that in their questioning, “people whose political identity was made salient were less likely to believe in an anthropogenic cause of climate change and less likely to support government climate change policies than those whose identity was not made salient; particularly when those people were aligned with the right-wing of politics.”

Another found that “even under conditions of effortful processing, attitudes toward a social policy depended almost exclusively upon the stated position of one’s political party.” This study also examined participants’ awareness of their own political dogmatism and found, predictably, that “participants denied having been influenced by their political group.” But of course, “they believed that other individuals, especially their ideological adversaries, would be so influenced.”

One study suggests that showing people belief-disconfirming scientific evidence not only leads them to reject the evidence but to lose faith in science in general—finding that “relative to those reading belief-confirming evidence, participants reading belief-disconfirming evidence indicated more belief that the topic could not be studied scientifically and more belief that a series of other unrelated topics could not be studied scientifically.”

Then there are the studies about the backfire effect that find that not only do “corrections frequently fail to reduce misperceptions among the targeted ideological group … corrections actually increase misperceptions among the group in question.”

The study suggests an explanation: “When confronted with counterevidence, people experience negative emotions borne of conflict between the perceived importance of their existing beliefs and the uncertainty created by the new information. In an effort to reduce these negative emotions, people may begin to think in ways that minimize the impact of the challenging evidence: discounting its source, forming counterarguments, socially validating their original attitude, or selectively avoiding the new information.”

In case you’re assuming that well-educated people might fare better here—mountains of evidence (not to mention real-world observation) suggest that they don’t. One study looked specifically at what happens when education and science knowledge butt heads with political dogmatism. It found that “more knowledgeable individuals are more likely to express beliefs consistent with their religious or political identities for issues that have become polarized along those lines (e.g. stem cell research, human evolution), but not for issues that are controversial on other grounds (e.g. genetically modified foods).” So for controversial science-related issues that were not politically polarized, more education meant less dogmatism—which seems intuitive. But when the science-related controversies were politically (or religiously) polarized, that correlation went away, and their beliefs simply lined up with their tribal alliance. In our terms: well-educated people are likely to be high-rung thinkers…until the topic is politically or religiously polarized, at which point they drop down the ladder and become obedient partisans like anybody else.

Taking a moment to look at some research is a nice reminder that high-rung thinking is actually neurologically different than low-rung thinking. Low-rung thinking isn’t really thinking at all—it’s self-preservation. Our relationship with intellectual culture follows suit. When our psyche is up on the high rungs, we know that thinking is just thinking. This makes us interested in truth and open to changing our minds—so we like Idea Labs. When we’re lower on the ladder and confusing thinking with self-preservation, confirmation of our beliefs feels like safety—so we seek out an Echo Chamber as a protective bunker.

The good news is that politics isn’t confined to the low rungs. There’s plenty of political activity up on the high rungs—it’s just that politics has an ugly high/low ratio. I think, with some work, we can improve that ratio. But first, we have to see the political landscape for what it is. You can’t improve upon a bottom-heavy distribution if you can’t see the bottom-heavy distribution, and you can’t see it if you don’t know there’s a vertical dimension to be at the bottom of in the first place.

It reminds me of the whole “Inuit people have 428,085 words to describe different kinds of snow!” thing. Whether that’s true or just a fun myth (it’s a myth), it gets at an important concept: the level of nuance in our thinking is limited by the level of nuance in our language. Before I encountered the delightful term “humblebrag,” I found it vaguely irritating when someone would humblebrag, but it was more of a subconscious irritation and one I’d have had a hard time articulating to someone if I tried. But then this term entered my thinking and my vocabulary, and suddenly, humblebragging became a distinct thing in my head. I clearly noticed it now, and I knew exactly why it irritated me. I also noticed myself doing it, which helped me do it less. Labeling a nuanced concept sharpens our ability to think about that concept and communicate our thoughts to others. With the right labels, nuance becomes a breeze.

That’s what we’re trying to do here. Like, consider these four political thinkers:

The two thinkers on the left side, at least on the topic at hand, share a common viewpoint. Same for the two thinkers on the right.

But the two high-rung thinkers share a common way of thinking. They’re humbler, more nuanced, and their opinions about the topic were hard-earned. The two low-rung thinkers are more sure of themselves while knowing less than the thinkers above them—and there’s nothing you could really do to change their minds.

Our societies are great at talking about the horizontal distinction. We’re experts at identifying what people think and grouping people that way, because we’ve been trained to look at these four thinkers and see two left-wingers and two right-wingers.

But we’re awful at talking about the vertical distinction. When I listen to arguments or read op-eds, I constantly hear people trying to make vertical distinctions in their arguments about politicians or ideas, but because A) many people forget that there is a vertical axis, and B) those who do think vertically lack a common language with which to talk about it, those attempts are usually misunderstood or missed altogether.

When people notice a vertical discrepancy between thinkers, it’s like me before I learned the word “humblebrag”—they often can’t quite tell what it is that they’re noticing, so they’ll misattribute the qualities that distinguish the thinkers to something they do have a vocabulary for. I hear people refer to high-rung political thinkers as being more centrist, or more moderate, than low-rung thinkers. But those are What You Think words. They refer to the middle part of the x-axis—as if holding viewpoints in those areas is the mark of a good thinker, and vice versa. Often, high-rung thinkers will end up at more centrist or moderate positions than low-rung thinkers, but there are plenty of cases where the opposite is true. Vertical terms like high-rung and low-rung make our discussions a bit less constrained and a bit nimbler.

So let’s try taking a breath from left-wing and right-wing politics and focusing, for the rest of this chapter, on the worlds of high-rung and low-rung politics.

The Political Arch

Every country has its own special political squabble around the What of politics—the parties, the stances, the ideologies. We'll be using U.S. politics as our "demo system" in this discussion—because as an American, it's the system I understand the best and will be the least wrong about. But this discussion can apply to any country, as the high-rung / low-rung distinction is something that all political systems share.

If we mapped out the American political landscape in the traditional way—by bunching everyone up in one dimension, paying attention only to What people think—it might look something like this:

Now let’s bring the landscape into 2D:

This tells a more interesting story. The American political distribution now forms a St. Louis Arch-esque shape.

Of course, since no one talks about the Psych Spectrum, there are no Gallup polls, Pew research tables, or Our World in Data graphs showing us the exact shape or distribution of Americans on our two-dimensional graph. All we can do is guess—and my best guess is that we’re dealing with some kind of St. Louis Arch situation.

Let’s start at the top of the arch and work our way down.

High-Rung Politics

Not everyone who participates in high-rung politics approaches politics like a Scientist. Up in this realm, you’ll find some super-objective, unaffiliated top-rung thinkers. But you’ll probably find even more somewhat partisan, pretty confirmation-bias-y political Sports Fans. You’ll even find some hopelessly partisan, highly tribal, fully unfalsifiable political Attorneys.

The thing that makes high-rung politics high-rung is that it takes place within high-rung political culture.

High-rung political culture is the political version of the high-rung Idea Lab cultures we discussed last chapter. It subscribes to all the same high-rung intellectual values and supplements them with the high-rung political notion that the good of the country trumps the good of any political tribe. It’s a culture that makes it safe for Scientists to be Scientists, and it lets Sports Fans do their thing while keeping their worst tendencies on a leash. Attorneys who abide by the culture’s norms and don’t inhibit good conversations can stay. When Attorneys are policed by a strong high-rung culture, their one-sided arguments can provide potential truth material or serve as useful criticism of prevailing ideas. The right political culture can turn a wide collection of thinkers into a productive thinking system.

In high-rung political culture, people are micro-divided in their viewpoints and macro-united, in a broader sense, in their values.

They’re macro-united because they’re almost all liberals. Not “liberal” the way it’s often used in the U.S., as a synonym with “Left”—liberal the way the Enlightenment thinkers used it. Liberal meaning “committed to liberal values”—values like truth, human rights, freedom of expression, and equality of opportunity.

They’re macro-united because they share a common notion of reality. Their opinions will differ wildly, but they’ll usually agree on facts or the lack thereof.

They’re macro-united by a shared humility—an understanding of just how hard politics is and a self-awareness that knows it’s impossible to fully understand the values or the worldview of people who grew up in or live in circumstances different from your own.

They’re macro-united because they get how democracy works. They know that a successful democracy is one where everyone gets what they want only sometimes—where regular and widespread frustration and disappointment means the system is working.

Finally, high-rung political thinkers and activists are macro-united around the broad shared goal of a more perfect nation, along with a mutual understanding that they can move towards that goal only by being micro-divided within a vibrant marketplace of ideas. High-rung political discussions are boxing rings, where ideas get their asses kicked, but people don’t. When it’s safe for people to say what they’re thinking, Speech Curves line up with Thought Piles, turning high-rung thinking communities into giant superbrains.

And what exactly are people micro-divided about in the high-rung political world? Their debates center around three core questions:

Question 1: What Is?

You can’t figure out how to make a more perfect nation if you don’t have a good sense of what the nation currently is. What does the population look like, and how has it evolved over time? What are the current policies, and how do they work? Which experimental programs are being attempted, and what does the data say about their efficacy? How are resources currently distributed? How is the status quo being experienced by citizens of all kinds and in all circumstances? The study of What Is is the domain of science. Embedded in What Is, and critical to its understanding, is the study of What Has Been—i.e. how did What Is become What Is? This is the domain of history.

Both science and history are the search for truth—the quest to see reality as best you can. High up on the ladder, there’s disagreement around Question 1, but not too much conflict. Conflict happens when disagreement is accompanied by conviction, and two high-rung thinkers won’t usually both feel strongly about conflicting conceptions of reality. Conviction on the high rungs is a function of clarity, and if there’s clarity around a certain set of facts, high-rung thinkers will usually agree with each other. When things are hazier, two differing high-rung thinkers will both speak with doubt, and they’ll consider the points where their conceptions differ to be areas for joint exploration as part of a collaborative knowledge quest.

Question 2: What Should Be?

Unlike What Is, What Should Be is a matter of philosophy and often the subject of fierce conflict on the high rungs. High up on the arch, almost everyone’s goal is a more perfect nation, but thinkers hold different notions of what kinds of policies and systems are the fairest, the most morally right, and the most philosophically consistent. They’ll dig deep on lots of hard questions with no objectively correct answer:

What should the role of government be? Which freedoms should be restricted in the name of citizen protection and which shouldn’t? When does a fetus become a human being? What are the criteria for “equal opportunity” to be considered equal? How big and how powerful should government be, and where should the boundaries be drawn between state and federal government power? What should the country’s role be in the world, and under what circumstances should it involve itself in foreign affairs? When is it appropriate for the military to use force against other countries or police to use force against citizens? Which resources are rights, and which are privileges? The list is long, and the debates are heated.

Question 3: How to Get There?

What Is and What Should Be, when compared, yield the gaps between reality and the ideal. These gaps define the political objectives of the high-rung thinker. But even when high-rung thinkers do agree about What Should Be, they often completely disagree about the best way to bridge the gap from What Is to their vision of something better. No one is an expert at how to run a country, and there’s rarely a consensus about the most effective way to fix an identified flaw in the system. Two people who agree that the middle class should be larger than it is can completely disagree about which tax structure or government structures will best achieve the goal. Two people who feel the same exact way about the history of race in the U.S. can hold opposite viewpoints about the efficacy of affirmative action. Two people who both hate the current healthcare system can come up with entirely different government healthcare programs as their proposed solution.

Parsing political arguments using these three questions can help us isolate what the arguments are really about. Sometimes thinkers who agree philosophically will disagree strategically. Some who seem to agree strategically may actually be aiming for different outcomes. Some will disagree on all fronts.

Other times, disagreements may be more fundamental. Here, it may be appropriate to *cautiously* apply what have become two of the most unpleasant words in American English: Progressivism and Conservatism.

If we’re going to discuss these words—and the core concept behind each—the first thing we’ll need to do is put aside the baggage.

Well done. Now, anytime in this post we’re going to use politically charged words, we should make sure to agree on the definitions we’re using.

If you want to confuse yourself, google around for a while reading about “Progressivism” and “Conservatism.” Each of the words has been the banner for a huge range of political, economic, social, and philosophical ideas—some of them overlapping, some that are unrelated to each other, and some that are totally contradictory with others.

In the U.S., giant political Echo Chambers have appropriated these words as banners for themselves and for their enemies. And we’ll come back to what the words mean in that low-rung context, but let’s remind ourselves that the words themselves actually have pretty intuitive literal definitions, and I think those meanings provide an important and useful distinction in political thinking. At their most literal—and, because we’re dealing with Higher Minds at the moment, their most charitable:

Progressivism = concerned with helping society make forward progress—positive changes to the status quo. That progress can come from identifying what you deem to be a flaw in your nation’s systems or its culture and working to root it out, or by trying to make your nation’s strong points even stronger.

Conservatism = concerned with conserving what is already good about society—either by fighting against the erosion of what you deem to be your nation’s strong qualities, or by pushing back against well-intentioned attempts at positive progress that you believe, in reality, will prove to be changes for the worse, not for the better.

Put more simply, if a nation is a boat, high-rung Progressivism tries to make improvements to flaws in the boat and build newer, better features, while high-rung Conservatism tries to protect the existing boat against damage and deterioration.

Given that any nation, like any boat, has some things working well and others working poorly—along with the capacity to be both improved and damaged over time—Progressivism and Conservatism, the way we’re currently defining them, are simply the two sides of the “Let’s make this the best boat we can” coin. Two halves of a single noble quest for a more perfect nation.

As high-rung thinkers trudge their way up the mountain into a foggy future, some of the most fundamental disagreements will be those between a progressive and conservative mindset—the “to change or not to change?” disagreements, and their underlying “how well is this part of the system accomplishing what it’s supposed to?” and “what does a more perfect nation even look like?” disputes.2

It’s easy to see looking at this diagram why Progressivism is important. No country is perfect, and you can’t become a more perfect nation without making changes. Progressivism drives that change.

But Conservatism is just as important. Firstly, there are some aspects of a country that are working beautifully—and in these cases, the conservative impulse to resist the inevitable calls for change will be wise. Further, a country like the U.S. is permanently tasked with figuring things out as they go, and when it comes to running and adjusting a massive country in a rapidly changing world, everyone is an amateur. Mistakes will be made, and some changes will prove with time to have been ineffective or detrimental. In these moments, the voice urging the country to press the undo key and go back to the way things used to be will be the wisest voice.

Secondly, Progressivism is the collection of lots of different ideas—most of them untested—and inevitably, most of them will be bad ideas. Nations evolve the same way species do—through beneficial mutations. Coming up with mutations and pushing them into the national genome is the job of Progressivism. But for every beneficial mutation to a species, there are many more mutations which prove to be detrimental to survival. The conservative resistance to all progressive ideas provides a critical filter—a gauntlet that relentlessly tries to expose flaws in each progressive effort at mutation. Forcing progressive ideas to pass through intense conservative resistance in order to implement their desired change helps separate the wise ideas from the foolish or naive and protects the country from the latter kind.

It’s worth noting that I’m using these ism terms and not “progressives” and “conservatives” because the latter implies that people are either one or the other, and high-rung culture doesn’t equate people with their ideas. High-rung thinkers may tend to think in a more progressive or conservative way—but they are no line of thought.

Even using the words as adjectives for people, declaring yourself to be progressive or conservative in general (as opposed to "holding a conservative viewpoint" or "tending to be progressive in a certain area of your thinking"), is an implicit presumption of uniform thinking across the board and through time. A single brash label for a person, or for their thinking, boxes in a person's intellect and boxes in their evolution—and high-rung thinkers don't like to be put in boxes, by themselves or by anyone else. This non-boxable phenomenon is apparent when I think about the high-rung political thinkers I know or know of, as it can often be frustratingly hard to figure out what their "deal" is politically.

But while the individuals in high-rung politics may bounce back and forth between the two camps, what is consistent is a substantial group of people falling into each bucket on any given issue. If we bring things into 3D and venture upwards on Emergence Tower, we can visualize the two groups as a progressive giant and a conservative giant.

If high-rung politics is a grand political courtroom, these giants are the two lawyers.3 When the “defendant” is the nation’s status quo or its traditional values, the progressive giant is the prosecutor and the conservative giant is the defense. In these cases, Progressivism will be the voice of negativity and criticism, while Conservatism will paint the rosier picture of the country as it stands, and its history.

But when it comes to how to change the country—when the defendant is the country’s evolution—the roles switch. Progressivism, now in the role of the defense, will tend to be a vocal proponent of change, while Conservatism, as prosecutor, will be critical of and resistant to change.

In both cases, each giant acts as a counterforce against the other and helps keep it in check. When the conservative giant gets riled up, it can drift too far into “Our country is perfect just as it is” or “Our country used to be perfect” territory. When the progressive giant gets out of hand, it can fall too far down the “Our country is and always has been awful” hole. The presence of its rival giant restrains each giant from becoming a ridiculous caricature of itself.

The clash of these two forces lies at the heart of the parts of society that evolve. I have a friend who's a new mother and decided to use formula instead of breastfeeding her baby. She explained her reasoning to me and it made sense. I mentioned this to another friend, also a new mother, who thinks the first friend is crazy. Her reasoning made sense too.

Another friend of mine makes a compelling case that women who can afford to should consider using a surrogate for pregnancy instead of getting pregnant themselves. I found this interesting and have brought it up with a few other friends, to hugely negative reactions.

I’m not sure who’s more right in either case, or if there even is a clear right and wrong side—but I know that some people having a progressive, “we should challenge the status quo” instinct in each area is important for our ability to evolve and improve, and some people having a knee-jerk conservative instinct to criticize and push back against progressive ideas is important for our ability to proceed prudently and effectively in our evolution. Together, they are the two lawyers that allow societal evolution to undergo “due process” in the marketplace of ideas.

This same tension exists at the core of debates about nutrition, wellness, parenting, education, professional sports rules, holidays, company culture, employment practices, and 100 other things. In each area, evolution is driven by progressive ideas and policed by conservative sensibilities. In any of these situations, people with a progressive mindset feel like they’re dragging more conservative people upward to a better place, while people on the conservative side feel that the progressive effort is dragging things downward to a worse place.4

Most of us will find ourselves on the progressive side in some of these "courtrooms" and on the conservative side of others. Even people who find themselves falling on the same side of most of the debates I mentioned would hesitate to box themselves in by attaching their identities to that quality and letting that label automatically determine all of their viewpoints. High-rung political culture simply extends this way of thinking to politics as well.

Some political debates aren’t about “to change or not to change.” Instead, they’re about a spectrum of possibility and the debates are about where exactly on the spectrum our policies should lie.

In spectrum battles, which side ends up backed by the Left vs. the Right doesn't always map on very well to "progressive" or "conservative," but it doesn't matter. The important thing is that each side of the spectrum has a group advocating for it. This allows the marketplace of ideas as a whole to home in on a point that represents a reasonable compromise. As the debates rage on and public opinion evolves, that point can evolve along with it. It's democracy at its finest: everyone disagrees with each other in an unpleasant marketplace of ideas, and it results in a policy that represents a broad compromise that most people are somewhat unhappy with. There are a lot of these types of issues in American politics:

Sometimes political issues revolve around priorities and where we should direct our attention. Here, again, high-rung politics usually organizes into a two-sided structure. A recent paper explored how the two giants differ in which parts of Emergence Tower they focus on. Here are their results (they call progressives “liberals”):3

In our language, that translates to:

The Left sometimes seems overly focused on the global and the universal, and the Right can be a broken record about individualism and community and family values—but when you remember that each is half of a two-part system, it all makes sense. They’re both just doing their part of the job.

It’s like a company having two founders, one who focuses more on operations and the other who thinks more about growth. Progressivism and Conservatism each worry about one half of every issue, and together, they make sure we’re paying enough attention to everything that matters.

Every person involved in high-rung politics has a Primitive Mind in their head that wants to identify with political parties and treat politics like a tribal war. But up on the high rungs, the Higher Minds have the edge—one that they protect with a pervasive high-rung culture. The culture keeps everyone—even the more partisan people—aware that ultimately, they’re all on the same team. As fierce as the debates between the high-rung giants can be, they know deep down that what they’re really doing is working together to navigate their way up the mountain, towards a more perfect nation.

But politics is bottom-heavy. And even the high-rung-thinking grown-ups among us are prone to morph into childish low-rungers when it comes to politics.

When our Primitive Minds get ahold of our political thinking, our political worldview, values, and general mentality jump in a time machine back to hunter-gatherer times. Politics ceases to be about figuring out the truth and building a more perfect nation and becomes geared toward ideological confirmation and triumphing over the bad guys. We forget how to do the Value Games and revert to the old human ritual—the Power Games. That’s why low-rung politics looks like this:

Politics done the Primitive Mind way leaves us in a place that can really only be called one thing.

Political Disney World

I’m pretty into most Disney movies. But especially The Little Mermaid, Beauty and the Beast, Aladdin, and The Lion King. I’ve never been sure if those are objectively the best four Disney movies or if everyone just thinks whichever Disney movies happened to come out when they were between the ages of 7 and 12 are the best Disney movies. Either way, clearly those are the four best Disney movies.56

The thing about those movies, though, is that they’re definitely fake movies, and definitely not real life. Right?

Like, kids might think Disney movies are the way the real world is, but everyone else knows that actually, the real world is not like Disney movies.

Right?

This is what I thought too—and then I started writing this post series.

After spending most of the last three years thinking about hardcore political partisans and their hardcore political Echo Chambers, it hit me: like 80% of the U.S. thinks they live inside a Disney movie.

I know it seems crazy.

I know it seems crazy that like 280,000,000 adult humans in 2019 think they’re a beautiful Disney princess living inside a magical Disney castle perched on a sparkling Disney landscape on a fluffy Disney planet—

But that’s the situation.

Let’s discuss.

Analog and Digital

When I wrote about Neuralink, one of the concepts I got into was the difference between analog and digital information (brain waves are analog signals, but they need to be converted to digital information in order to be processed by a brain-machine interface).

The thing is, ever since then, I can’t get analog and digital out of my head. I see it as a metaphor for all kinds of things in the world. Here’s what I mean:

Analog is what actually goes on in the natural world. It’s a perfect representation of reality: information in its natural, messy state. Sound is a nice example.7 Sound is analog information that can be represented by a wave:

Digitization is a way to approximate analog information using a set of exact values. Like this:

Information in digital format can be expressed as a series of 1s and 0s—an exact, binary format computers can process. When you listen to an mp3, you’re not listening to the true analog information made by the band’s instruments, you’re listening to a digitized version of the sounds—a big string of 1s and 0s that approximates the analog sound wave of the song.

Above, the sound wave has been digitized to eight incremental values, by rounding all parts of the wave to the nearest value. Eight values can be expressed by three “bits” (a three-number string of 1s and 0s). You can compress an mp3 into a smaller file by making your approximations of the analog wave cruder—by making the digital “steps” bigger, using only four values. Now you only need two bits.

The more you compress a sound file, the smaller the mp3 file gets, because bigger steps require fewer 1s and 0s to express the sound. But the song also sounds worse, because more "rounding" is happening to make cruder approximations—i.e. the sound has become lower-res. The size and sound quality of a digitized file both depend on how far down the digitization spectrum you go in your conversion.

At the far end of the digitization spectrum, you’d have only straight 1s and 0s—a tiny file that would sound almost nothing like the original song.
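If it helps to see that rounding spelled out, here's a tiny code sketch of the idea. It's my own toy illustration, not how any real audio codec works: the wave, the sample rate, and the level counts are all made-up numbers, and the quantize function just snaps each sample to the nearest allowed "step," the way the description above does.

```python
import math

def quantize(samples, num_levels):
    """Snap each sample (assumed to lie between -1.0 and 1.0) to the nearest
    of `num_levels` evenly spaced values -- the "steps" described above."""
    step = 2.0 / (num_levels - 1)   # distance between adjacent allowed values
    return [round((s + 1.0) / step) * step - 1.0 for s in samples]

# A one-second 440 Hz wave, standing in for the "analog" sound.
rate = 8000
wave = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]

for levels in (256, 8, 4, 2):
    bits = math.ceil(math.log2(levels))      # 8 levels -> 3 bits, 4 -> 2, 2 -> 1
    digitized = quantize(wave, levels)
    worst = max(abs(a - b) for a, b in zip(wave, digitized))
    print(f"{levels:>3} levels ({bits} bits per sample): worst rounding error {worst:.3f}")
```

Fewer levels means fewer bits per sample, and therefore a smaller file, but the rounding errors get bigger and bigger until, at two levels, the "song" is barely the song anymore.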

The same concept applies to visual information. Each pixel is a datapoint. You can make a photograph file smaller, and worse-looking, by making the pixels bigger.4

Another way to make it smaller is by reducing the real world's infinite gradients of color to 10,000 gradients, or 100, or 15.
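Same rounding trick, in code form, for the color version. Again, this is just a hypothetical sketch with an arbitrary color and arbitrary level counts, not a description of how any real image format works:

```python
def quantize_channel(value, levels):
    """Round one 0-255 color channel to the nearest of `levels` allowed values."""
    step = 255 / (levels - 1)
    return int(round(value / step) * step)

def quantize_color(rgb, levels):
    return tuple(quantize_channel(c, levels) for c in rgb)

teal = (18, 141, 167)   # some arbitrary shade of teal
for levels in (256, 16, 4, 2):
    palette_size = levels ** 3   # possible colors when each channel gets `levels` values
    print(f"{levels} levels per channel ({palette_size:,} colors): {quantize_color(teal, levels)}")
```

At 256 levels per channel, the color comes back untouched; by 2 levels per channel, the whole photo has been rounded down to eight possible colors.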

The typical goal when we work with audio and visual information isn’t to try to go as high-res as possible—it’s to try to find the sweet spot: the crudest approximation you can get to while still accomplishing your goal. You want to weigh the costs of high file size alongside the costs of quality reduction and choose the optimal compromise for whatever you’re trying to do.

I’ve been thinking about this a lot because the general concept behind the digitization spectrum and the compromise it represents is relevant in all kinds of places. If someone asks you a time-related question, without realizing you’re doing it, you’ll answer the question at what you believe to be the optimal point along the spectrum. If you didn’t do this, you’d be a weird person.

      

In our thoughts and our conversations about life, society, politics, or anything else, we're always negotiating this same balance. Digitization/approximation, when used appropriately, is an incredibly handy efficiency tool that leverages the human mind's talent for pattern recognition. But digitization is inherently lossy—it intentionally does away with nuance—and the appropriate amount of digitization goes up to the point where the lost nuance isn't important, meaningful information—or at least where the lost nuance is less important than the gained efficiency.

Back to Disney movies.

The real world is analog—gray, amorphous, and endlessly nuanced. What Disney movies do is they digitize the shit out of the real world. They go the full distance, converting all that gray into clean black-and-white 1s and 0s.

Real people are complex and flawed, full of faults but almost always worthy of compassion. Disney characters, on the other hand, are either entirely good or entirely bad.8

It goes beyond characters. In the real world, each turn of events is mired in potential positives and potential negatives, which is a mess to sort out. Disney movies get rid of that messiness. Something that happens is either clearly good, or it’s clearly bad. Disney even digitizes the weather.

Disney digitization spares no one. Not even the birds.

Going full digital is logical in Disney movies. Their core audience is little kids, who aren't ready yet to sort through gray. Before a person learns to think in nuance, they first need to learn the basic concepts of good vs. bad, right vs. wrong, safe vs. dangerous, happy vs. sad. It's the same way you wouldn't teach a beginner poker player the nuances of how to slow-play a big hand in early vs. late position—you'd start by making sure they understood what a pair is, what folding means, and how the betting works. Going straight to the higher-level strategy would only confuse them.

If good Disney characters are shown to have deep character flaws, kids may misinterpret the message and think they’re supposed to mimic those qualities.910111213 And if bad guys are humanized, kids will get upset when things turn out badly for them in the end.

Digitizing an analog world into perfect cartoon simplicity makes sense. In fictional Disney movies. Made for kids.

But over-digitizing the real world is a pretty bad idea—and unfortunately, that's exactly what the Primitive Mind likes to do. So low-rung politics ends up feeling, to its participants, just like a Disney movie.

Up on the high rungs, people know the world is a mess of analog complexity. They look out at that world, with clear eyes, and see fog. They also know that people are little microcosms of the messy world—each person an evolving gray smattering of virtues and flaws.

Political Disney World is much more fun. Everything is nice and crisp and perfectly digital. Good guys and bad guys, with good ideas and bad ideas, respectively. Good politicians and bad politicians with good policies and bad policies. Right and wrong. Smart and ignorant. Virtuous and evil. Safe and dangerous.

1s and 0s.

In the foggy minds of Political Disney World, it’s all quite clear.

At the heart of every faction in Political Disney World (PDW) is a guiding narrative. PDW narratives are all-encompassing versions of reality—they come with their own worldview, their own telling of history, their own description of the present, and their own explanation for the causes behind all of it. A unique, customized Disney movie for the tribe, by the tribe.

Every country has a Political Disney World, each with its own factions and its own narratives. I live inside "Political Disney World, U.S.," where there are two major factions: the low-rung Democrats and the low-rung Republicans. Their narratives digitize both people and ideas.

How PDW Narratives Digitize People

Central to each narrative are the main characters. In some stories, the protagonists live here—

—while the bad guys are some version of these:

In other stories, the protagonists have this vibe—

—while the bad guys are more doing this thing:

The important thing is that the characters can be divided into clear digital 1s and 0s, because that’s the kind of story the Primitive Mind understands the best.14

In the U.S., when the Democrats imagine their Republican opponents, they tend to see them as Mr. Mean Man. Mr. Mean Man takes a few forms, usually one of these:

In the Democrat Disney kingdom, the traditional narrative tells the story of righteous Democrats in a continual struggle to pull the country upwards to a liberal utopia as mean, bigoted Mr. Mean Man uses all his weight to try to pull the country back down into an underwater Backwards Land of all-powerful corporations run by gun-swinging Nazi rapists.

On the other side of things, the low-rung Republican narrative paints their Them group—the Democrats—as Miss Shitty Pants, who might be depicted like any number of these:

In the Republican Disney kingdom, the standard story looks a little different. It’s about the honest, hard-working families doing their best to stand their ground as the stupid, lazy, morally defunct Miss Shitty Pants tries her hardest to pull the country down into a dystopian hell of a tyrannical government run by ivory tower elitists that gives endless handouts to hordes of gay, Muslim immigrant terrorists.

In high-rung politics, it’s understood that people aren’t 1s and 0s—they’re all 0.5s, each in their own messy, complicated, unique way. And to people who see people as 0.5s, it’s clear that PDW narratives not only dehumanize their opponents, they dehumanize everybody into fake cartoon people.

Digitizing people is a practice in moral dualism. The world of low-rung religion (Religious Disney World) does this all the time, with their gods and devils, their believers and infidels, their heaven and hell. Political Disney World does the same thing, just using different terms. A digital people mentality is why people in PDW rarely marry someone with opposing political views (something people in the high-rung political world do all the time). It’s why people in PDW tend to feel an endless well of compassion and understanding for bullies, blunderers, and criminals within the protagonist group, while dropping all semblance of empathy for bad actors on the evil side.

How PDW Narratives Digitize Ideas

Political Disney World is also big on digitizing ideas, using one of PDW’s defining features: the checklist. A narrative’s checklist allows its thinkers to trade in the gray mess of nuanced “What Is,” “What Should Be,” and “How to Get from A to B” debates for a perfectly digitized list of binary issues with a Good, Correct Stance and an Evil, Wrong Stance. In the U.S. narratives, the current checklist includes items like these:

In each case, what is treated as a complex debate up on the high rungs digitizes out to perfect cartoon simplicity down below.

Some telltale signs that people are deriving their viewpoints from a checklist:

  • They abide faithfully by the entire list of protagonist viewpoints, with no exceptions. They can scan down their side of the above checklist and, without hesitation, check off every box.
  • For each issue, they tend to see the Them stance as having 0% merit.
  • They have strong feelings about the specific issues highlighted by the checklist but have little to say about all the other issues that matter to their country. Issues played up in the media are like plotlines in the Disney movie: you’ll hear constant emotional discussion about them. Other issues are like plotlines that didn’t make the movie’s final cut, and in PDW, you won’t hear people talking about them at all.

Anytime a bunch of adults are pretty sure that they live in a Disney movie, there can only be one explanation:

They’ve been sucked into the Power Games.

New World, Old Games

The Power Games, as you’ll recall, are what humans evolved to play a long, long time ago. They’re super simple, with only one rule:

Everyone can do whatever they want, if they have the power to pull it off.

Our Primitive Minds only know how to make sense of the world through the Power Games lens—and when people in modern societies are playing the Power Games, it’s a sign that Primitive Minds have hijacked the culture. Primitive Mind smoke is like a virus, and when a culture becomes permeated with it, it spreads through minds like an epidemic. Soon, almost everyone is convinced that they live in a Disney movie, where everything is 1s and 0s, and they’re the good guys—allowing the Power Games to rule the day.

In Chapter 4, I laid out the American notion of fairness using this graph:

The graph is a bit complicated (go here for a full refresher), but the basic idea is that the U.S. is based on a freedom/equality compromise. The U.S. Zone represents the region of compromise that the country is supposed to stay in at all times. The areas outside the U.S. Zone are restricted because ending up in them would mean the Power Games have taken over.

In theory, the two American political parties are somewhere around here:

Inevitably, a lot of Americans who read this chapter will yell at me and say I’m committing a gross false equivalence. Their reasoning will be that while their party is indeed behaving itself neatly inside the U.S. Zone, the other party is playing all kinds of Power Games in the restricted areas.

People on the Left will say it’s like this:

People on the Right will tell this story:

The thing is though, there are ample studies that suggest both parties are pretty similarly intolerant and similarly biased. Whether one is a bit worse than the other in any given year or decade is less important to our discussion than the fact that both are bad.

Both parties are a bit challenged on the adult vs. grown-up thing, buying fully into the middle-school-esque “in-group/out-group” social structure—a classic sign of the Power Games. And both are totally down with gross negative generalizations of the out-group (John Cleese explains further).

On both sides of PDW, people would struggle to name three policies they like from a president on the Them side and three legitimate areas where a president on the Us side has gone wrong—even though every president does a lot of good and bad things. People on both sides tend to believe that if only everyone in the country shared their viewpoints and values, all national problems would be solved. All signs of simplistic, tribal thinking. All signs of the Power Games.

Probably the clearest sign of the Power Games is rampant hypocrisy. High-rung thinking is all about values and principles, and there’s an effort to remain consistent about them in the face of the inevitable tug of tribal attachment. But the Power Games have only one principle: power. As George Orwell succinctly put it in 1984: “The object of power is power.”

Channeling more Orwell, writer Andrew Sullivan sums it up nicely:

George Orwell famously defined this mind-set as identifying yourself with a movement, “placing it beyond good and evil and recognising no other duty than that of advancing its interests.” It’s typified, he noted, by self-contradiction and indifference to reality. And so many severe critics of George W. Bush’s surveillance policies became oddly muted when Obama adopted most of them; Democrats looked the other way as Obama ramped up deportations to levels higher than Trump’s rate so far. Republicans, in turn, were obsessed with the national debt when Obama was in office, despite the deepest recession in decades. But the minute Trump came to power, they couldn’t be more enthusiastic about a tax package that could add trillions of dollars to it. No tribe was more federalist when it came to marijuana laws than liberals; and no tribe was less federalist when it came to abortion. Reverse that for conservatives. For the right-tribe, everything is genetic except homosexuality; for the left-tribe, nothing is genetic except homosexuality. During the Bush years, liberals inveighed ceaselessly against executive overreach; under Obama, they cheered when he used his executive authority to alter immigration laws and impose new environmental regulations by fiat.

In the Power Games, principles will lose to power every time. While people in high-rung politics get criticized for flip-flopping on their principles (the kind of hypocrisy in the paragraph above), PDW flip-floppers get criticized for the opposite reason: on the low rungs, you get in trouble for flip-flopping on policy positions, even when you do it to stay consistent with your principles. Integrity matters up high; loyalty matters down below.

Liberalism itself is a set of principles, and in Political Disney World, people won’t hesitate to go illiberal if it helps with tribal victory. Beyond common PDW illiberal practices like selective empathy or being selectively supportive of core liberal rights like free speech, there’s the illiberal way people in PDW view democracy. When people in low-rung politics lose an election, they scream that they’re disenfranchised, they insist that the system must be broken,15 and they have an impulse to overthrow the opposition leader. When their candidate wins, they say things like, “faith in democracy restored!”—i.e. democracy is only working when my candidate wins. This isn’t the mindset of someone who believes in democracy—it’s the mindset of someone who believes in dictatorship but who is stuck in a democracy.

This is why it’s bad that the U.S. has come to redefine the word “liberal” as a synonym for “progressive.” While “progressive” is an x-axis word, “liberal” is a y-axis concept.

When we do a “zoom-up” on Emergence Tower, we’re reminded that what feels to PDW members like being a protagonist in a Disney movie is actually just being a uniform cell in a big, dumb, Power Games giant.

Variations in the Us/Them Divide Blue Box

People with a Power Games mentality will almost always divide into the Us vs. Them format—the thing that varies is how big the giants in question are. This is what that Bedouin proverb is getting at (feel free to refresh yourself on my cartoon depiction):

Me against my brothers; my brothers and me against my cousins; my cousins, my brothers, and me against strangers.

During primary season in the U.S., the Us/Them divide moves down Emergence Tower to the “me against my brothers” level, as factions within each side go at it.

During a war, the Us/Them divide moves upward on Emergence Tower to the “whole family against strangers” level, temporarily uniting the country as one big Us.

But those are special circumstances. In normal times, the U.S. likes to be at the “cousins” level in between, where one half of the U.S. is pitted against the other half.

Since this is the norm, we’ll focus on these two big national factions.

Keeping the giants glued together

Each giant’s guiding narrative, which feels so much like reality to the people inside it, is just another superglue story.

If high-rung politics is micro-divided and macro-united (people disagree, giants work together), low-rung politics is the opposite: micro-united (people in a giant all agree) and macro-divided (giants are enemies with other giants). Keeping things this way is the critical objective of the superglue story:

Keeping things micro-united

A low-rung tribe is like an ant colony, and it needs all of the ants in solid agreement and working together. This isn’t always easy, given the motley crew that makes up a PDW faction. This crew includes a few classic types, each there for their own reasons.

Some prominent members of any PDW faction:

Zealots: People who believe every word of the narrative.

Tribalists: People who love being part of a big, powerful in-group and talking shit about the out-group. These people were usually either super popular in middle school and use politics to relive their glory days or super unpopular in middle school and use politics to revel in the other side of things.

Opportunists: People who use politics to gain social status or career advancement, to sell books, to get clicks, or any other number of ways politics can generate profit.

Soul-searchers: People who have been convinced that politics can be a get-rich-quick scheme for meaning, purpose, intellectual conviction, moral conviction, self-esteem boosting, or any other parts of life that are, in reality, far harder than that to achieve. These people are also great candidates to buy weight-loss pills guaranteed to make you skinny with no work and snake oil balm guaranteed to make your hair grow back or your money back.

Intellectual townies: A me-coined term I’m super proud of. People who never “move out of their childhood hometown,” intellectually or morally.

Undercover high-rung thinkers: These people’s minds are up on the high rungs, but the low-rung culture they’re immersed in has successfully intimidated them into keeping their mouths shut.

As far as the giant is concerned, this odd coalition falls into two categories:   

And the important thing is making sure that on the outside, things stay like this:

That means making sure that everyone who believes the narrative continues to believe the narrative and everyone who doesn’t believe the narrative continues to pretend like they do (either out of fear or profiteering). The sacred narrative baby must always be said to be cute. This is what it means for a tribe to be micro-united.

Keeping things macro-divided

As the Bedouin proverb reminds us, in the Power Games, the best glue of all is a good common enemy. And the bigger a giant you want to build, the bigger the common enemy you’ll need to keep things glued together—because if the Them giant isn’t big enough, the Us giant will inevitably fracture into a new Us/Them structure. To serve this cause, low-rung political giants will typically frame politics as a zero-sum game—one in which the goals of the good guys can only come as a result of the bad guys losing (and vice versa). And they’ll focus a ton of energy on the part of the narrative that talks about how stupid, ignorant, evil, bigoted, opportunistic, sneaky, toxic, backward, selfish, and most importantly dangerous the bad guys are, making lots of memes like this:5

And this:6

Citizens of Political Disney World are even better trained to rattle off the narrative’s story of how bad the bad guys are than they are to explain why the good guys are good.

The bad-guys-are-bad part of the narrative is especially important because on top of its common-enemy glue benefits, it is the critical foil the story’s protagonists need in order to feel like protagonists. Without Jafar, Aladdin is no longer a hero—he’s just some guy. That’s why Mr. Mean Man has to always be super mean and Miss Shitty Pants has to always stay shitty-pantsed.

Protecting the glue

Power Games giants glued together by belief in a certain story need a very specialized environment to survive. Unlike values-based high-rung giants, which are inherently robust, Power Games giants are brittle and vulnerable. When you rely on people fervently believing an all-encompassing, mostly fictional reality while real reality is all around them, you need to maintain a lot of control to keep things in order.

The sacred narrative would be torn to shreds in the marketplace of ideas outside the kingdom’s walls, where high-rung thinkers roam and no idea is safe from criticism. Loyalists would be not only told but shown clear evidence that Disney World isn’t a real place, like when a shitty five-year-old bursts the kindergarten class’s bubble by spilling the truth about Santa Claus. Unacceptable.

Traditionally, brittle Power Games giants have avoided having their bubble burst with strict laws that control the flow of information—like King Mustache’s laws in Hypothetica. But in a country with laws like the First Amendment, Echo Chambers are forced to police speech with culture. The right culture can serve as a filter system, which both enriches the giant with glue-strengthening narrative confirmation and protects the giant from every Power Games giant’s kryptonite—doubt.

The PDW Giant’s Filter System

1) The Media Filter

In today’s world, every political Echo Chamber giant has its own media channels, which serve as the giant’s eyes and ears. These media channels are for the Echo Chamber, by the Echo Chamber, and they’re the first line of defense in upholding the giant’s belief in the sacred narrative. To keep the giant strong and well-fed, they sensationalize the stories that confirm the narrative, like an amplifier. To keep the giant free of intellectual contamination, they downplay stories that challenge the narrative or neglect to report them at all.

On any given day, just do a side-by-side at foxnews.com and msnbc.com, or breitbart.com and huffpost.com/news/politics, or townhall.com and salon.com, and you’ll see the two major U.S. filters at work. One amplifies a story, the other muffles it. When they do report on the same story, their framings reverse who the protagonists and antagonists are, to mold the story to fit the narrative (Scott Alexander lays out some good examples here—and this is kind of interesting).

2) The Sharing Filter

If the Media Filter determines what ends up in the PDW giant’s brain, the Sharing Filter sets the rules about how information circulates through the brain.

As a key safeguard against those in the tribe who don’t actually believe the narrative, the giant’s political culture provides powerful social incentives to keep everyone’s Outer Selves in line and saying the right things.

Expressing narrative confirmation is socially rewarded while challenging the narrative is laden with taboo. Because remember how the Agreement-Decency thing works in an Echo Chamber:

The Media Filter will never be perfect, but the Sharing Filter can clean up the mess. When compelling alternative viewpoints make it into the giant’s brain, they have a hard time making it very far, because every neuron in the brain is socially incentivized not to pass them along to other neurons. The same system works as a market for narrative confirmation: the most snappily worded and convincingly argued pieces receive the biggest rewards when they’re shared, which incentivizes others to share them too (Twitter retweet numbers are a nice example). The best of the best pieces of confirmation go viral, spreading like wildfire through the Echo Chamber.

3) The Individual Bias Filter

Any scraps of compelling dissent not caught by the first two filters usually meet their doom at the gates of the final filter—the biases of the giant brain’s neurons: individual minds. Those who do believe the narrative are thinking from the low rungs, in Unfalsifiable Land, where they’ll use all of those low-rung tricks from Chapter 7 to make sure to stay unconvinced by any dissent that manages to reach them.

Low-rung political thinkers, Reasoning While Motivated, will do the “Can I believe this?” / “Must I believe this?” toggle on their Skepticism Meter:

Like trains in biased motion, they’ll see any skeptics of their beliefs as worse thinkers than they actually are, making it easy to disregard the info right off the bat.

The Thanksgiving Dinner Table Hideous Political Conversations Blue Box

It’s this third filter that lies behind phenomena like the whole “Oh my god I’m dreading the political conversations at the Thanksgiving dinner table so much” thing. When I hear someone say this, I know one of three things is happening:

1) The person talking about their Thanksgiving dread is part of a low-rung political giant and they’re dreading the one day of the year when they’re with high-rung political family members who will challenge them.

2) Same as #1 except the dreaded family members are also low-rung political thinkers, from the opposite Disney kingdom.

3) The person is a high-rung political thinker who is dreading their annual interaction with low-rung political family members.

Low-rung dreading high-rung, low-rung dreading low-rung, or high-rung dreading low-rung. The one thing I know is not the case is a high-rung thinker dreading interaction with a high-rung thinker who disagrees with them, since high-rung thinkers don’t dread having political conversations with each other. At least one of the parties involved in a nightmarish Thanksgiving political conversation is from the low-rung political world. And they’re dreading it because it’s a moment when their usual info guardians—the Media Filter and the Sharing Filter—will not be able to shield them. They’ll be exposed to challenges to the sacred narrative they identify with, and they’ll have no tools to handle that interaction. So the third, final filter of individual unfalsifiability will be left to fend off the challenge, which tends to make for an unpleasant interaction.

We can imagine these three filters looking something like this:

While high-rung giants calibrate their filters to expose the truth, PDW’s filters work hand-in-hand to keep the giant’s glue strong.16

But Political Disney World doesn’t stop there. The filter system is great for managing the world’s real information, but when real information doesn’t cut it, a political giant has to take matters into its own hands.

Fallacies

If there’s one thing we’ve established in this series, it’s that humans aren’t good at reality. For us, trying to figure out what’s right and what’s real is like an obstacle course lined with cognitive pitfalls. The smartest people I know spend a huge amount of effort trying to become experts on their own irrational tendencies in order to become better thinkers, and they’re still pretty bad at reality. That’s why the high-rung Idea Lab culture is so important—it turns the reality obstacle course into a team effort.

But what if reality isn’t your goal? What if reality is itself the obstacle?

Political Disney World turns confirmation bias into its own team effort—it does confirmation bias on a systematic, industrial scale. And when the mission relies on people getting reality wrong, human cognitive deficiencies are invaluable tools.

One such tool is the fallacy. If human reasoning is an outdated 1.0 software program, fallacies are the glitches and bugs.

We fall victim to fallacies by mistake all the time. A classic example is the sunk cost fallacy. As an untalented illustrator, I learned long ago that it’s usually a terrible idea to draw elaborate backgrounds in my illustrations. Just draw the three stick figures talking over a white background—skip the street and the trees and the sky and the sidewalk they’re standing on. And yet—sometimes I forget that lesson and decide to get all Bob Ross, like “well what if…what if I just put a happy little tree over there next to the people…well that looks weird like a floating tree…so I’ll make some ground…how do you make ground again?…I’ll try drawing a line…that looks bad…ooh okay I’ll draw grass…” Suddenly it’s 18 minutes later and I’m drawing individual strands of grass and questioning my entire existence.

At that point, a little part of my brain is like, “So you’re about halfway into finishing this background. The background doesn’t look good. It looks bad. The drawing would be better without it. It was a cute idea but it failed. So just delete the background and move on.”

And then a much bigger, glitchier part of my brain is like, “Huh? No. Of course I’m not deleting this bad background I just spent 18 minutes doing half of. That would be a total waste of 18 minutes—which would be incredibly unsatisfying. I’m not allowing those 18 minutes to go to waste. I’m finishing the background. If it makes the drawing worse, then that’s just what’ll have to happen.”

So I spend 18 more minutes finishing the background.

Rationalist Julia Galef likens this situation to walking to a store that’s 20 minutes away, only to learn 10 minutes into the walk that the store is closed… and then deciding to “finish the job” and walk all the way to the store anyway, since you already started. Obviously that would be deeply inane—but that’s exactly what I’m doing when I finish my bad background. To avoid having the 18 minutes I already spent go to waste, I’ll waste another 18 minutes, even though the first 18 minutes is already gone and spent either way. It’s a sunk cost.
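To see the decision math laid bare, here’s a minimal sketch using the numbers from Julia’s store example. The 10 minutes already walked never enter the comparison, because no choice you make now can get them back:

```python
# A toy version of the closed-store walk, with the numbers from the example above.
# The only thing a decision can change is the future, so that's all we compare.
minutes_already_walked = 10          # sunk: gone no matter what you decide now

future_cost = {
    "turn back now": 10,             # walk straight home
    "finish the walk": 10 + 20,      # reach the closed store, then walk all the way home
}

best_choice = min(future_cost, key=future_cost.get)
print(best_choice, future_cost[best_choice])   # turn back now 10

# Adding minutes_already_walked to both options wouldn't change which one wins,
# which is exactly why the sunk minutes shouldn't factor into the decision.
```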

We all commit the sunk cost fallacy. Sometimes it leads us to stick with jobs or relationships we know deep down are wrong for us. Sometimes we go the full distance with a long project even though, after having put some work into it, we’ve come to the realization that it wasn’t such a great idea. Sometimes we read the last 250 pages of a book we’re not liking very much because we’ve already read the first 100. In all cases, we do it because we simply can’t bear to acknowledge that some of our time has officially been wasted. So we double (or quadruple) down.

It’s a reasoning error. It makes absolutely no sense—as Julia’s example illustrates—but we do it anyway. Because we’re bad at reality.

The sunk cost fallacy is a famous one, but there are a lot of other common fallacies. Wikipedia lists over 100 of them here. Our reasoning software sucks.

But fallacies aren’t always mistakes. If you’re trying to win an argument and you’re not doing so well, you might try pulling a fallacy out of your bag of dirty tricks. If your opponent doesn’t catch it, it’ll appear to be a beautiful point in your favor.

Political Disney World is pretty big on both accidental and intentional fallacies. Let’s go through some of the most prevalent, in three categories:

Category 1: Fallacies that misrepresent reality

The practice of misrepresenting reality falls on a spectrum with “slight data nudging” on one end and “total fabrication” on the other. Low-rung politics has a long tradition of misrepresenting reality by concocting questionable studies and misleading statistics or by spinning real events in a way that best fits the narrative.

A common type is what I call the Trend-Anecdote Swapper.

It’s simple: If you come across an anecdote that supports the narrative, you put it through the swapper and frame it as evidence of a larger trend to make it seem representative of broader reality. Meanwhile, if there’s an actual trend happening that really is representative of broader reality—but it’s a trend that makes your narrative look bad—you just put it through the swapper, and it’ll come out the other side framed as nothing more than a handful of freak anecdotes.

For example, imagine your tribe’s narrative says that dogs are almost always good boys (and anyone who says otherwise is a bigot), while most raccoons are dangerous, vile creatures (and anyone who says otherwise is a bigot). Now imagine that one week, these six news stories happen:

The actual reality here isn’t really your friend. Your narrative, like all PDW narratives, leaves no room for mixed messages. Dogs are good. Raccoons are bad. Period. Meanwhile, the actual information at hand here suggests that maybe both can be good sometimes and bad sometimes. So you pull out the Trend-Anecdote Swapper and get to work.

You start by categorizing and color-coding the stories as they actually seem to be.

Then, when there’s an inconvenient red trend, you use the Trend-Anecdote Swapper to reframe it as nothing more than an anecdote: 

When there’s a helpful green anecdote, you use the Trend-Anecdote Swapper to make it seem like part of a larger trend.

By the time you’re done, the colors have sorted themselves out nicely: red on the left, green on the right.

Another common fallacy uses what I call the Causation Arrow. The most 101 concept in Statistics 101 has to be: correlation does not imply causation.

A nice example, courtesy of Jonathan Haidt: A 2013 study found that people who have sex more often make more money. If you weren’t being cautious with your Causation Arrow, you might read a headline about the study and jump to the conclusion that having more sex caused people to make more money—or that making more money led people to have more sex. In reality, the study found that a third variable—extraversion—lies behind both the sex and money trends.

Any correlation stat—”variable A is correlated with variable B”—actually leaves us with four possibilities:

In high-rung politics, people assess every correlation and try to determine which of the above is actually going on. But in Political Disney World, people just go with whichever of the four possibilities best supports the narrative. They grab their Causation Arrow and point it in the most convenient direction.
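If you want to see the third-variable case in something more concrete than a cartoon, here’s a minimal sketch in Python, with invented numbers: a hidden “extraversion” score drives both variables, and a sex/income correlation shows up even though neither one causes the other.

```python
import random

# A toy simulation (invented numbers) of a confounder: "extraversion" nudges both
# how often someone has sex and how much they earn, producing a sex/income
# correlation even though changing one would do nothing to the other.
random.seed(0)

def simulate_person():
    extraversion = random.gauss(0, 1)                          # hidden third variable
    sex_per_month = 4 + 2 * extraversion + random.gauss(0, 1)
    income = 50_000 + 8_000 * extraversion + random.gauss(0, 5_000)
    return sex_per_month, income

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

people = [simulate_person() for _ in range(10_000)]
sex, money = zip(*people)
print(f"correlation(sex, income) = {correlation(sex, money):.2f}")  # comes out strongly positive
```

A correlation like that is real, but pointing a Causation Arrow in either direction would still be wrong: the arrow belongs on the third variable.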

 

Of course, presidential debates are full of fighting over Causation Arrows. The incumbent candidate will claim that every positive trend during the past four years was caused by his presidency and every negative trend happened in spite of his great policies. The challenger candidate will say the opposite, in both cases.

Returning to dog-raccoonville, imagine that this graph starts making the rounds on Twitter.

If you’re in the “dogs good / raccoons bad” tribe, you won’t hesitate to pull out the Causation Arrow and use the graph as evidence that raccoons are hurting the city. If you’re in the pro-raccoons tribe, you’ll call the correlation a coincidence or ignore it altogether (and call anyone who shares the graph a bigot). In neither case will you actually be getting to the bottom of why unemployment is going up—which makes sense, because the goal in PDW isn’t a more perfect country, it’s political triumph.

This is an example of how the Causation Arrow can also be used as a Blame Arrow. The pro-dog crowd could use the arrow to further nudge the day’s news in their favor by fiddling with blame in two of the stories: 

Then, to top things off, the pro-dog media channels will add on their own twist:

Of course, that’s just the pro-dog side of things. This whole time, the pro-raccoon tribe has been outraged about a whole different set of stories:

One of the critical defining features of high-rung politics is a shared sense of reality—a shared understanding of What Is. In Political Disney World, the beliefs and viewpoints of people in different tribes are premised on entirely different conceptions of reality. Of course they can’t find any common ground.17

Category 2: Fallacies that misrepresent an argument

In Chapter 7, we talked about how a viewpoint is nothing more than a hypothesis until it’s gone through testing.

The real test of any argument is how well it stands up in the face of rigorous criticism. When you’re confident in your viewpoint, you love a chance to throw it into the ring with other arguments and watch it show off its strength. As in real boxing, the stronger the opponents you’ve beaten, the better your ranking. That’s why a strong college paper always includes a strong counterargument—it lets the thesis “show off” in front of the professor.

But what if you’re not so confident in your viewpoint? And you still want to make it seem like it can do well in the boxing ring? As a procrastinator who wrote a lot of hasty, shitty papers in college, I can tell you firsthand that one of the trademarks of a paper with a weak thesis is an even weaker counterargument.

When exposed to real opponents not afraid to tear apart bad arguments, oversimplified PDW narratives end up TKO’d in round 1. That’s why political Echo Chambers are so intent on making it taboo to criticize the narrative—it’s their way of banning anyone from landing a good hit on their sacred baby.

But to generate the kind of intense conviction in its members that of COURSE the narrative is correct, political Echo Chambers need to make it seem like the narrative is a champion heavyweight boxer who demolishes anyone who tries to prove it wrong. So how can this happen when no actual living, breathing dissenters are allowed to fight the narrative?

Here’s the trick: The Echo Chamber stages scripted fights that seem real to the Echo Chamber’s members, but where the narrative always comes out on top. To pull this off, they use one of the most tried and true tools of the low-rung intellectual world:

The man machine takes real criticism of the narrative and converts it into easy-to-beat opponents. Here are three of the most common:

The Straw Man

To make a Straw Man, the man machine reframes the wording of a strong dissenting argument, transforming it into a much weaker argument.

To see how it works, let’s first watch a standard low-rung political narrative face off against real dissent from outside the Echo Chamber.

As expected, that didn’t go very well. But the man machine can save the day.

We’ve all used this tactic.

When we create Straw Men, we sometimes do it knowingly, sometimes cluelessly. Most of the time, we probably do it with our subconscious knowing what we’re doing but our conscious mind in denial that we’re pulling a cheap trick.

In public arguments, the goal of an arguer isn’t to change the opponent’s mind as much as it is to win over a viewing audience. Here, arguers will use Straw Men in hopes that the audience isn’t smart enough to notice the sleight of hand.

Using a Straw Man can make you appear victorious to unwitting viewers, like a boxer who takes a swing at the balls mid-match and hopes the ref won’t see it. You wouldn’t think it could work, but humans are bad at reality, so straw-manning often goes unnoticed.

In a courtroom or on a debate stage, the opposition at least has a chance to object to or refute a straw man attack. But usually, the opposition doesn’t get a voice at all.7

In Political Disney World, when a cleverly worded Tweet or op-ed straw-mans the opposing side, it goes viral, and soon, the farce boxing match is played on loop throughout the Echo Chamber, ad nauseam.

The Weak Man

The Straw Man is a well-known fallacy. But in the past decade, people have begun talking about what political theorist Robert Talisse calls the Weak Man fallacy.

Straw-manning takes a strong argument and distorts it into a weak one. Weak-manning doesn’t distort anything. It hand-picks the weakest part of the opposing position, or the weakest version of it, and attacks that. After handily defeating the weak argument, you frame it as if you’ve defeated the opposing argument in general.

Partisan media are big fans of the Weak Man. People like Jon Stewart and Tucker Carlson have made entire careers out of weak-manning.8

Weak-manning is why everyone in low-rung politics sees the other side as absolutely indefensible and unforgivable. They’ve been presented again and again with the worst of the other side’s low-rung giant, and they’ve come to believe that it’s representative of the other side as a whole.

The Hollow Man

The Hollow Man does away with the work of distorting or cherry-picking the dissenting argument and just fabricates one from scratch. Often introduced with “some people say” or some other vague framing, the Hollow Man is the ideal opponent for the narrative—the easiest match possible.

In 2004, in order to refute opponents of the Iraq War, George W. Bush said:

“There’s a lot of people in the world who don’t believe that people whose skin color may not be the same as ours can be free and self-govern…I reject that. I reject that strongly. I believe that people who practice the Muslim faith can self-govern. I believe that people whose skins aren’t necessarily—are a different color than white can self-govern.”

In other words:

The Hollow Man argument is a viewpoint held by no one at all, created just to make the opposition look as bad as possible. It seems like a ridiculous tactic—until you remember that Political Disney World is a ridiculous place. Today, in the enchanted castles of PDW, Hollow Men are roaming around everywhere.

In PDW, the power of the man machine goes beyond winning individual arguments. In 1961, social psychologist William McGuire wrote about what he called the “inoculation effect.” Vaccines work by exposing a person’s immune system to a weak version of a dangerous virus. After the body defeats the weak version of the virus, it develops an immunity against all versions of the virus, including the strong ones. McGuire found that people’s beliefs worked in a similar way. He wrote:

[B]eliefs can be “inoculated” against persuasion in subsequent situations involving forced exposure to strong counterarguments by pre-exposing the person to the counterarguments in a weakened form that stimulates—without overcoming—his defenses.

If Straw Man, Weak Man, and Hollow Man arguments are repeated enough inside a political Echo Chamber, they become people’s ubiquitous conception of what dissenters to the narrative think—eternal proof of how right the narrative is and how stupid anyone is who says otherwise. Soon, any version of dissenting arguments—even the strong ones—will be disregarded as nothing more than better-worded versions of the well-known absurd dissent. People will have become “immune” to changing their mind on the topic.

This also makes it even less likely that anyone inside the Echo Chamber will dare challenge the narrative—because the second they do, people will hear it as a defense of all of those terrible arguments and evidence of the challenger’s own stupidity and awfulness. Social penalties will ensue.

But argument-misrepresenting fallacies can do more than attack opponents. They can also be used for defense.

The Motte and Bailey

The “motte and bailey” fallacy is a recently named piece of age-old trickery (coined by Nicholas Shackel and further popularized by Scott Alexander).

The name comes from a type of two-part medieval fortification common in Northern Europe between the 10th and 13th centuries. It looked something like this:

The bailey was a desirable, economically productive area of land to live on, but it was hard to defend and always vulnerable to attack. That’s where the motte came in. The motte was a hill in or adjacent to the bailey with a wooden tower on top of it. When the bailey was threatened, inhabitants would run up the motte and into the tower. The motte, unlike the bailey, was easy to defend and nearly impossible to conquer—so invaders who captured the bailey would be unable to conquer the whole fortification. Eventually, with arrows raining down on them from the motte’s tower, the attackers would give up and leave, at which point the inhabitants could resume life in their pleasant and profitable bailey.

Shackel used the motte and bailey as a metaphor for a cheap argument tactic, whereby someone holding a convenient but not-very-defensible “bailey” viewpoint could, when facing dissent to that viewpoint, quickly run up the motte and swap out the viewpoint with a far stronger “motte” position.

 

The motte and bailey uses the man machine in reverse—instead of swapping an opponent’s strong argument for a weaker one, it swaps out your own questionable argument for an irrefutable one. The goal is to make it seem like the two arguments are essentially the same, and that anyone who agrees with the motte statement must also agree with the bailey argument. It’s an attempt to stitch the two positions together and use the sturdy motte as armor for the vulnerable bailey.

Political Disney World is a land of sprawling baileys, dotted with motte hills. And if you listen carefully, you’ll notice people darting up to their trusty mottes, using them as trump cards whenever their views come under fire.

Fallacies that misrepresent arguments let people twist, mold, and fabricate arguments in order to engineer faux boxing matches. These tactics go a long way toward making the PDW giant nearly invincible to the outside world. But when all else fails, low-rung political thinkers can reach into their bag for the dirtiest trick of all:

Category 3: Fallacies that misrepresent people

Paul Graham once laid out what he calls his hierarchy of disagreement, which can be summed up like this:9

According to Graham, the lowest forms of disagreement are attacks on the person arguing against you instead of on the argument itself. On the very bottom level, name-calling is the trashiest form of argumentation and the trademark of someone who knows they have little ability to win a real debate. Name-calling is also often a sign that an argument’s substance isn’t really relevant, because the disagreement is mostly a vehicle two people are using to vent anger onto each other. In any case, no one in human history has ever gotten to the bottom of anything while throwing insults. It can be fun though.

One level up, you have the slightly more civilized ad hominem fallacy. People often use “ad hominem” as an umbrella term that includes name-calling, but here, we’re referring to the specific practice of discrediting dissent based on who the dissenter is instead of addressing the argument itself. A close cousin is invoking your own authority on the matter to add credibility to your position rather than strengthening the argument.

In Political Disney World, ad hominem arguments happen constantly, partially because people on the low rungs are childish arguers—but also because on the low rungs, ad hominem arguments are incredibly effective. The reason they’re effective is that the less someone knows about the substance of an issue, the more they’ll form their judgments based on how much they trust the messenger. In low-rung politics, people who seem trustworthy also tend to seem correct and well-intentioned, regardless of the quality of their arguments. And vice versa.

Standard tribalism takes care of most of the trust allotment. Earlier this year, professors Steven Sloman and Elke Weber compiled a wide range of articles exploring the science behind political polarization. Many of the findings confirmed what intuition suggests: that people are highly uncharitable in their assumptions about those in their political out-group. For example, if an opposing candidate has mostly mainstream views but holds a few extreme positions, people tend to assume that the candidate’s supporters voted for them because of, not in spite of, the candidate’s extreme positions. But there’s no evidence that this is true. Another study found that “constituents are likely to attribute the actions of in-group leaders as intended to benefit the country (national interests), and the actions of out-group leaders as intended to benefit the political leaders themselves (egoistic interests)”—even when the actions in question are identical.

So people in PDW are already predisposed to not trust those who challenge the narrative—and therefore, to not believe their arguments, regardless of the substance. But a strong tradition of ad hominem reasoning helps cement this key stability mechanism.

Enemies of a political Echo Chamber are regularly discredited based on their background, their religion, their race, their gender, their education, their profession, their friendships—none of which addresses whatever clearly-wrong, not-even-worth-listening-to argument they’re actually making.

Dissenters are smeared by quotes pulled out of context, a tactic that can double up on misrepresenting the person and misrepresenting their argument. Often, a regrettable quote from a decade earlier is reason enough in PDW to rule out anything a dissenter ever says again—even if the dissenter swears they no longer believe that thing they said back then.

If those don’t do the trick, there’s always mind-reading—where disciples of a narrative will assume the worst about the dissenter’s real, true, deep-down intentions (like people assuming that opposition candidates are motivated by selfishness while being more charitable with their preferred candidates). Political Disney World scales this up until everyone in the Echo Chamber is convinced that anyone who wants to curb immigration is racist, or everyone who opposes a war effort is unpatriotic, or everyone who supports tax cuts is greedy, or anything else that helps the Echo Chamber write off those who challenge the narrative.

In the most extreme Echo Chambers, the discrediting of arguments and people forms an interlocking chain of dismissal. Once a given position is branded as terrible and wrong, anyone holding that position is automatically branded as wrong-headed, which in turn leads people to write off all of their other positions as well. In other cases, once a well-known person is deemed by an Echo Chamber to be bad, their viewpoints become tarnished with the same reputation, which then extends to anyone else who happens to hold those same positions. It’s like a discredit disease that spreads.

With a further step back, we can see how all of these fallacies work in tandem with the Echo Chamber’s information-filtering system. The filters let friendly info in, the fallacies twist it to make it even friendlier, then the filters further refine things by elevating the best-manipulated info into further prominence. This ongoing tag-team effort is so effective that not only will everyone in PDW have the same digitized viewpoint on every issue, they’ll be saying the same exact sentences about it, word for word.

When everyone is saying the same thing, a feedback loop takes hold—the kind we talked about in Chapter 1 (when we were supposedly talking about our ancestors):

You can take humans out of the Power Games…

Politics in 3D

Our Psych Spectrum has helped us see the usual left-center-right—

—in 2D, where it looks more like an arch.

Our third dimension—Emergence Tower—lets us see an even bigger picture. What looks like an arch of 300 million individuals on the lowest floor of Emergence Tower looks like four giants from higher up on the tower:

The people who make up the high-rung giants aren’t that different from the people in the low-rung giants. But the giants themselves are nothing alike. Low-rung giants are the product of ancient human survival software—they’re the kinds of giants that the software builds when it’s able to run the show. In the high-rung giants, Higher Minds have managed to band together to define the culture and override the software’s usual output.

In Part 2 of this series, we kept things simple and imagined how a country like the U.S. might work in an ideal scenario. Under the First Amendment’s protection, the U.S. would become a grand marketplace of ideas where the minds of individual Americans would link up like neurons and form a giant superbrain. Individual thinking on most topics would yield a Thought Pile with a clean bell curve shape, and that shape would be lit up with activity by a Speech Curve that would sit right on top of it.

As people talked, the big brain would think, and over time, it would ooze its way along Thought Spectrums to ever wiser places.

This is kind of what does happen in the U.S. today. Except there’s a big asterisk.

What we didn’t talk about in Part 2 were the inevitable Echo Chambers that would resist Enlightenment Values and function culturally like mini dictatorships. Echo Chambers are like frozen spots in a free nation’s superbrain—dark regions of the brain where thinking can’t happen.

If high-rung politics is a marketplace of ideas that yields bell curves along the Idea Spectrum, the frozen Echo Chambers of low-rung politics look more like tall vertical towers. Put together, they make most political topics look like a camel.

A camel curve moves toward progress more slowly than a bell curve. The science and business worlds can advance quickly because bad ideas fail quickly. In the world of ideas, Echo Chambers, with their sacred and taboo viewpoints, keep bad ideas alive way longer than they would survive in a normal marketplace. With so many voters locked up in the humps, politicians have to spend a lot of their energy catering to the low-rung ideas and speaking to the low-rung political mentality. The humps distort the shape of the Overton window, making the national brain less intelligent, less adaptable, less rational, and less wise.

None of this means the system isn’t working. As we’ve discussed, the vision of the Enlightenment wasn’t to completely repress the human Primitive Mind—it was to ensure that unlike most societies in the past, the Primitive Mind wouldn’t be able to completely take over. It wasn’t meant to generate perfect bell curves of national thinking—it was meant to thaw out static frozen towers enough to end up with stubborn but movable camel humps. With a species like ours, this may be the best we can hope for.

Let’s zoom out further. If we move another floor up Emergence Tower, we can see a country like the U.S. as two huge political giants.

One way to do that is to slice our 2D political space down the middle vertically, leaving us with a Left giant and a Right giant.

The real Left—the complete Left—is the combination of the high-minded, high-rung progressive giant up top and the primitive-minded, Power-Games-playing blue giant down below. Same deal for the Right.

Each of these giants is like a large-scale human being—the product of an internal struggle between fire and light.

Each of us is on our own little mountain, ebbing and flowing in maturity and wisdom. We all have good days and bad days, good years and bad years. We’re each a mix of admirable qualities and character flaws, and we spend our lives trying to become a little better. We’re all human, and so is our society.

Like each of us, the political Left and Right are in a constant struggle to grow up. Sometimes they’re childish. Sometimes they’re wise. Like each of us, they can grow up with age—and like each of us, they also sometimes revert and go backwards.

Every person is working on two projects all the time: them against the world and them against themselves. High-rung political giants are in the same situation, fighting a two-front battle at all times: a horizontal battle against their high-rung counterpart, in the struggle to determine how the country changes and evolves; and a vertical battle against the low-rung giant that masquerades under the same political banner—a battle that, if lost, threatens to destroy the high-rung giant’s reputation, hijack its movements, and undermine its progress.

There’s another way political parties are like people: in both cases, the individual struggle of one can influence the individual struggles of others nearby.

When a couple gets into a fight, it’s often because their Primitive Minds have started going at it with each other. The Primitive Mind of one member of the couple doesn’t want to fight with the Higher Mind of the other—it wants to fight with its primitive little friend. When it’s worked up, it calls the other Primitive Mind out to play, and it usually gets a response. A vicious cycle takes hold as things quickly devolve into nastiness. When one of the Higher Minds in the couple manages to wrest control of their person for long enough to get a word in—something like, “I do see where you’re coming from, I’d feel frustrated in this situation too”—the fight pretty quickly winds down. Once the Higher Minds start communicating with each other, they can regain the edge and take control of the interaction.

Between what I’ve observed about politics and what I’ve read about history, political giants seem to work the same way. If, instead of looking at the two-giant U.S. as Left versus Right, we slice our political region horizontally, we see two pairs that function as teams as much as they do as adversaries.

The high-rung giants argue with each other constantly, but they know they’re ultimately on the same team with the same overarching goal. It’s harder to see it on the bottom, but the low-rung giants are a team too. Remember, without Jafar, Aladdin is just some guy. The low-rung giants need their counterpart. It’s the key villain in their narrative—the key uniting force that holds everything together. Nothing delights members of a low-rung giant more than the other low-rung giant behaving badly. It makes them furious, but in a super fun way. It lights their fires and injects meaning into their lives. And it justifies a wave of their own childish behavior, which in turn fires up their rival giant even more—like what happens to a couple as they descend into a nastier and nastier fight. When the low-rung giants really get each other riled up, the high-rung giants become increasingly helpless and muted.

People in the high-rung political world think of politics as a positive-sum game, and the way they do politics, it is. The clash of the high-rung giants is a classic Value Games clash—it yields progress and wisdom.

In the low-rung political world, politics is seen as a zero-sum game—when one side wins, the other loses, and that’s that. But the actual game they’re playing ends up being negative-sum. Their fighting pulls the country downward on the same mountain the high-rung giants are trying to climb.

I finished Part 2 with a depiction of the U.S., trudging up the mountain on its mission to become a more perfect nation:

Back then, we could only see the nation as it looked on the surface. Now, with some more tools in our bag, we can look deeper into the image and see the situation for what I’ve come to believe it really is: an eternal tug-of-war between the nation’s collective Higher Mind and the nation’s collective Primitive Mind.

This is the real political picture in the U.S. It’s not only Right vs. Left. It’s High vs. Low. Forward vs. Backward. Wise vs. Foolish. Value Games vs. Power Games. It’s not only wing politics—it’s also rung politics. Many of our political struggles are, in fact, horizontal. But that’s all in the shadow of the big political tug-of-war. Which is vertical.

___________

This was me, heading off to college:

The world was my oyster. It was exciting. But then the political conversations started.

For the first time in my life, my political views were being challenged. It was like I was standing there living my life and these new friends were trying to shove me off a cliff:

I didn’t know it at the time, but I was standing on a well-worn intellectual path, commonly referred to as the Dunning-Kruger effect.18 Here’s how I think of it.

It’s a lot like a roller coaster. At the time, I had spent my life doing the roller coaster’s big first creeping uphill part. Suddenly, I was at that terrifying moment where the car levels out and starts to tilt downward…

I was left with two options:

Option 1: Stay up on Child’s Hill. I could decide that I didn’t actually like these friends after all, that they were arrogant ignorant assholes, and distance myself from them. I could seek out new friends more like the people I was used to talking to and try to forget about this whole bad early college experience. Re-isolate myself from dissent, reconfirm my established beliefs, and restore my confidence (which the backfire effect suggests wouldn’t have taken long).

Option 2: Take the plunge. Let go of my comfortable conviction and embrace these new bad feelings of self-doubt and existential confusion.

I went tumbling.

As I tumbled, it sunk in that to be as opinionated as I had been entering college, you either have to be an expert or full of shit—and I wasn’t an expert. I was a Democrat mostly for the same reason that I was a Red Sox fan. They were my team, and that was that.

Pretty soon I had no idea what I thought or who I was or what was right or wrong. I didn’t feel like a proud Democrat anymore. Tim the Democrat was a fraud and I was determined not to be a fraud ever again. But a Republican? Me? A Republican? No way. I had been indoctrinated too hard for too long to fully switch teams. I started to dread political conversation because I wasn’t sure who I was supposed to be when these conversations happened. It was a bad situation. I was here:

Insecure Canyon is where you are when you’re past the “Wait I actually don’t know shit” epiphany, but not yet past the “Ohhhh no one else knows shit either” epiphany. The two-part epiphany, when still incomplete, leaves a thinker self-aware enough to know what they don’t know but not yet wise enough to know that not knowing is a healthy, productive state. The unpleasant feeling of existential confusion and intellectual insecurity is the gateway drug to real intellectual growth—but when you haven’t had the complete epiphany, it doesn’t feel that way. It feels shameful and embarrassing. You feel stupid and wishy-washy, and you hope no one finds out how little you know. That’s where I was.

And then it happened. I was in my freshman dorm room and one of my roommate’s friends was hanging out, and he said something like, “And really, all the reasonable people are centrist anyway.”

It all clicked. I was a Centrist. It was the perfect new identity. Fuck all those political extremists. I was a thoughtful, nuanced, moderate thinker who acknowledged that both sides had some good points and some bad points.

We all look back on our previous selves and cringe about certain things. We’ve each got a list. Right near the top of mine is me coming home for Thanksgiving during my freshman year of college and declaring to anyone who would listen that I was a Centrist. Wincey as fuck.

People in Insecure Canyon are super vulnerable. They’re perfect targets for indoctrination into a new dogma, because they’re still too hazy to understand how knowledge works, and they’re dying to feel smart again. That’s why many people in Insecure Canyon end up making the mistake Tim the Centrist Moderate Independent made—they jump onto another dogma boat. This feels like a step forward. But it’s the opposite. It’s a young chick flying for the first time, feeling the cold winds, and making a U-turn right back to the nest. This is what I did. I had tried to solve the bad feelings of Insecure Canyon by running back up to the top of Child’s Hill, just with a new identity cloak on. I went from a Fraud Democrat to a Fraud Centrist.

The whole thing reminds me of a drawing from another post.

Thankfully, some self-awareness eventually crept in. My brief foray into Centrism turned out to be like getting out of a long relationship with a crazy person only to immediately jump into a rebound fling with the next person I met. But the fling had taught me something. If I were ever going to really figure out who I was and get myself into a healthy future relationship, I’d have to be okay with being single for a while.

So my identity shifted again, this time to a guy who was Still Asking Questions. I became a SAQist.

Over the next few years, I started to look up for the first time and notice the y-axis of the political space. This whole time, I had been staring down at the ground, searching for the right spot along the What You Think axis—when the real answer was above me.

Looking up at the vertical axis for the first time, I felt like these monkeys.

On the roller coaster, I was now standing here, a born-again SAQist, ready to start a life of climbing:

I’d like to tell you that it’s been a straightforward trudge up Grown-Up Mountain since then.

But old habits die hard, and it turns out it’s really hard to stay on Grown-Up Mountain. When I declared myself an unattached SAQist, I didn’t realize just how attached my Primitive Mind was to the color blue.

I’d go through all the right motions—reading op-eds by the most convincing conservative writers and seeking out flaws in Democratic politicians and their platforms. I played the “Why?” Game with myself about my lingering instinct that the Left’s policies were more logical and more reasonable, and I searched for evidence that the instinct was no more than a bad habit. I genuinely began to feel conflicted and confused about whether the Right or the Left made more sense when it came to fiscal and foreign policy and the optimal size of government.

But then election season would come around, and I’d feel like I was rooting for the Red Sox again. The Democrats still felt like “my people,” no matter how hard I tried to shake the feeling off. Were the Democrats actually just more in line with my values, or was it just my Primitive Mind doing this? Or was it a little of both?

Whatever the cause of my attachment, the Republicans of the 2000s—with their Iraq War and their snowballs and their traditional marriage and their stem cell bans—weren’t helping the situation. As I tried to rid myself of the notion that the Democrats were “my people,” the Republicans—with their Sarah Palin and their Sean Hannity and their Perry ad and their just watch this for 30 seconds—would continually make it crystal clear that they were certainly not my people.

Well good news! Over the past decade, the Left finally did it. They regressed so far that they became as “not my people” as the Republicans. They actually went insane enough to free me from my tribal handcuffs. I spent a lot of years saying I was “an Independent” while not truly believing it. Today, I can say it with a straight face.

It’s amazing how much clearer your vision gets when you really—actually—separate your identity from a tribe. I can see reality better now. The bad news is that I don’t like what I see with my new eyes. It’s…the situation is pretty scary.

We’ve got a problem and we need to fix it.

This whole series so far has been getting us ready to dive headfirst into that problem, with clearer eyes than usual. That’s where we’ll be headed in the final group of chapters.

Chapter 10: A Sick Giant

___________

If you like Wait But Why, sign up for the email list and we’ll send you new posts right when they come out. It’s a super unannoying list, I promise.

Huge thanks to our Patreon supporters for making this series free for everyone. To support Wait But Why, visit our Patreon page.

___________

More vertical tugs-of-war:

The productivity tug-of-war

The social tug-of-war

The awareness tug-of-war

___________

Sources and related reading

At the heart of an effort to grow in our political lives has to be a continual effort to get better at thinking and communicating. There are a lot of great writers on the internet dedicating themselves to helping people think and argue more rationally. I’ve learned a lot from them. Some of my favorites:

The mecca of rationalism, Less Wrong, run by Eliezer Yudkowsky and his ragtag gang of rationalists. Whenever there’s a cutting-edge new idea making the rounds, Eliezer was writing about it 5-10 years ago. A deep dive on Less Wrong will make you smarter. This collection is a nice place to start.

A Less Wrong offspring, Scott Alexander’s blog Slate Star Codex is a giant pile of clarity. If you liked this post, you’ll really like SSC. Specific further reading on ideas in this post: Scott on motte-and-baileying, weak-manning, and the inoculation effect.

Another big pile of wisdom: Paul Graham’s essays. You can read about his “hierarchy of disagreement” I referenced here.

Julia Galef, co-founder of the Center for Applied Rationality, is a great explainer of rational concepts. Go on a spiral through these sometime.

Adam Grant spends his life using research to embarrass conventional wisdom. Exceptional communicator but very bald.

Shane Snow dives deep on how we can think better, and he makes it fun. His awesome article on intellectual humility is especially relevant to this series.

Other resources:

A great collection of research that I referenced in the post: The Cognitive Science of Political Thought. And a summary of some of the findings.

The study I referenced about how we process challenges to our political and non-political beliefs with different parts of our brain. By Jonas T. Kaplan, Sarah I. Gimbel, and Sam Harris. The article’s citation list is full of interesting research. Other studies I referenced about how politics makes us bad at thinking: 1, 2, 3, 4, 5

Some nice examples of straw-manning and weak-manning in politics, by Yvonne Raley and Robert Talisse (who seems to have coined the term “weak man”). To go deeper, here’s their paper on the topic.

Research on how progressives tend to be more concerned about the global and conservatives more about the local. By Adam Waytz, Liane Young, Ravi Iyer, and Jonathan Haidt.

Cool interactive exploring how Fox, CNN, and MSNBC differ in what stories they cover and how they present them.

Wikipedia has nice compilations of cognitive biases and fallacies.

Rapoport’s Rules for how to be a great arguer by doing the opposite of straw-manning (sometimes called steel-manning).

The original explanation of the motte and bailey doctrine by Nicholas Shackel, who coined the term.

The original explainer on the inoculation effect.

Fun reminder of how idiotic it is to assume correlation implies causation.

A book to remind you that you don’t know shit.