(1700 words; 8 minute read.)
[9/4/21 update: if you'd like to see the rigorous version of this whole blog series, check out the paper on "Rational Polarization" I just posted.]
A Standard Story
I haven’t seen Becca in a decade. I don’t know what she thinks about Trump, or Medicare for All, or defunding the police.
But I can guess.
Becca and I grew up in a small Midwestern town. Cows, cornfields, and college football. Both of us were moderate in our politics; she a touch more conservative than I—but it hardly mattered, and we hardly noticed.
After graduation, we went our separate ways. I, to a liberal university in a Midwestern city, and then to graduate school on the East Coast. She, to a conservative community college, and then to settle down in rural Missouri.
I––of course––became increasingly liberal. I came to believe that gender roles are oppressive, that racism is systemic, and that our national myths let the powerful paper over the past.
You and I can both guess how her story differs. She’s probably more concerned by shifting gender norms than by the long roots of sexism; more worried by rioters in Portland than by police shootings in Ferguson; and more convinced of America’s greatness than of its deep flaws.
In short: we started with similar opinions, set out on different life trajectories, and, 10 years down the line, we deeply disagree.
So far, so familiar. The story of me and Becca is one tiny piece of the modern American story: one of pervasive—and increasing—political polarization.
It’s often noted that this polarization is profound: partisans now disagree so much that they often struggle to understand each other.
It’s often noted that this polarization is persistent: when partisans sit down to talk about their now-opposed beliefs, they rarely rethink or revise them.
But what’s rarely emphasized is that this polarization is predictable: people setting out on different life trajectories can see all this coming. When Becca and I said goodbye in the summer of 2010, we both suspected that we wouldn’t be coming back. That when we met again, our disagreements would be larger. That we’d understand each other less, trust each other less, like each other less.
And we were right. That’s why I haven’t seen her in a decade.
Told this way, the story of polarization raises questions that are both political and personal. What should I now think of Becca—and of myself? How should I reconcile the strength of my current beliefs with the fact that they were utterly predictable? And what should I reach for to explain how I came to disagree so profoundly with my old friends?
The standard story: irrationality.
The story says, in short, that politics makes us stupid. That despite our best intentions, we glom onto the beliefs of our peers, interpret information in biased ways, defend our beliefs as if they were cherished possessions, and thus wind up wildly overconfident. You’ve probably heard the buzzwords: “confirmation bias”, “the group-polarization effect”, “motivated reasoning”, “the overconfidence effect”, and so on.
This irrationalist picture of human nature has quite the pedigree—it has won Nobel Prizes, started academic subfields, and embedded itself firmly in the popular imagination.
When combined with a new wave of research on the informational traps of the modern internet, the standard story offers a simple explanation for why political polarization has exploded: our biases have led us to mis-use our new informational choices. Again, you’ve probably heard the buzzwords: “echo chambers”, “filter bubbles”, “fake news”, “the Daily Me”, and so on.
The result? A bunch of pig-headed people who increasingly think that they are right and balanced, while the other side is wrong and biased.
It’s a striking story. But it doesn’t work.
It says that polarization is predictable because irrationality is predictable: that Becca and I knew that, due to our biases, I’d get enthralled by liberal professors and she’d get taken in by conservative preachers.
But that’s wrong. When I looked ahead in 2010, I didn’t see systematic biases leading to the changes in my opinions. And looking back today, I don’t see them now.
If I did see them, then I’d give up those opinions. For no one thinks to themselves, “Gender roles are oppressive, racism is systemic, and national myths are lies—but the reason I believe all that is that I interpreted evidence in a biased and irrational way.” More generally: it’s incoherent to believe that your own beliefs are irrational. Therefore, so long as we hold onto our political beliefs, we can’t think that they were formed in a systematically irrational way.
So I don’t see systematic irrationality in my past. Nor do I suspect it in Becca’s. She was just as sharp and critically-minded as I was; if conservative preachers changed her mind, it was not for a lack of rationality.
It turns out that tellers of the irrationalist tale must agree. For despite the many controversies surrounding political (ir)rationality, one piece of common ground is that both sides are equally susceptible to the factors that lead to polarization. As far as the psychological evidence is concerned, the “other side” is no less rational than you––so if you don’t blame your beliefs on irrationality (as you can’t), then you shouldn’t blame theirs on it either.
In short: given that we can’t believe that our own beliefs are irrational, the irrationalist explanation of polarization falls apart.
Suppose you find this argument convincing. Even so, you may find yourself puzzled. After all: what could explain our profound, persistent, and predictable polarization, if not irrationality? As we'll see, there's a genuine philosophical puzzle here. And when we can't see our way to the solution, it's very natural to fall back on irrationalism.
In particular: since we can’t view our own beliefs as irrational, it’s natural to instead blame polarization on the other side’s irrationality: “I can’t understand how rational people could see Trump so differently. But I’m not irrational—so the irrational ones must be Becca and her conservative friends, right?”
That thought turns our disagreement into something more. Not only do we think the other side is wrong—we now think they are irrational. Or biased. Or dumb. And that process of demonization—more than anything else—is the sad story of American polarization.
A Reasonable Story
What if it need not be so? What if we could think the other side is wrong, and not think they are dumb? What if we could tell a story on which diverging life trajectories can lead rational people—ones who care about the truth—to be persistently, profoundly, and predictably polarized?
That’s what I’m going to do. I’m going to show how findings from psychology, political science, and philosophy allow us to see polarization as the result of reasonable people doing the best they can with the information they have. To argue that the fault lies not in ourselves, but in the systems we inhabit. And to paint a picture on which our polarized politics consists largely of individually rational actors, collectively acting out a tragedy.
Here is the key. When evidence is ambiguous––when it is hard to know how to interpret it—it can lead rational people to predictably polarize.
This is a theorem in standard (i.e. Bayesian) models of rational belief. It makes concrete and confirmed empirical predictions. And it offers a unified explanation of our buzzwords: confirmation bias, the group-polarization effect, motivated reasoning, and the overconfidence effect are all to be expected from rational people who care about the truth but face systematically ambiguous evidence.
More than that: this story explains why polarization has exploded in recent decades. Changes in our social and informational networks have made it so that, with increasing regularity, the evidence we receive in favor of our political beliefs tends to be unambiguous and therefore strong, while the evidence we receive against them tends to be ambiguous and therefore weak. The rise of this systematic asymmetry is what explains the rise in polarization.
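To make the asymmetry mechanism concrete, here is a minimal sketch in code. This is my own illustrative toy model, not the formal one from the technical appendix: the likelihood ratios are invented numbers, and "ambiguity" is modeled crudely as an agent only partially extracting the strength of hard-to-interpret evidence. Two Bayesian agents start at credence 0.5 and see equally strong streams of pro and con evidence; the only difference is which stream is ambiguous for each.

```python
import math

def update_log_odds(log_odds, likelihood_ratio):
    """Bayesian update in log-odds form: add the log of the likelihood ratio."""
    return log_odds + math.log(likelihood_ratio)

def credence(log_odds):
    """Convert log-odds back into a probability."""
    return 1 / (1 + math.exp(-log_odds))

# Illustrative assumption: every signal has the same true evidential strength
# (likelihood ratio 2), but an ambiguous signal is only partially extracted
# (effective likelihood ratio 1.25).
CLEAR, AMBIGUOUS = 2.0, 1.25

def simulate(n_rounds, pro_evidence_is_clear):
    """One agent who alternately sees one pro and one con signal per round.
    If pro_evidence_is_clear, confirming evidence arrives unambiguously (full
    strength) while disconfirming evidence arrives ambiguously (discounted);
    otherwise the reverse."""
    log_odds = 0.0  # start at credence 0.5
    for _ in range(n_rounds):
        if pro_evidence_is_clear:
            log_odds = update_log_odds(log_odds, CLEAR)          # clear pro signal
            log_odds = update_log_odds(log_odds, 1 / AMBIGUOUS)  # ambiguous con signal
        else:
            log_odds = update_log_odds(log_odds, 1 / CLEAR)      # clear con signal
            log_odds = update_log_odds(log_odds, AMBIGUOUS)      # ambiguous pro signal
    return credence(log_odds)

# Same starting credence, same underlying evidence, opposite asymmetries:
left = simulate(10, pro_evidence_is_clear=True)
right = simulate(10, pro_evidence_is_clear=False)
print(left, right)  # the two agents drift to opposite extremes
```

Neither agent ignores contrary evidence; each simply gets less out of the evidence that is ambiguous for them, and over time that small per-round asymmetry compounds into a large divergence.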
In short: the standard story is right about which mechanisms lead people to polarize, but wrong about what this means about people. People polarize because they look at information that confirms their beliefs and talk to people that are like-minded. But they do these things not because they are irrational, biased, or dumb. They do them because it is the best way to navigate the landscape of complex, ambiguous evidence that pervades our politics.
That’s the claim I’m going to defend over the coming weeks.
Here’s how. I’ll start with a possibility proof—a simple demonstration of how ambiguous evidence can lead rational people to predictably polarize. Our goal will then be to figure out what this demonstration tells us about real-world polarization.
To do that, we need to dive into both the empirical and theoretical details. In what sense has the United States become increasingly “polarized”—and why? What would it mean for this polarization to be “rational”—and how could ambiguous evidence make it so? How does such evidence explain the mechanisms that drive polarization—and what, therefore, might we do about them? I’ll do my best to give answers to each of these questions.
This, obviously, is a big project. That means two things.
First, it means I’m going to present it in two streams. The core will be this blog, which will explain the main empirical and conceptual ideas in an intuitive way. In parallel, I’ll post an expanding technical appendix that develops the details underlying each stage in the argument.
Second, it means that this is a work in progress. It’ll eventually be a book, but—though I’ve been working on it for years—getting to a finished product will be a long process. This blog is my way of nudging it along.
That means I want your help! The more feedback I get, the better this project will become. I want to know which explanations do (and don't) make sense; what strands of argument are (and are not) compelling; and—most of all—what you find of value (or not) in the story I'm going to tell.
So please: send me your questions, reactions, and suggestions.
And together, I hope, we can figure out what happened between me and my old friends—and, maybe, what happened between you and yours.
If you’re interested in this project, consider signing up for the newsletter, following me on Twitter, or spreading the word.
Next post: an experiment that demonstrates how ambiguous evidence can lead people to polarize.
PS. Thanks to Cosmo Grant, Rachel Fraser, and Ginger Schultheis for helpful feedback on previous drafts of this post—and to Liam Kofi Bright, Cailin O'Connor, Kevin Zollman, and especially Agnes Callard for much help and advice getting this project off the ground.
9/5/2020 06:05:25 pm
Thank you for sharing this project. After reading the article and working through some of the points that confused me, I have the following questions.
9/6/2020 03:40:29 am
Thanks for the great questions! Yeah, a lot of these issues will come up in future posts in more detail, but here's some quick thoughts:
9/6/2020 01:15:36 am
From game theoretic simulations comes a striking result: When holding a certain level of cooperative equilibrium as a constant, tolerance varies as a function of density, or how many other players you expect to encounter in a given time interval. Expecting people living in different densities to display the same tolerance is literally asking them to undermine the cooperative equilibrium on which their social milieu depends.
9/6/2020 03:27:29 am
Sounds fascinating---I haven't heard of this result. Could you send a reference? Would love to learn more!
9/6/2020 03:54:38 am
I don't know if ambiguous evidence can cause people to rationally polarize. I'll have to wait and see on that one. But one thing that can't be true is that you are always rational and yet expect your future beliefs to change in a particular direction.
9/6/2020 04:04:01 am
I'm open to the idea that certain information can cause people to rationally polarize and even that your exposure to such information can be predictable. But you can't rationally predict that your beliefs will change in a specific direction. That's the only aspect of your story I feel clearly shows irrationality.
9/6/2020 07:56:31 pm
That's a good consideration to think through. I've got about four different ways in mind that might respond to that concern.
9/7/2020 05:14:07 pm
Let me just add, on part B (where I spoke too quickly, as it involves biased evidence but not your bias): you are still violating rationality there. Informally, I'd put the point this way: you can always compensate for a bias you think will be there, and that compensation can be made smaller and smaller. Eventually you can find some compensation factor so small that you aren't sure which direction (if any) you'll be biased in after applying it. But then that compensated method of updating is the rational one, not the uncompensated one.
9/8/2020 11:00:06 am
Ha, you hit the nail on the head, Peter. That is *exactly* what this is all about. For it turns out that the sense of "ambiguous evidence" I mean here (higher-order uncertainty; failure of rational credences to be introspective) is (in a sense to be made precise in week 4) necessary and sufficient for failures of that "expectation-matching" principle you mention. In particular, I can be rational to have a given credence 0.5 in Q, and be rational to have an expectation for my future (rational, no-info-loss) credence that is higher in Q.
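That claim can be checked in a tiny toy model. The construction below is my own (a standard non-partitional-evidence setup in the Williamson tradition, not necessarily the one from week 4): three worlds, a uniform prior, and overlapping evidence sets, so that at some worlds you can't be sure what your evidence is. The code verifies that at world 1 the rational credence in Q is exactly 1/2, while the rational expectation of the rational credence in Q is 3/4.

```python
from fractions import Fraction

# Three worlds with a uniform prior, and a non-partitional evidence map:
# at each world you learn a set of worlds, but the sets overlap, which
# generates higher-order uncertainty about your own evidence.
WORLDS = [1, 2, 3]
EVIDENCE = {1: {1, 2}, 2: {2, 3}, 3: {3}}

def cred(world, prop):
    """Rational credence in `prop` at `world`: the uniform prior
    conditioned on the evidence received at that world."""
    ev = EVIDENCE[world]
    return Fraction(len(ev & prop), len(ev))

Q = {2, 3}

# At world 1, the rational credence in Q is exactly 1/2...
now = cred(1, Q)

# ...but the rational expectation of the rational credence in Q is higher:
# average cred(w, Q) over the worlds consistent with world 1's evidence.
expected = sum(cred(1, {w}) * cred(w, Q) for w in EVIDENCE[1])

print(now, expected)  # 1/2 vs 3/4: expectation-matching fails
```

Intuitively: at world 1 you can't rule out being at world 2, where the evidence settles Q; so while your current rational credence sits at 1/2, you rationally expect the rational credence to lean toward Q.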
9/10/2020 03:58:04 pm
That is super interesting and I will stay tuned. I guess I don't really believe there is a fact of the matter about what "rationality" refers to, so I'm fine if someone wants to define these models as in some sense irrational (even if it's just that they got to the word first). We already know we aren't logically omniscient, so the real question is which models are more useful for understanding individuals and for choosing when to critique them; I'm very much open to the idea that the kind of models you're talking about are superior in that regard for the questions we care about in this context.
9/6/2020 07:32:12 pm
I could hardly agree more. It looks like you're bound and determined to write everything I've been thinking about two years earlier than me. I guess here is one small difference in the theoretical apparatus I would want to use: I think that a top-down 'respecting the evidence' that can allow you to rationally believe things that violate probability theory or first order logic does a lot of the work here. I think that might be the same as the role you're ascribing to 'ambiguous evidence'.
9/9/2020 06:47:41 am
Glad to hear we're on the same wavelength! Sorry to be jumping the gun :).
9/9/2020 03:24:49 pm
Fascinating post, Kevin.
9/11/2020 10:51:48 am
Thanks for the question, Adam! I agree this is a natural line for the irrationalist story to take, and certainly there are people who will say that. The point I want to press is that anyone who really internalizes the irrationalist story isn't going to be able to think this *while* holding on to their own political beliefs. If they know the irrationalist story well, then they know that the mechanisms it describes apply equally to both sides of the political spectrum.
9/9/2020 03:48:02 pm
Not a philosophical comment at this stage: but are you aware that Scott Adams - author of Dilbert cartoons - writes interestingly on how, in the (non-science) public sphere, notionally rational human beings routinely process apparently the same data to come up with polarised views?
9/11/2020 10:58:15 am
Cool, I didn't know about Adams's post on this. Thanks for sharing!
9/9/2020 04:03:19 pm
This project sounds very promising, and I am especially intrigued about the nature of "ambiguity" and the Bayesian nature of this fascinating project--specifically, the relation between our political and moral preferences and our credences or "degrees of belief" in political and moral propositions.
9/11/2020 11:02:53 am
Thank you! Yes, there's going to be a lot of subtlety involved at times in translating between Bayesian models of attitudes and our moral/political beliefs and values. This'll probably come up more in the appendix than take center stage in the blog, in part because the models I'm using let us slot in claims with any content (whether it's "Trump will win the election" or "It'd be good if Trump won the election") as the things people polarize on. But do let me know if at certain stages it feels especially relevant!
9/9/2020 05:35:56 pm
Do you ever cite political scientists? There is a vast literature about this very topic in the field that actually studies politics.
9/11/2020 11:09:36 am
Of course! See the current version of the technical appendix (https://www.kevindorst.com/reasonably-polarized-technical-appendix.html) for a start---but there'll be many more to come, especially in the empirical sections.
9/9/2020 06:20:36 pm
"People polarize because they look at information that confirms their beliefs and talk to people that are like-minded. But they do these things not because they are... biased."
9/14/2020 10:24:19 am
Great question! More on it in later posts, in part because the answer is subtle. I think it depends on a terminological choice for what we mean by "confirmation bias", and whether exhibiting it implies irrationality.
9/14/2020 12:06:54 pm
Thanks for the response. And yes, this helps.
9/16/2020 08:42:20 am
Yeah definitely! Jess Whittlestone's recent dissertation (http://wrap.warwick.ac.uk/95233/1/WRAP_Theses_Whittlestone_2017.pdf) is the best thing I've seen on this (here's a blog post where she summarizes it quickly [https://jesswhittlestone.com/blog/2018/1/10/reflections-on-confirmation-bias]---definitely worth a read). On her account, there's a fair amount of unclarity about what exactly confirmation bias is, even within the relevant literature, so I suspect it's hard to get a clear and uncontroversial definition.
9/9/2020 07:33:23 pm
Looks good, Kevin. If you are looking for relevant work in social science I have a whole book on the subject, but you could start with the links in this article: https://theconversation.com/coronavirus-responses-highlight-how-humans-are-hardwired-to-dismiss-facts-that-dont-fit-their-worldview-141335
9/14/2020 10:25:56 am
Fantastic, thanks Adrian! Will check out the article, and add the book (this one? https://global.oup.com/academic/product/the-truth-about-denial-9780190062279?cc=gb&lang=en&) to my reading list!
9/10/2020 11:16:51 am
I'm curious to see what you say about specific heuristics. Take motivated reasoning for example.
9/16/2020 08:57:46 am
Thanks for the questions! I hadn't read the Ditto and Lopez paper, so thanks for adding that one to my list. Skimming it, it *looks* like the general arguments I'm going to give will apply to it as well, but will of course have to look at the details later.
9/11/2020 08:55:51 am
I wanted to add the following bits of evidence on the "irrationality" side. The general idea is that things like "systemic racism" are extremely speculative, currently barely testable hypotheses which nonetheless play a huge role in some people's *moral* (not necessarily scientific) economy. They are candidates for the "best explanation" of their own experiences, or of experiences they are told other people are having. Just as moral convictions can be fostered by selective exposure to data (by biasing the data you receive), support for empirical-sounding theories of blame can be boosted by propaganda campaigns. "Rational path dependency" here is "rational exposure dependency". Just as marketing has stronger effects on those who watch more TV, propaganda has stronger effects on those who are more exposed. By choosing where you live and what jobs you hope to get, you change your patterns of exposure. It is important that most of the "big" beliefs are ultimately untestable: they are under-determined by the data's projections into the future. For me that means that no, you cannot settle who is right by just looking at the data; the fact that people fail to do so is rational and to be expected.
10/3/2020 12:39:06 pm
Why do you think these claims are evidence of irrationality? Though I'd disagree with you on the details of particular campaigns, I totally agree that the media manufactures polarizations through the way it presents evidence. The argument will be that the way such manufactured polarization works is by making it *rational* for individuals to polarize—e.g. liberals see clear evidence in favor of liberal views and ambiguous evidence against them; vice versa for conservatives.
9/11/2020 11:43:20 am
Great post. I do wonder how you would respond to the view of Bryan Caplan, that most of the issues on which you and Becca would drift apart are subject to rational irrationality - that is, it is perfectly rational for both of you to conform to your social environment, ignoring the potential irrationality of your beliefs. You may argue that it would be irrational to think both that your beliefs are correct _and_ that political beliefs in general are irrationally determined by a person's environment. I would respond that while irrational thoughts cannot be true, people are still perfectly capable of holding them...
10/3/2020 10:28:25 am
Thanks Tiago! Sorry to be slow on the reply here. It's a good question, and one that I just tried to address in the beginning of a post from last week. (https://www.kevindorst.com/stranger_apologies/what-is-rational-polarization)
9/12/2020 10:01:04 am
This stuff is embarrassing. It's like someone publishing a physics theory which gives the wrong equations, predicts that an electron is the size of a star, gives no juice for the squeeze.
10/3/2020 10:33:41 am
Thanks for the objections! I think you might be latching on to the wrong piece of my argument that we shouldn't blame polarization on irrationality. The premise is that YOU can't think that YOUR beliefs about Trump are irrational. I'm sure you don't, given the strength of your convictions. That's the philosophical premise, and it's one I take it you agree with.
4/15/2021 05:41:13 pm
Dear Professor Dorst,
4/17/2021 09:37:41 am
Email sent! Thanks a bunch for your question; curious to see the paper when it's ready!
Philosopher at MIT, trying to convince people that their opponents are more reasonable than they think