KEVIN DORST

Stranger Apologies

A Plea for Political Empathy

2/18/2020

2400 words; 10-minute read.

The Problem
We all know that people now disagree over political issues more strongly and more extensively than at any time in recent memory. And—we are told—that is why politics is broken: polarization is the political problem of our age.

Is it?
Polarization standardly refers to some measure of how much people disagree about a given topic—say, whether Trump is a good president. Current polling suggests that around 90% of Republicans approve of Trump, while only around 10% of Democrats do.  We are massively polarized—and that is the problem. Right?

Not obviously. Consider religion. On any metric of polarization, Americans have long been massively polarized on religious questions—even those that crisscross political battle lines. For example, 84% of Christians believe that the Bible is divinely inspired, compared to only 32% of the religiously unaffiliated (who now make up over a quarter of the population). Yet few in the United States think that religious polarization is the political problem of our age. Why? Because most Americans who disagree about (say) the origin of the Bible have learned to get along: large majorities maintain friendships across the religious/non-religious divide, trust members of each group equally, and are happy to agree to disagree. In short: although Americans are polarized on religious questions, they do not generally demonize each other as a result.

Contrast politics. Whatever your opinion about whether Trump is a good president, consider your attitude toward the large plurality who disagree with you—the “other side.” Obviously you think they are wrong—but I’m guessing you’re inclined to say more than that. I’m guessing you’re inclined to say that they are irrational. Or biased. Or (let’s be frank) dumb.

You are not alone. A 2016 Pew report found that majorities of both Democrats and Republicans think that people from the opposite party are more “close-minded” than other Americans, and large pluralities think they are more “dishonest”, “immoral”, and “unintelligent.”

This is new. Between 1994 and 2016 the percentage of Republicans who had a “very unfavorable” attitude toward Democrats rose from 21% to 58%, and the parallel rise for Democrats’ attitudes toward Republicans was from 17% to 55%:
[Figure: the rise in “very unfavorable” attitudes toward the opposing party, 1994–2016.]
So we do have a problem, and polarization is certainly part of it. But there is a case to be made that the crux of the problem is less that we disagree with each other, and more that we despise each other as a result.

In a slogan: The problem isn’t mere polarization—it’s demonization.

If this is right, it’s important.  If mere polarization were the problem, then to address it the opposing sides would have to come to agree—and the prospects for that look dim. But if demonization is a large part of the problem, then to address it we don’t need to agree. Rather, what we need is to recover our political empathy: to be able to look at the other side and think that although they are wrong, they are not irrational. Or biased. Or dumb.

The case of religion shows that it’s possible to do this while still harboring profound disagreements: polarization doesn’t imply demonization. And that raises a question: When it comes to politics, why do we suddenly feel the need to demonize those we disagree with? How have we come to be so profoundly lacking in political empathy?

The Story
Here, I think, is part of the story.

Though “rational animal” used to be our catchphrase, the late 20th century witnessed a major change in the cultural narrative on human nature. The potted history:
Once psychologists started probing the assumption that people are rational, they found that it couldn’t be further from the truth. Instead, people use a grab-bag of simple “heuristics” in their everyday reasoning that result in systematic, deep, and (in principle) easily avoidable biases. Rather than a paragon of rationality, humanity turned out to be the epitome of irrationality.
You can see the results yourself. Search “cognitive biases”, and Wikipedia will offer up a list of nearly 200 of them—each one discussed by anywhere from a few dozen to over 10,000 scientific articles. You will be told that people are inept at (many of) the basic principles of reasoning under uncertainty; that they close-mindedly search for evidence that confirms their prior beliefs, and ignore or discount evidence that contravenes those beliefs; that as a result they are systematically overconfident in their opinions; and so on.

This movement in psychology had a major impact both within academia and in the broader culture, fueling the appearance of (many) popular irrationalist narratives about human nature.  Again, you can see the results for yourself. Using Google Ngram, graph how often terms like “biased” and “irrationality” appeared in print across the 20th century. You’ll find, for example, that “biased” was 18 times more common (percentage-wise) at the dawn of the 21st century than it was at the dawn of the 20th:
[Figures: Google Ngram frequencies of “biased” and “irrationality” in print across the 20th century.]
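
If you want to recreate these graphs programmatically, here is a minimal Python sketch. It uses the undocumented JSON endpoint behind the Ngram Viewer, so the URL, parameters, and response shape are assumptions that may change without notice; the web interface remains the stable route.

    import requests

    # Query the (undocumented) JSON endpoint behind the Google Ngram Viewer.
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": "biased,irrationality",
            "year_start": 1900,
            "year_end": 2000,
            "corpus": 26,      # an English corpus code; codes vary across viewer versions
            "smoothing": 0,    # raw yearly frequencies, no moving average
        },
    )
    resp.raise_for_status()

    for series in resp.json():
        freqs = series["timeseries"]  # one frequency per year, 1900..2000
        start, end = freqs[0], freqs[-1]
        if start > 0:
            print(f"{series['ngram']}: {end / start:.0f}x more common in 2000 than in 1900")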
Upshot: we are living in an age of rampant irrationalism.

How—according to the story that I’m telling—does this rampant irrationalism feed into political demonization?  Two steps.

Step one is simple: we are now swimming in irrationalist explanations of political disagreement.  It is not hard to see how these go. If people tend to reason their way to their preferred conclusions, to search for evidence that confirms their prior beliefs, to ignore opposing evidence, and so on, then there you have it: irrational political polarization falls out as a corollary of the informational choices granted by the modern information age. (Examples: Heuvelen 2007, Sunstein 2009, 2017; Carmichael 2018; Nguyen 2018; Lazer et al. 2018; Robson 2018;  Pennycook and Rand 2019; Koerth-Baker 2019)

Step two is more subtle. Suppose you read one of these articles claiming that disagreement on political issues is caused by irrational forces. Consider your opinion on some such issue—say, whether Trump is a good president. (Whatever it is, I’m guessing it’s quite strong.)  Let’s imagine that you think he’s a bad president. Having come to believe that the disagreement on this question is due to irrationality, what should you now think—both about Trump, and about the rationality of various attitudes toward him?

One thing you clearly should not think is: “Trump is a bad president, but I’m irrational to believe that.” (That would be what philosophers call an “akratic” belief—a paradigm instance of irrationality.)  

What, then, will you think?  You can do one of two things:
  1. Acknowledge that you have been irrational, and so give up your belief that Trump is a bad president.
  2. Maintain your belief about Trump, and so think that you have not been irrational in forming it.
No one will—and arguably no one should—give up such a strong belief, based on such a wide variety of evidence, on the basis of a psychology op-ed they came across in the newspaper. So option (1) is out—you’ll go with option (2).

Now what will you think? You’ve just been told by an authoritative source that widespread irrationality is the cause of the massive disagreement about whether Trump is a good president. You’ve concluded that your opinion on the matter wasn’t due to massive irrationality. So whose was? Why, the other side’s, of course! You were always puzzled about how they could believe that Trump is a good president, and now you have an explanation: they were the ones who selectively ignored evidence, became overconfident, and all the rest. (Meanwhile, of course, those on the other side are going through exactly parallel reasoning to conclude that your side is the irrational one!)

The result? When we come to think that irrationality caused polarization, we thereby come to think that it was the other side’s irrationality that did so.  
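
To see how this asymmetric conclusion can fall out of perfectly ordinary updating, here is a toy Bayesian sketch. (The numbers are purely illustrative assumptions, not estimates from any study.) Suppose the article convinces you that exactly one side formed its opinion irrationally, and you then condition on your retained belief B that Trump is a bad president:

    # Toy model: exactly one side is irrational, and you condition on
    # B = "Trump is a bad president". All numbers are illustrative.
    p_b_if_me_irrational   = 0.50   # if *you* were irrational, B is a coin flip
    p_b_if_them_irrational = 0.95   # if *they* were, your rational belief is probably right
    prior_me_irrational    = 0.50   # start symmetric: no idea which side it was

    # Bayes' rule, conditioning on B:
    joint_me   = p_b_if_me_irrational * prior_me_irrational
    joint_them = p_b_if_them_irrational * (1 - prior_me_irrational)
    posterior_me_irrational = joint_me / (joint_me + joint_them)

    print(f"P(I was the irrational one | B)      = {posterior_me_irrational:.2f}")      # ~0.34
    print(f"P(they were the irrational ones | B) = {1 - posterior_me_irrational:.2f}")  # ~0.66

Notice that both sides can run this same calculation on their own retained beliefs, and each will conclude that the other is probably the irrational one.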

And once we view them as irrational, other terms of abuse follow. After all, irrationality breeds immorality: choices that inadvertently lead to bad outcomes are unfortunate but not immoral if they were rationally based on all the available evidence—but are blameworthy and immoral if they were irrationally based on a biased view of the evidence.  So once we start thinking people are biased, we’ll also start thinking that they are “immoral”, “close-minded”, and all the rest.

In a slogan: Irrationalism turns polarization into demonization.

That’s the proposal, at least. Obviously it’s only part of the story of the rise of political demonization—and I don’t claim to know how big a role it’s played. (I haven’t found any literature on the connection; if you have, please share it with me!) But given how quickly discussions about political disagreements are linked to irrationalist buzz-words like “confirmation bias”, “motivated reasoning”, and the like, I would be shocked if there were no connection at all.

As one small piece of evidence, consider this. Daniel Kahneman and Amos Tversky are the founders of the modern irrationalist narrative—work that earned Kahneman a Nobel Prize and made both of them rock-star scientists known well outside academia. Question: when did their work—which was largely conducted in the 70s and 80s—achieve this rock-star status? Again, you can see it for yourself. Use Google Scholar to graph their citations per year. Next to that, graph the trends in political demonization we’ve already seen. On each of those graphs, draw a line through 2001. Here’s what you get:
[Figures: Kahneman’s and Tversky’s citations per year, alongside the partisan-unfavorability trends, each with a line drawn through 2001.]
Upshot: political demonization began its steady climb at the same time as the influence of the modern irrationalist narratives did.  Maybe it wasn’t a coincidence.

The Project
If the problem is demonization—not mere polarization—then part of the solution is to restore political empathy. And if we lack political empathy in part because of rampant irrationalist narratives, then one way to restore it is to question those narratives.

That’s what I’m going to do. I’m going to write apologies—in the archaic sense of the word—for the strangers we so readily label as “biased,” “irrational,” and all the rest.  

How? By going back to that list of 200-or-so cognitive biases, and giving it a closer look. I’m a philosopher who studies theories and models of what thinking and acting rationally amounts to. There are (multiple) entire subfields devoted to these questions. There are no easy answers. So it’s worth taking a closer look at how we all came to take it for granted that people are irrational, biased, and (let’s be frank) dumb.  

I’m going to be tackling this project from different angles. This blog will be relatively exploratory; my professional work will (try to) be rigorous and well-researched; my other public writings will (try to) distill that work into an accessible form. Sometimes (as in this post) I’ll make a big-picture argument; more often, I’ll take an in-depth look at some particular empirical finding that has been taken to show irrationality. Stay tuned for topics like: overconfidence, confirmation bias, the conjunction fallacy, group polarization, the Dunning-Kruger effect, the base-rate fallacy, cognitive dissonance, and so on.  

As we’ll see, the bases for the claims that these phenomena demonstrate irrationality are sometimes remarkably shaky. Overgeneralizing a bit: often psychologists observe some interesting phenomenon, show how it could result from irrational processes, but move quickly over a crucial step—namely, offering a realistic model of what a rational person would think or do in the relevant situation, and an explanation of why it would be different. The reason for this is no mystery: psychologists do not spend most of their time developing realistic models of rational belief and action. Philosophers do. That’s why philosophers have things to contribute here. Because—as we’ll see—when we do the work of deploying such realistic models, often the supposedly irrational phenomenon turns out to be what we should expect from perfectly rational people.


The Hope
In starting this project, my hope is to contribute to a different narrative about human nature—one often voiced by a different group of cognitive scientists. These are the scientists who attempt to build machines that can duplicate even the simplest actions and inferences we humans perform every day, and discover just how fantastically difficult it is to do so. It turns out that most of the problems we solve every waking minute are objectively intractable. Every time you recognize a scene, walk across a street, or reply to a question, you perform a computational and engineering feat that the most sophisticated computers cannot. I would say that the computations required for such feats make the calculations involved in the most sophisticated rational model look like child’s play—but that would be forgetting that child’s play itself involves those same computational feats. In short: rather than resulting from a grab-bag of simple heuristics, our everyday reasoning is the result of the most computationally sophisticated system in the known universe running on all cylinders.

I find this narrative compelling—far more so than the irrationalist one that (I’ll argue) it’s in competition with. That is why I am starting this project.

I’m doing so with a bit of trepidation, as I’m a philosopher straying outside my comfort zone into other fields. No doubt sometimes I’ll mess it up. (When I do, please tell me!) But I think it’s worth the risk.

Partly because I have things to say. But mostly because I hope to convince more people that there are still things here worth saying. How irrational are we, really? I think the question is far from settled. And part of what I’m going to be arguing is that it is a question that is sufficiently subtle to appear on the pages of The Philosophical Review, and at the same time sufficiently consequential to appear on the pages of The New York Times. I hope you’ll come to agree.


What next?

If you’re an interested philosopher, reach out to me. There are a variety of people working on these topics, and I’m hoping to help build and grow that network.
If you’re an interested social scientist, reach out to me. Whether it’s to inform me of literature I don’t know (good!), to correct my mistakes about it (great!), or to propose something about it we might explore together (wonderful!), I would love to hear from you.
If you’re just interested, sign up for the newsletter or follow me on Twitter for updates; check out this detailed piece scrutinizing the putative evidence for overconfidence, or this bigger-picture piece on the possibility of rational polarization; and stay tuned for new posts in the coming weeks and months.


PS. Thanks to Cosmo Grant, Rachel Fraser, Kieran Setiya, and especially Liam Kofi Bright for much guidance with writing this post and starting this project. It should go without saying that any mistakes (here, or to come) are my own.
26 Comments
APC
2/18/2020 09:42:10 am

Nice post. Quick question: how is the explanation of why political disagreements involve despising “the other” compatible with the putative contrast between political and religious disagreements?

KMD
2/19/2020 01:43:37 am

Good question! One clarification, and some thoughts.

The clarification: I intend the point of the religious example to be rather limited. It shows that it's *possible* to have deep and pervasive disagreements without demonizing the other side, and so to suggest that demonization may be more of the problem. It's not meant to be (I don't think it is) a perfect parallel or contrast to the political case, since there are a lot of differences between them. So it could be that some further factor is relevant in politics (increasing geographical and ideological sorting, perhaps?) that exacerbates the dynamics I have in mind, but isn't present in the religious case.

But setting that aside, and just focusing on the "irrationalism leads to demonization" idea, my basic thought for why there's a difference here is that we generally don't see (or think we need) an irrationalist story for why people disagree about religious questions. At the least (as far as I know), there hasn't been a rise in popular explanations of religious disagreement by appeal to the biases (etc.) found by psychologists. Which means the sort of mechanism I have in mind ("I won't give up my beliefs or believe I'm irrational, so if irrationality caused it, it's got to be someone else's") won't kick in.

Of course, there's a further question about *why* we don't see as many irrationalist explanations of religious disagreement, and I'm not sure I have a great story about that. But it seems to me most people have a fairly mundane story in mind on this front ("people grow up in different communities, exposed to different practices and beliefs; they see the virtues of their own community's practices and beliefs; that's why they adopt them"––or something like that).

Anyways, those are a few thoughts. Good question; it's something I should think more about!

RJB
2/18/2020 06:49:34 pm

This promises to be interesting. I would encourage you to consider two perspectives, which I can't quite tell if you plan to adopt.

First, while it's true that social scientists have documented a long list of biases, the typical framing (which I endorse) is that humans make reasonable judgments and decisions in a complicated world by using rules of thumb (heuristics) that generally work quite well. But once we understand the heuristic, social scientists can identify the situations in which it won't work well, and demonstrate predictable biases. For example, it's a pretty good heuristic that if someone does one thing very well, they probably do other things well. This 'halo effect' heuristic works pretty well overall, since it tends to be true, but leads to predictable errors when researchers create or look at situations where someone is good at one thing but not another. This framing turns the list of errors from a story about irrationality to a story about how humans act astonishingly effectively in the face of complexity.

The second perspective is one I will call the weaponization of heuristics and biases. Social science provides political actors a roadmap for exploiting heuristics that are generally effective, but with the right propaganda can put ordinary citizens in situations where they make massive predictable mistakes, and through clever targeting can create polarization and demonization. When this happens, it is obviously problematic to demonize ordinary citizens who respond predictably to such tactics. But those employing these tactics might well be culpable for the results.



KMD
2/19/2020 01:50:37 am

Thanks for your comment. Totally agreed, on both fronts!

One version of the sorts of arguments I'm going to be making is one that favors the sort of "smart" or "fast and frugal" heuristics picture you mention––taking the same empirical regularity which is often labeled as a bias, and emphasizing the ways in which it *is* smart/adaptive/whatever.

A more ambitious version is one that looks at the data from a rational-modeling perspective in such a way that it gives us reason to be skeptical of the heuristic mechanism being posited in the first place. This is more tentative, and it will of course vary with the case at hand. But as an example, some work I'm doing on the conjunction fallacy (more on that to come) makes me skeptical of many of the "representativeness heuristic"-style explanations to begin with.

As to your second point, I totally agree: nothing I'm doing here will be in any way defending propagandists or people who spread misinformation or otherwise use psychological tactics to polarize people! I'm going to be writing defenses of the reasoning of ordinary people caught up in political disagreements––not the people who drive and profit off of those disagreements.

Thanks again!

Lee Jussim
2/18/2020 06:59:20 pm

Hi Kevin, I'm the guy whose book someone on Twitter recommended. I've been making both of these arguments for A LONG TIME BUT NEVER CONNECTED THEM in the way you have here.

I'm a social psychologist, so come at this stuff from slightly different angles, sometimes with ... data! Two points for now:

Gigerenzer, in psych, has probably done the most to push back on the Kahneman/irrationality emphasis that has dominated psych since around the 1980s.

To find the "short easy" version of what *I* do, go here:
https://www.psychologytoday.com/us/blog/rabble-rouser and just peruse a few blogs...

Feel free to contact me about any aspect of this. FWIW, I am also chair of my dept, so sometimes can be very very slow to reply to emails -- and sometimes lose them entirely. If that happens, don't let it discourage you - ping me again.

Lee

KMD
2/19/2020 01:53:28 am

Awesome, thanks Lee! Will definitely check out your blog. (A friend also recommended a book of yours, which I will read with interest!)

I'll be in touch! Would love to talk more about this stuff.

Derek Baker
2/18/2020 07:38:23 pm

This is interesting, but it at least seems like the opposite of what I've seen personally. Generally it seems like people want to assume that Trump supporters, say, are ignorant or irrational, because the alternative is that they roughly know what Trump is doing, and they like it. Stupid and irrational is the charitable interpretation.

Could you say more on why regarding the other side as rational and informed would lead to a more sympathetic view of them? I would have thought it would have the opposite effect: they know what they are doing, and either they actively want to do that or they don't care.

KMD
2/19/2020 02:13:34 am

Good (difficult!) question. This is something I've debated a lot, and I'm not completely sure what to think; but two main thoughts.

First, I don't think calling someone "irrational" (and certainly not "stupid") is a genuine way to be more charitable to them. I certainly don't think it helps prevent one from thinking of them as immoral. Suppose I see someone doing something bad––say, supporting a politician who promotes hate. Two explanations: (1) they see the politician the way I do, and just have really different (bad) moral values; (2) they irrationally see the politician in a radically different way than I do, and have moral values similar to mine. I don't think (2) is any more rosy a picture than (1)––someone with moral values like mine who bases their political views on an imbalanced assessment of the evidence is morally culpable; they should've known better!

The alternative that I have in mind is something like (3): this person shares many (not all) of my moral values, but they rationally and in a blameless way have a very different view of the facts––either about this politician, or the consequences of the policies, or the scope of the other threats those policies are trying to address, or whatever. Obviously there are limits to how often that style of explanation can apply. But insofar as we are able to give those sorts of explanations, I think that is something we should do. After all, (3) is really the only type of person that you can reasonably engage with and talk to about the issues: if you just dismiss them as either (1) immoral or (2) irrational, there's no way you'll have a productive conversation about the disagreement. But if you can see them as (3) rationally coming to the table with a radically different view than your own, then maybe you can learn from each other––and hopefully you can convince them of one or two things.

Anyways, that's a first stab at your question. But it's a deep one about this project, so I'm not pretending it's fully satisfactory or there's not a lot more to think or say.

Thanks!
Kevin

Don Moorr
2/19/2020 11:18:21 am

Great post! It may be worth distinguishing irrationality from error. Someone can be making sensible but erroneous inferences from the imperfect information at their disposal. My guess is that that may relate to many of your critiques of biases.

Kevin
2/19/2020 02:59:35 pm

I completely agree. One thing I should make sure to be very clear on in the post critiquing the overconfidence literature, for instance, is that whether or not the miscalibration is rational, learning about it should still affect your confidence. That is: even if you know you are a perfectly rational Bayesian, if you learn that your confidence in your answers exceeds the proportion that are true, you should still lower your confidence in your answers.

So the upshot of any of the rationalist critiques of the irrationalist narratives is never going to be "there's nothing to worry about here." Rather, it's going to be (to a first approximation): "the things we should worry about here are different than we thought they were."

Sydney Penner
2/19/2020 03:08:32 pm

Fascinating post! I plan to mull over it some more.

Meanwhile, though, I'd like to push back on one point. You say:

"One thing you clearly should not think is: 'Trump is a bad president, but I’m irrational to believe that.'"

I agree that thinking this would be irrational. I do, however, think that something in the neighbourhood is exactly the right response: namely, "Trump is a bad president, but I was irrational in the degree of confidence I had in that belief."

Perhaps people are not as irrational as recent treatments have suggested, but we're clearly affected by some degree of irrationality. That includes me. Consequently, I should become a little bit less confident in my beliefs when faced with widespread disagreement, rather than merely concluding that the other side is irrational.

In other words, if we allow more fine-grained options than just maintaining or giving up my belief that Trump is a bad president, then it seems to me that we have a path towards greater political empathy, regardless what level of irrationality we attribute to human beings.

An anecdote, for what it's worth: a couple of years ago I spent most of a Saturday afternoon talking to a farmer who was a lifelong Kentuckian and registered Democrat but now an enthusiastic supporter of Trump. I simultaneously gained considerable respect for this farmer as a person -- he struck me as possessing a good deal of practical wisdom -- and found some of his comments so comically ignorant that I wanted to disappear into the ground in embarrassment merely from hearing them uttered. By the end of the afternoon I had gained considerable political empathy, I think, and was perhaps a bit less confident about some of my views ... but I was just as convinced as ever that there is a streak of irrationalism in humanity.

Kevin
2/20/2020 02:39:29 pm

Thanks so much for your thoughtful reply!

I completely agree about the point that there are options in between the 3 "coarse-grained" ones I mentioned in terms of all-out beliefs, and in reality our attitudes are more fine-grained and so there are options for lowering confidence in our beliefs while also maintaining them. (In fact, my whole dissertation project was on how to think about these notions of "akrasia" and other such "coherence between levels of opinions" norms in a fine-grained Bayesian framework. If you're interested, the relevant paper is here: https://philpapers.org/rec/DOREAG). So I was definitely being fast-and-loose in that argument.

Nevertheless, it does seem to me that even when we allow the more fine-grained options, there will be an asymmetry in how you will and should react. You are told that irrationality caused the disagreement. You can very easily go back through (broad swaths of) the reasoning that led to your beliefs in your own case; and when you do so, you will find a fair number of reasons for those beliefs. (This is especially true if my more "rationalist" picture is right, and both sides of the disagreement are rational in their beliefs. But should be true regardless.) Meanwhile, you can't do that for the "other side's" beliefs––especially not nowadays, when people are so geographically and socially sorted that they have few people they can talk to across the divide. Under those conditions, you have asymmetric information about the two sides, and I do think it makes sense (if you believe the psychologists) for you to attribute *less* irrationality to yourself than to the other side. And as long as we get some asymmetry there, the story is up and running.

I also love your political empathy anecdote! Of course, I would think that the general connection I was proposing here could still be right even with particular instances––especially personal ones––failing to meet the pattern (not that you were suggesting otherwise).

Sydney Penner
2/21/2020 04:28:26 pm

I'm wondering about the claim that in the asymmetric information case it makes sense for me to attribute less irrationality to myself. Another commentator pointed out that your view appears to rely on Permissivism. As it happens, I'm sympathetic to Permissivism and so also inclined to accept some version of what you're saying here. Still, it seems to me that in many political disputes this license to attribute greater irrationality to the other side will be a very limited license.

Note, in the first place, that the asymmetry you describe will apply even in cases where it is clear that the experts are arrayed against me. It will still be the case that I can easily run through the reasoning for my view, but not as easily the reasoning for the other side. (In fact, if the issue involves technical issues, I may not be able to understand the reasoning of the experts at all.) Of course, I may well have evidence for the expertise of my opponents and perhaps for the fact that there are far more experts on the other side than on my side. But this seems evidence of a different sort. Still, this evidence means it would be a mistake for me to conclude that the other side is more irrational than I am, right? Even though I have better access to the reasoning for my view?

But if I'm right about that, isn't something similar the case with most significant political disagreements? Even if the experts aren't all arrayed on one side, isn't the fact that half of the country disagrees with me, including many people who seem otherwise intelligent enough, considerable evidence against the thesis that the disagreement is explained by my simply being more rational than them? So if we set aside fringe views and disagreements only involving a handful of people, it's not clear to me that the asymmetrical information you're talking about licenses the conclusion that I'm less irrational in any strong sense.

Kevin
2/23/2020 06:18:29 am

Thanks for the follow-up! I certainly agree that to get to the bottom of the dynamics here we have to go a fair way into the underlying epistemology––and it'll get quite hairy quite quick. But I think those are great questions for epistemologists!

In fact, I'm working on some version of this question in a related paper (the one not posted yet, on "rational polarization"). Suffice it to say that there *are* Bayesian models that aren't permissive and will get you some asymmetries in at least some cases structurally like this. How far those models go to explaining the real thing we see is a super tricky question, which I hope/need to think a lot more about!

One thing I do think is right, though, is this: to the extent that you increase your confidence that you have been irrational in forming your beliefs, your confidence in your beliefs has to drop proportionally (not necessarily exactly the same amount). And here's an empirical fact: people *won't* significantly drop their confidence that Trump is (say) a bad president upon reading a psych study like this. It follows that if they don't become akratic, then they'll think the other side was more irrational.
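
(To make the "drop proportionally" point precise, under the simplifying assumption that your conditional credences stay fixed: write B for the belief and R for "I formed my belief in B irrationally". The law of total probability gives

    P(B) = P(B | R)·P(R) + P(B | not-R)·(1 − P(R)),

so P(B) falls linearly as P(R) rises whenever P(B | R) < P(B | not-R)––by the factor P(B | not-R) − P(B | R), not necessarily one-for-one.)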

In the post I floated the idea that this could be a rational response to the evidence. You're pushing back on that, and I totally agree that it's a really subtle question to work out. But even if that's wrong, and the asymmetric response itself is irrational, I still think the descriptive claim is plausibly true: when people come to think irrationality caused the disagreement, they come to blame the other side's irrationality more than their own.

Matt Ferkany
2/19/2020 03:23:33 pm

Really nice post and love the project. It supplies a nice structure to some problems I've been thinking about in a less structured way. I have committed to print the view that it can be virtuous to see the opposition as rational provided they are interested in reciprocal exchange. If they aren't, I am unsure whether it is right to think of them as irrational, but do think it can be a failure of virtue to worry much about engaging with them. I'd be pleased to stay apprised of your work.

Kevin
2/20/2020 02:40:33 pm

Thanks so much Matt! Could you point me to the relevant paper? Would love to read it!

Kristinn Már
2/20/2020 07:10:07 pm

Just wanted to point to a large literature on this topic, summarized in a recent Annual Review: https://doi.org/10.1146/annurev-polisci-051117-073034

Kevin
2/23/2020 06:19:11 am

Thanks!

Murali
2/21/2020 11:05:40 am

Hi Kevin. I'm looking at this from the rational disagreement angle. In order for your project to get off the ground, Permissivism––the claim that people can rationally disagree given the same total evidence––must be true. However, I think it's false. There is extensive literature on this and I can't do justice to it here. So how will you handle the permissivism issue?

Kevin
2/23/2020 06:28:25 am

Thanks for your question Murali!

I don't think the argument hinges on permissivism––even if there is a unique rational response to a given body of evidence (I was a Roger White student, so this is a claim I'm sympathetic to!), the people on the opposing sides have different bodies of evidence. There are all sorts of bits of information they share, but they certainly don't share all of it. (Imagine trying to remember––let alone share––every bit of news coverage you've seen one way or the other on Trump's presidency.) Tom Kelly has a good paper on related issues (https://philpapers.org/rec/KELDDA) which might be relevant. And, as I just mentioned in an above comment, I'm currently working on a big project showing how it's possible to have rational Bayesians for whom uniqueness holds who still come to predictably, massively disagree on things like this. (Paper in progress...) So suffice it to say that it's *possible* to get this sort of massive disagreement without permissivism. This comes out of the interesting structural features you get when evidence is ambiguous or you have "higher-order uncertainty" about how to respond to it, and the failures of Reflection-style principles that happen there. (Bernhard Salow has a great paper on this: https://philpapers.org/rec/SALTEG, though I don't think it touches explicitly on the disagreement issue.)

Now, the gap between it being possible and it being actually what happens in our real-life case is HUGE, of course. So I take the point that in our real case, without permissivism, it is much less clear that the disagreement we see is compatible with rationality. I totally agree! But I think it's a question worth looking at closely.

For the record, no one should be––and I'm certainly not––arguing that ALL of this is rational. So even if certain parts of it are irrational (like, perhaps, an asymmetric blaming of the other side for irrationality), there's a question over how much of a role rational vs. irrational forces are playing in the dynamics of disagreement.

Mybrid Wonderful
2/25/2020 05:28:20 pm

PT Barnum is missing here. In 1994 Rush Limbaugh started his campaign of hatred, and Fox News not long after. There is a sucker born every minute, and conservatives have been making money off of propaganda since 1994. There is a difference between liberal bias and right-wing hate. AM radio has for decades been a concerted effort to hate liberals. I'm rather disappointed that this factor is missing from the analysis. Every AM radio show has a litany of "Liberals hate us", "Liberals are un-American", "Liberals are baby killers", "Liberals are communists", and on and on. These litanies are hate speech. That's what they sell for money. Everyone in politics works backwards from a desired conclusion when interpreting world events. But conservatives have no genuine self-criticism; liberals do. Take George Bush. They are not critical of him, at all. Instead they just pretend he never existed. Conservatives just criticize liberals for money. This approach sets up a false equivalence of "both sides" being the same, and this is just not true. AM hate radio is solely the domain of conservatives because all they do is peddle hatred of liberals. Liberals are not interested in piling on with corresponding "conservatives hate us" radio. The fellow who spawned the invention of the phrase "fake news" on Facebook said he tried to make money equally with liberal and conservative audiences, but it was only mostly the conservative audience that made him money. Why is that? This analysis is missing this aspect. Until and unless the hate cash machine is factored into this analysis, I fail to see how it will have the efficacy desired.

Kevin
2/27/2020 03:48:02 am

I see your concern, and I understand your frustration. I would like to flag that the point of this space is to be one where people who disagree with each other can try to find common ground. So concerns like the ones you raise are perfectly legitimate, but I would like to keep them from being framed as denunciations.

To your concerns:

I take one of your concerns to be that there are other factors, besides attributions of irrationality (for instance, propaganda) that divide us. That is surely true, and nothing I said was meant to deny that. All I am suggesting is that attributions of irrationality are *one* (symmetric) cause of demonization; there may be other asymmetric ones.

I take it that you have a second concern, which is that there are differences between how the two sides demonize each other, and the explanations for why they do so. That is surely correct as well––it is an interesting and hard empirical question separating out the different (past and continuing) causes here.

You seem to have a third concern, which is that not only are there differences, but somehow those differences make conservatives more blameworthy for their demonization of liberals than liberals are for their demonization of conservatives. I would like to treat that proposal with extreme caution, to say the least. To make an obvious point: there is plenty of liberal hate for conservatives as well. And how to say whether one side is more blameworthy for the hate between them is an extremely thorny moral and empirical question. It does sound to me, rather strikingly, like the story I told in the blog post applies here: if people think irrationality caused the disagreement, they think it was the *other side's* irrationality (gullibility; falling for propaganda; etc.) that did so––as you seem to think of the conservatives. That is a pattern of feeling that I argued should be expected from people even given *symmetric* disagreements. So I think that should give us some pause in being too confident in our judgments that the other side is more blameworthy.

At the least, I'll say this. I think that politics is made worse, not better, by each side demonizing the other side. So if we can find ways to understand the other side––to disagree with them but not hate them––I think we should try to do so. That's the point of this blog, at least, and I think that it is something worth doing. Even if you think there is more to the story, or other important factors, or other things we should be doing, I hope you can agree that there's value in what I'm trying to do here.

Thanks for your comment,
Kevin

Kolja
2/27/2020 02:39:16 pm

Kevin, I think that is a great response, and I think you're exactly right. If someone exclusively listens to AM talk radio and does not interact with any "liberals" with some frequency, they would not be very irrational in thinking "the liberals" are evil and irrational.

What I think is the really important question is how you could make it rational for someone that is a habitual AM talk radio listener to no longer think that all liberals are evil and irrational. And I don't think liberals decrying the talk show hosts as evil and irrational can accomplish that.

From what I would expect, modelling this in the proper way would just lead the radio listener to even further downgrade the credibility of "the liberals" as testifiers, rather than downgrade the credibility of AM radio as testifier. Or, to be more precise, if someone just asserts *that* AM radio is propaganda, that would be the result (call that a propaganda claim de dicto). However, if the AM radio listener is approached with evidence that is a) consistent with their evidence about the content of AM radio and b) consistent with what they would themselves identify as constituting propaganda/bad reasoning (call that a subjectively endorsed propaganda claim de re), without necessarily stating that "AM radio is propaganda", I suspect it would lead to some softening of the divide.

That's all still very informal and from the armchair, but it would make more sense from my understanding of what it means to rationally respond to evidence.

Here is a slightly stronger claim: I think that when considering these structural issues the only way forward is internalist epistemology, because any externalist approach to actually helping would have to pre-judge which side gets it right in order to say who is and isn't being irrational.

Peter Michael Gerdes
3/16/2020 12:46:54 am

I’m a bit puzzled at the role you think the academic research is playing here.

I mean, the problem you mention seems to be one about communication. Namely, that even though the notion of irrationality doesn't equate to blameworthiness as used in academia, it does in colloquial English, and the problem is the bad writing that doesn't make this point more clear.

Kevin
3/20/2020 05:11:23 am

I'm not sure I agree with you that it's a communication problem. I think the terms "irrationality" and "bias" in natural language are robustly normative terms, such that if your thinking is irrational or biased then you're not thinking as you should. I also think that there are pretty deep conceptual links between whether you are "thinking as you should", and whether you are blameworthy for how you are thinking and acting. So I'm inclined to say, first of all, that *if* psychologists intend to be using "irrational" and "biased" in a way that is disconnected from blame, then they have chosen the wrong words---that it's at best very misleading, and certainly not outsiders' fault for misinterpreting them. The problem would be not with the communication of the research but with the explanation of the research itself (since the academic work is peppered with these terms and other normative variants).

Moreover, I'm inclined to deny that psychologists consistently intend to use these terms in a way that is divorced from blame. Sometimes this is very clear, as when psychologists investigate racial or gender biases---they clearly intend those to be things that people are blameworthy for having (even if there is a caveat that "we all have these so we're all equally blameworthy"). At other times they explicitly say that they have a normative interpretation; here's Kahneman and Tversky replying to a critique of calibration research from Gigerenzer on this front: "Our disagreement [with Gigerenzer (1991)] is normative, not descriptive. We believe that subjective probability judgments should be calibrated, whereas Gigerenzer appears unwilling to apply normative criteria to such judgments" (1996, "On the Reality of Cognitive Illusions").

So I guess I agree that there's something that has gone wrong in the chain here, but I'm less inclined to think it's merely a problem in communication of the empirical research, and more inclined to think it's a problem with the normative interpretations and presuppositions built into the research itself.

Layla
6/29/2020 04:56:57 am

Thanks for a great read! I agree strongly with what you've written, and I think there's a lot more here to unpack. Another main concern for me is our current tendency to 'preach to the choir' and only listen and talk to those who share our same opinions.



