Work in Progress

Rational Polarization (Handout; Presentation)
Being Rational and Being Wrong
Fine-Tuning Divine Indifference (with Chris Dorst)

Academic Publications

Deference Done Better (with Ben Levinstein, Bernhard Salow, Brooke E. Husic, and Branden Fitelson). Forthcoming. Philosophical Perspectives. [Mathematica notebook]
(Almost) All Evidence is Higher-Order Evidence (with Brian Hedden). Forthcoming. Analysis.
Assertion is Weak (with Matt Mandelkern). Forthcoming. Philosophers' Imprint.
Good Guesses (with Matt Mandelkern). Forthcoming. Philosophy and Phenomenological Research.
Be modest: you're living on the edge. Forthcoming. Analysis.
Evidence: A Guide for the Uncertain. 2020. Philosophy and Phenomenological Research 100 (3): 586–632.
Abominable KK Failures. 2019. Mind 128 (512): 1227–1259.
Lockeans Maximize Expected Accuracy. 2019. Mind 128 (509): 175–211.
Higher-Order Uncertainty. 2019. In M. Skipper & A. Steglich-Petersen (eds.), Higher-Order Evidence: New Essays. Oxford University Press, 35–61.
Can the Knowledge Norm Co-Opt the Opt-Out? 2014. Thought: A Journal of Philosophy 3 (4): 273–282.

Handbook Articles and Reviews

Higher-Order Evidence. Forthcoming. In Maria Lasonen-Aarnio and Clayton Littlejohn (eds.), The Routledge Handbook for the Philosophy of Evidence. Routledge.
Review of Epistemic Consequentialism, by Kristoffer Ahlstrom-Vij and Jeffrey Dunn (eds.). 2020. Philosophical Review 129 (3): 484–489.

Public Philosophy

Stranger Apologies blog, including the series Reasonably Polarized: Why politics is more rational than you think.
How America Polarized. Arc Digital. February 3, 2021.
The Other Side is More Rational Than You Think. Arc Digital. September 25, 2020.
The Rational Question. The Oxonian Review. March 14, 2020.
Why Rational People Polarize. Phenomenal World. January 24, 2019.

Podcasts/Interviews:
Dissertation: Modest Epistemology

Thinking properly is hard. Sometimes I mess it up. I definitely messed it up yesterday. I'll likely mess it up tomorrow. Maybe I'm messing it up right now.
I'm guessing you're like me. If so, then we're both modest: we're unsure whether we're thinking rationally. And, it seems, we should be: given our knowledge of our own limitations, it's rational for us to be unsure whether we're thinking rationally. How, then, should we think? How does uncertainty about what it's rational to think affect what it's rational to think? And how do our judgments of people's (ir)rationality change once we realize that it can be rational to be modest? My dissertation makes a start on answering those questions.

Chapter 1 ("Higher-Order Uncertainty") introduces a general framework for modeling situations in which it is rational to be unsure what the rational opinions are. I first show how this framework allows us to precisely formulate the questions from the higher-order evidence literature. I then use it to argue that rational modesty is needed to explain the epistemic force of peer disagreement, and therefore that any theory of such disagreement must be based on a general theory of rational modesty. Many have suggested that such a theory can be easily formulated based on the enkratic intuition that your first-order opinions must "line up" with your higher-order ones. But I argue that this is incorrect: whenever modesty is rational, so too is epistemic akrasia. We need to look elsewhere for a general theory of rational modesty.

Chapter 2 ("Evidence: A Guide for the Uncertain") offers one. I build a theory that—in a precise sense—allows as much modesty as possible while still guaranteeing that rationality is a guide. The key principle—which I call Trust—formalizes the truism that it's likely that what the evidence supports is true (a sketch of the formal statement appears at the end of this summary). I show that Trust permits modesty, ensures that rational opinions are correlated with truth, and is necessary and (in a wide class of scenarios) sufficient to vindicate the truism that you should always prefer to respond rationally to your evidence. In sum, Trust establishes that there is a principled way for rational people to be modest.

Chapter 3 ("Rational Polarization") applies this theory of rational modesty to the psychology of human reasoning. A wide range of studies suggests that people have a tendency to polarize predictably in the face of conflicting evidence: to gather and interpret evidence in ways that strengthen their prior beliefs. This "confirmation bias" is standardly taken to be a hallmark of human irrationality. It need not be. I first prove that whenever modesty can be rational, so too can predictable polarization. I then argue, further, that this abstract possibility may play a role in the actual polarization we observe: given common structures of rational modesty generated by the process of cognitive search, rational agents who care only about the truth should sometimes exhibit confirmation bias (a toy numerical illustration also appears below).
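For readers who want the formal statement of Trust promised above, here is a minimal sketch; the notation is my schematic gloss rather than a quotation from the paper, with P as your credence function and Pr as the rational credence function (which you may be rationally unsure of):

```latex
\[
  \textbf{Trust:}\qquad
  P\bigl(q \,\big|\, [\mathrm{Pr}(q) \ge t]\bigr) \;\ge\; t
  \qquad\text{for all propositions } q \text{ and all thresholds } t \in [0,1]
\]
```

Read: conditional on the rational credence in q being at least t, be at least t confident in q. This is how the principle permits modesty: you needn't know what Pr is; Trust only constrains how your opinions defer to it.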
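And here is the toy numerical illustration of Chapter 3's structure: my own sketch with arbitrary numbers, not the chapter's model. An agent searches for a piece of evidence bearing on a hypothesis H. Success is unambiguous, so she conditionalizes on it; failure is ambiguous (maybe she just missed it), so, as a stand-in for a modest response, she leaves her credence at the prior.

```python
# Toy sketch (my illustration, not the dissertation's model) of how
# asymmetrically ambiguous evidence can make belief change predictable.

prior = 0.5             # initial credence in H
p_found_if_H = 0.8      # chance the search succeeds if H is true
p_found_if_notH = 0.2   # chance of a misleading success if H is false

# Total probability of a successful search:
p_found = prior * p_found_if_H + (1 - prior) * p_found_if_notH

# Unambiguous outcome: conditionalize by Bayes' rule.
posterior_found = prior * p_found_if_H / p_found

# Ambiguous outcome: credence stays put (the modest response in this toy).
posterior_not_found = prior

expected_posterior = (p_found * posterior_found
                      + (1 - p_found) * posterior_not_found)

print(f"P(found)               = {p_found:.2f}")              # 0.50
print(f"posterior if found     = {posterior_found:.2f}")      # 0.80
print(f"posterior if not found = {posterior_not_found:.2f}")  # 0.50
print(f"expected posterior     = {expected_posterior:.2f}")   # 0.65 > 0.50
```

The expected posterior (0.65) exceeds the prior (0.50), so the agent can predict that her confidence in H will, on average, rise. By contrast, an agent who treated failure as unambiguous and conditionalized on it would drop to 0.2, making the expected posterior equal the prior (0.5 × 0.8 + 0.5 × 0.2 = 0.5); the predictable drift traces entirely to the dampened response to the ambiguous outcome.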