Academic Publications

Evidence: A Guide for the Uncertain. (Forthcoming). Philosophy and Phenomenological Research. [Link]
Abominable KK Failures. (2019). Mind 128 (512): 1227–1259. [Link]
Lockeans Maximize Expected Accuracy. (2019). Mind 128 (509): 175–211. [Link]
Higher-Order Uncertainty. (2019). In M. Skipper Rasmussen & A. Steglich-Petersen (eds.), Higher-Order Evidence: New Essays. Oxford University Press, 35–61. [Link]
Can the Knowledge Norm Co-Opt the Opt-Out? (2014). Thought: A Journal of Philosophy 3 (4): 273–282. [Link]
Public Philosophy

Why Rational People Polarize. (2019). The Phenomenal World, January 24. [Link]
Dissertation: Modest Epistemology

Thinking properly is hard. Sometimes I mess it up. I definitely messed it up yesterday. I’ll likely mess it up tomorrow. Maybe I’m messing it up right now.
I’m guessing you’re like me. If so, then we’re both modest: we’re unsure whether we’re thinking rationally. And, it seems, we should be: given our knowledge of our own limitations, it’s rational for us to be unsure whether we’re thinking rationally. How, then, should we think? How does uncertainty about what it’s rational to think affect what it’s rational to think? And how do our judgments of people’s (ir)rationality change once we realize that it can be rational to be modest? My dissertation makes a start on answering those questions.

Chapter 1 ("Higher-Order Uncertainty") introduces a general framework for modeling situations in which you are rational to be unsure what the rational opinions are. I first show how this framework allows us to precisely formulate the questions from the higher-order evidence literature. I then use it to argue that rational modesty is needed to explain the epistemic force of peer disagreement, and therefore that any theory of such disagreement must be based on a general theory of rational modesty. Many have suggested that such a theory can be easily formulated based on the enkratic intuition that your first-order opinions must “line up” with your higher-order ones. But I argue that this is incorrect: whenever modesty is rational, so too is epistemic akrasia. We need to look elsewhere for a general theory of rational modesty.

Chapter 2 ("Evidence: A Guide for the Uncertain") offers one. I build a theory that—in a precise sense—allows as much modesty as possible while still guaranteeing that rationality is a guide. The key principle—which I call Trust—formalizes the truism that it’s likely that what the evidence supports is true (a schematic statement appears below). I show that Trust permits modesty, ensures that rational opinions are correlated with truth, and is necessary and (in a wide class of scenarios) sufficient to vindicate the truism that you should always prefer to respond rationally to your evidence. In sum, Trust establishes that there is a principled way for rational people to be modest.

Chapter 3 ("Rational Polarization") applies this theory of rational modesty to the psychology of human reasoning. A wide range of studies suggest that people tend to polarize predictably in the face of conflicting evidence: to gather and interpret evidence in a way that leads them to predictably strengthen their prior beliefs. This “confirmation bias” is standardly taken to be a hallmark of human irrationality. It need not be. I first prove that whenever modesty can be rational, so too can predictable polarization. I then argue, further, that this abstract possibility may play a role in the actual polarization we observe. In particular, given common structures of rational modesty generated by the process of cognitive search, rational agents who care only about the truth should sometimes exhibit confirmation bias.
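Schematically (the official statement is in the paper, and the notation here is only illustrative): writing P for the rational credence function and [P(q) ≥ t] for the proposition that the evidence supports q to degree at least t, Trust requires that for every proposition q and threshold t,

    P(q | [P(q) ≥ t]) ≥ t

That is, conditional on the evidence supporting q to degree at least t, it is rational to be at least t-confident in q, which is one way of making precise the truism that what the evidence supports is likely true.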
Work in Progress

"Overconfidence" is Rational
Rational Polarization
Evidence of Evidence: A Higher-Order Approach (with Branden Fitelson and Brooke Husic)