Department or Program

Philosophy

Abstract

We assign probabilities to represent our epistemic states about future events, whether for tomorrow’s weather or for timelines of superintelligent A.I. systems. We may hold the same probability assignment for two different events, yet something feels different about these probabilities. The second case is often labeled a case of deep uncertainty. This paper offers an epistemic account of deep uncertainty by appeal to higher-order uncertainty; it aims to be theoretical rather than prescriptive. I first defend the Higher-Order Uncertainty Thesis (HOU): S has deep uncertainty towards P(A)=q, S’s probability assignment q for some proposition A, iff S has not responsibly integrated higher-order uncertainty q*, which is below some threshold q**, towards their probability assignment. I then defend two theses about the upshots of my argument. Belief Thesis: if we have deep uncertainty towards P(A)=q, then we should not believe that q represents the rational probability of A given higher-order uncertainty. Assertion Thesis: if we have deep uncertainty towards P(A)=q, then we should not assert P(A)=q.

Level of Access

Open Access

First Advisor

Schilling, Haley

Date of Graduation

5-2024

Degree Name

Bachelor of Arts

Number of Pages

38

