Published on July 29, 2025 6:01 PM GMT
Nick Bostrom defines existential risk as:
Existential risk – One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.
The problem with talking about "doom" is that many worlds that fall to existential risk but don't involve literal extinction get treated as non-doom worlds. For example, leaving humanity the Solar System rather than a significant portion of the 4 billion galaxies reachable from it is plausibly a "non-doom" outcome, but it falls solidly within Bostrom's definition of x-risk.
Thus when people discuss P(doom), the intent is often to discuss only extreme downside outcomes, so a low P(doom) such as 20% doesn't imply that the remaining 80% avoid a permanently and drastically curtailed future for humanity. In other words, a P(doom) of 20% is perfectly compatible with a P(x-risk) of 90-98%.
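Under the framing above, where doom picks out the extreme downside slice of Bostrom's x-risk, the compatibility claim is simple arithmetic. As a sketch (the partition of outcomes here is an assumption of this illustration, and the numbers are illustrative):

```latex
% Assumed partition: x-risk outcomes split into "doom" (extreme downside)
% and permanent disempowerment short of doom.
P(\text{x-risk}) = P(\text{doom}) + P(\text{disempowerment, not doom})
% e.g. a stated P(doom) of 0.20 is compatible with
% P(\text{x-risk}) = 0.20 + 0.70 = 0.90
```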
Ben Mann of Anthropic said in a recent interview (at 50:24 in the podcast) that his x-risk probability is 0-10%. If he really did mean Bostrom's x-risk rather than extinction risk (which is not what "x-risk" abbreviates), that seems like a relatively unusual claim.
A low P(doom) seems like a much more defensible position than a low x-risk (I'm not sure how the latter could possibly be made defensible, if x-risk is taken in Bostrom's sense), and so a low P(doom) might often be serving as the motte to the bailey of low x-risk. Or one publicly claims a low P(doom) while implicitly expecting high x-risk, that is, a high probability of drastically curtailed potential, of the cosmic endowment being almost completely appropriated by AIs.
Between Extinction and Permanent Disempowerment
The three terms in recent use for the extreme downside of superintelligence are extinction, doom, and x-risk. Permanent disempowerment (some level of notably reduced potential, but not extinction) covers a lot of outcomes between extinction and x-risk. "Doom" is annoyingly ambiguous: it might include only extreme levels of permanent disempowerment, or it might include even very slight permanent disempowerment, in the sense that some nontrivial portion of the cosmic endowment goes to AIs that are not a good part of humanity's future.
There are also worlds within x-risk that involve neither extinction nor permanent disempowerment, where originally-human beings are constrained or changed in cruel and unusual ways, or not given sufficient autonomy over their own development. These outcomes, together with extinction, form a natural collection of extremely bad outcomes, but there is no word for them. In the framing of this post, they are the outcomes that are worse than mere permanent disempowerment. "Doom" would've worked to describe them if it weren't so ambiguous and didn't occasionally include moderate permanent disempowerment. This ambiguity matters in particular for people who want to publicly claim a very high P(doom): this time "doom" becomes the motte of their position, obscuring an implicit expectation of extinction at only a relatively moderate probability, such as 50%.
Avoid "Doom", Clarify Stance on Permanent Disempowerment
Meaningful use of "doom" seems hopeless. But clarifying your stance on permanent disempowerment is a straightforward recipe for disambiguating intended meaning. Talking only about extinction or x-risk, even though they are well-defined, risks obscuring something central to your position, because there could be a lot of permanent disempowerment outside of extinction or inside x-risk, or alternatively only a little.
If the permanent disempowerment outcomes are few, extinction risk gets much closer to x-risk, in which case doom is less ambiguous; but that's only visible if you name both extinction risk and x-risk, and it remains extremely ambiguous if you only state a somewhat low extinction risk or a somewhat high x-risk. On the other hand, claims of very high extinction risk, or of very low x-risk, are unambiguous, because they leave little space for permanent disempowerment to hide in.
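The "little space to hide" point can be made precise with a bound, under the assumption that extinction outcomes are a subset of x-risk outcomes and permanent disempowerment lies within x-risk but outside extinction:

```latex
P(\text{permanent disempowerment}) \le P(\text{x-risk}) - P(\text{extinction})
% A very high stated extinction risk (say 0.9) caps disempowerment at
% 1 - 0.9 = 0.1 even before P(x-risk) is named; a very low stated
% x-risk (say 0.05) caps it at 0.05 directly.
```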