This is a bit of a rant but I notice that I am confused.
Eliezer said in the original Sequences:

> Rationalists should win.
But it's pretty obvious that LessWrong is not about winning (and Eliezer provides a more accurate definition of what he means by rationality here). As far as I can tell, LW is mostly about cognitive biases and algorithms/epistemology (the topic of Eliezer's Sequences), self-help, and a lot of AI alignment.
But LW should be about winning! LW has the important goal of solving alignment, so it should care a lot about the most efficient way to go about it; in other words, about how to win, right?
So what would it look like if LW had a winning attitude towards alignment?
Well, I think this is where the distinction between the two styles of rationality (cognitive algorithm development vs. winning) matters a lot. If you want to solve alignment and want to be efficient about it, it seems obvious that there are better strategies than researching the problem yourself: instead of spending 3+ years on a PhD (cognitive rationality), get 10 other people to work on the issue (winning rationality). That alone 10x's your efficiency.
My point is that we should consider all strategies when solving a problem. Not only the ones that focus directly on the problem (cognitive rationality/researching alignment), but also the ones that involve acquiring a lot of resources and spending these to solve the problem (winning rationality/getting 10 other people to research alignment).
This is especially true when other strategies get you orders of magnitude more leverage on the problem. To pick an extreme example, who do you think has more capacity to solve alignment: Paul Christiano or Elon Musk? (Hint: Elon Musk can hire a lot of AI alignment researchers.)
I am confused because LW teaches cognitive rationality, so it should notice all this, recognize that epistemology, cognitive biases, and a direct approach are not the most efficient way to go about alignment (or any ambitious goal), and start studying how people actually win in the real world.
But it's not happening (well, not much at least).
As far as I can tell, cognitive rationality helps, but winning seems to be mostly about agency and power. So maybe LW should talk more about these (and how to use them for good)?