Trying to be rational for the wrong reasons

The article explores the relationship between rationalism and self-deception, arguing that the reasons in favor of rationality are circular and that in some situations self-deception may be more instrumentally useful; it then discusses several possible counter-strategies and the author's own view.

🎯 Rationalists have an irrational preference for rationality: the arguments in favor of rationality are circular and will not move anyone who does not already care about it.

💡 Three counter-strategies are considered: doublethink, keeping two models of reality at once (which rests on shaky assumptions); multiple agents, where a perfectly rationalist father raises his son on the optimal mix of rationality and self-serving beliefs; and split time, first becoming a perfect rationalist and then designing a brainwashing plan to make oneself better at winning.

🤔 The author admits an irrational preference for truth and reason: these strategies require giving up what he actually cares about in exchange for uncertain future value, which is not a good deal for him, though for someone who does not care about rationality the trade may look more balanced.

Published on August 20, 2024 4:18 PM GMT

Rationalists are people who have an irrational preference for rationality.

This may sound silly, but when you think about it, it couldn't be any other way. I am not saying that all reasons in favor of rationality are irrational -- in fact, there are many rational reasons to be rational! It's just that "rational reasons to be rational" is a circular argument that is not going to impress anyone who doesn't already care about rationality for some other reason.

So when there is a debate like "but wouldn't the right kind of self-deception be more instrumentally useful than perfectly calibrated rationality? do you care more about rationality or about winning?", well... you can make good arguments for both sides...

On one hand, yes, if your goal is to maximize your utility function U, then "maximizing U by any means necessary" is by definition ≥ "maximizing U using rationality". On the other hand, if you take a step back, how would you know whether your approach X actually maximizes U, if you gave up on rationality? The self-deception that you chose instrumentally as a part of strategy X could, as a side effect, bias your estimates of how much U you really get by following X... but there may be ways to deflect this counter-argument.
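To spell out that "by definition ≥" step with some throwaway notation (S for the set of all strategies available to you, S_rational for the subset of them a rationalist would consider; the labels are introduced here only to make the inclusion explicit):

$$\max_{x \in S} U(x) \;\ge\; \max_{x \in S_{\text{rational}}} U(x), \qquad \text{because } S_{\text{rational}} \subseteq S.$$

Picking the best option from a larger set can never do worse than picking the best option from one of its subsets; the catch, as above, is whether you can still tell which option is best once self-deception is on the table.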

1) Doublethink. Keep two models of reality simultaneously: one rational, the other optimized by the former to be winning. There are some shaky assumptions here. It may be computationally impossible for a human to keep two separate models of reality, to make sure that it is the former that nudges the latter (rather than the other way round, or both nudging each other), and that it is the latter (rather than a mix of both) that influences System 1. But this sounds like a nirvana fallacy: the people who choose rationality over doublethink are not doing rationality perfectly either! So let's compare the average human doublethink against the average human rationality (instead of a hypothetical perfect rationality). Now it is not so clear that rationality wins.

2) Multiple agents. Imagine a father who wants his son to win as much as possible. The father could be a perfect rationalist, while raising his son to believe the optimal mix of rationality and self-serving bullshit. Here the objections against self-deception do not apply; the father is not deceiving himself about anything. (We could make a different objection, that the son will not be able to provide the same kind of service to his own children. But that is moving the goalposts.)

3) Split time. Become a perfect rationalist first, then design the perfect plan for brainwashing yourself into someone more winning (at the cost of losing some rationality), then brainwash yourself. Most of the objections you make against this idea can be answered by: yeah, assume that the original perfect rationalist considered this possibility and adjusted their plans accordingly. Yeah, in some Everett branches something completely unexpected might happen, of exactly the kind where the original rationalist could have prevented the disaster but the brainwashed person no longer can. But again, compare the average outcomes (a rough expected-value version of this comparison is sketched below). The small probability of a disaster might be an acceptable price to pay for a large probability of winning more.
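That "compare the average outcomes" step can be given the same rough treatment (the symbols here are placeholders, not numbers anyone has measured): if the brainwashing plan adds ΔU_win with probability p_win and loses ΔU_disaster in the rare branches where it backfires, with probability p_disaster, then on average it beats staying a rationalist roughly when

$$p_{\text{win}} \cdot \Delta U_{\text{win}} \;>\; p_{\text{disaster}} \cdot \Delta U_{\text{disaster}},$$

which is just the "small probability of a disaster as an acceptable price for a large probability of winning more" trade-off written as an inequality.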

Frankly, "if you are no longer a rationalist, you cannot be sure that you are doing the optimal thing" was never my true rejection. I am quite aware that I am not as rational as I could be, so I am not doing the optimal thing anyway. And I don't even think that the outcome "you are doing the optimal thing, and you think that you are doing the optimal thing, but because you have some incorrect beliefs, you don't have a justified true belief about doing the right thing" is somehow tragic; that sounds like something too abstract to care about, assuming that the optimal thing actually happens regardless.

My true rejection is more like this: I have an irrational preference for things like truth and reason (probably a side effect of mild autism). You provide an argument that may be correct or incorrect; I am not really sure. From my perspective, what takes away the temptation is that your strategy requires that I give up a lot of what I actually care about, now, forever, with certainty... and in return maybe get some other value (possibly much greater) in some unspecified future, assuming that your reasoning is correct and that I can execute your proposed strategy correctly. This simply does not sound like a good deal.

But the deal might be more balanced for someone who does not care about rationality. Then it's just two strategies supported by similarly sounding, very abstract arguments. And you are going to make some mistakes no matter which one you choose, and in both cases an unlucky mistake might ruin everything. There is too much noise to make a solid argument for either side.

...which is why I consider "arguing that rationality is better than optimal self-deception" a waste of time, despite the fact that I made my choice and feel strongly about it. The arguments in favor of rationality are either circular (on the meta level) or irrational.



