Vision of a positive Singularity

Published on December 23, 2024 2:19 AM GMT

Introduction

Many people feel significant anxiety about how AI and superintelligence will play out, regarding both the ultimate outcome and the intermediate stages. There is the sense that some kind of loss is inevitable as humanity becomes more powerful in this way. There is also the concern that there will be no place for a person or society with existing values any more as things progress. I try to think up a plan and system of values that will respect everyone's desires as much as possible. The goal is to ensure coexistence among groups with differing values, minimizing conflict over competing visions for the future.

A clear positive vision is important: in times of uncertainty we need to know, and fully imagine, how things can go well just as much as how they can go badly. Positive visions can inspire people to make good choices.

Spheres of control and influence

The basic idea is to build on what we already have for groups and creatures with different intellectual and technological capabilities. For humans there is modern civilization, then say the Amish, then uncontacted tribes. You can take this further and include nature. At one end of the spectrum are great apes with complex social structures, followed by ecosystems dominated by insects, single-celled organisms, and finally lifeless environments.

In most of these cases we have the concept that moving a place or group up the scale is not done without thought.

You can start with bringing life to a lifeless place, say tardigrades to the Moon. Some people think this is spoiling a pristine environment, and that the Moon has some right to remain pristine from now to the end of time (I don’t feel that way). Then there is the concept of invasive species, even if it is one bacterium or very simple organism replacing another. Many people would be strongly against planting a thriving forest in one of the dry valleys of Antarctica if it became possible, even though there would be more varied life there as a result.

We also respect the rights of groups of humans that don’t want more technology, starting with the obvious step of leaving uncontacted tribes mostly alone, through to the generally positive sentiment towards the Amish, as far as I know. Additionally, effort is made to let indigenous people keep their historical way of life where possible. For example, if a group has been using a fishing technique for hundreds of years, they often get that right protected going forward. They may get first rights to fishing quotas, and more effective fishing techniques in the area would not be allowed.

Can we apply such a system to groups today? The difference, from most people’s point of view, is that they would no longer be on the most extreme tech frontier; they would be Amish in many ways. If we were to try to apply this principle, then AI would not be allowed to disrupt some professions, groups or regions.

A clear way to segregate is by physical location. Let’s consider starships first, at the end stage of the Singularity. It should be clear that non-biological craft will be able to withstand greater acceleration and reproduce faster. Such mind uploads/AI will not be taking the place of biological humans, and they will take >99% of future humanity’s territory. Even if the biological humans take all that they can expand to, that is still far less than the non-biological share. You could then restrict significantly AI-enhanced humans, with the likes of Neuralink, to new societies where most if not all people had them, say space colonies or new cities in the desert.

The difficulty is deciding how to achieve this. The first neurally enhanced human can’t live in their own city. However we could more feasibly have rules that superintelligences don’t run non-enhanced countries.

TAI, or mind uploads soon to become superintelligent, could be restrained first by the kind of work they are allowed to do, and then by physical location. In the comedy TV series “Upload”, mind uploads are not allowed to work, including, I think, writing software.

While on earth, they could be limited to designing and building space infrastructure, curing aging and disease, enabling mind uploads if not already created.

We will probably want them to reverse or mitigate the negative environmental impact of our current tech. How far to take this is an open question. Do we want them to enable people to drive large SUVs, or to overfish, because it is now part of “historical” culture? That is, to create synthetic fuels, fix atmospheric CO2 levels, and breed and release animals to hunt and fish? As a society we are OK with indigenous practices that are sustainable, but what about protecting more modern ones that are not? Old car culture will soon look a lot like past fishing practices. It already does to my young son; he just cannot understand why anyone would want a loud car or motorbike.

Current unethical practices (factory farming, harmful culture)

It is not so clear how things will play out with existing practices that are arguably unethical. One approach is to ignore them, because they are insignificant compared to the consciousness to be created on the billions of stars probably available. That is, let them continue but not spread. You can keep factory farming, but only on Earth, with resources you can sustainably create without AI help. Some people view evolved life itself like this, claiming it is net negative and suffers more than it enjoys itself. In that case we would leave nature alone, but not spread it to the stars; instead we would only spread the parts that had been adapted to have an ethically positive existence.

Incompatible desires

This system could work for many people’s desires, but not everyone’s. Few people want to spread factory farming or slavery to the stars, but some regard any human expansion as inherently bad. Those people want to stop others from going their own way: e.g. “you can’t colonize Mars, as it isn’t yours to go to, but it is our right to stop you.” This could apply to current conflicts as well: “we desire to destroy this group or people.”

We are not all in this together anymore

Currently it is fashionable to say that space exploration must be for the good of humanity, or of all people on Earth, and that “we are all in this together”. If instead you explicitly recognize groups’ rights to go their own way, this does not apply so much anymore. Instead of arguing that their values and lifestyle are best, people could recognize that they are destined for different places post-Singularity.

Swapping between tech spheres?

The different tech spheres would need to decide when people can swap between them. For example, someone born into a 2020 tech level, without AGI and anti-aging, may decide at 70 that they would rather move to the Moon and get enhancement and rejuvenation therapy than die of old age. Advancing to a higher technological sphere, such as adopting Neuralink or immortality, seems more feasible than reverting to a lower-tech lifestyle. Lower-tech groups may not allow people from higher-tech groups to join.

No spoilers?

Part of the culture of a sphere could be that its inhabitants want to discover scientific truths for themselves. So there would be no sharing of the solution to the Riemann hypothesis, or even of whether it can be solved, from the superintelligences to the others.

Summary

The main point of this article is that we need a collective vision for a positive Singularity, and it needs to respect people’s different values as much as possible. At present, if most people think about it at all, they probably assume a high degree of technological conformity will be enforced on everyone, with a common set of values. Maybe this is how things will play out, but other options should be properly considered. It is easier to see it happening with some alignment paths and plans than with others.
