TechCrunch News
The ‘OpenAI Files’ push for oversight in the race to AGI

“The OpenAI Files” is an archival project jointly launched by two nonprofit organizations, the Midas Project and the Tech Oversight Project, aimed at documenting the governance, leadership, and cultural problems OpenAI has faced in the course of AI development. The project focuses on the enormous potential impact of artificial intelligence and calls for stricter oversight of OpenAI and other AI leaders. The files describe in detail the structural reforms OpenAI has made to appease investors in its pursuit of AGI, as well as problems with its safety evaluation processes and leadership integrity. By surfacing these issues, the project hopes to focus the industry’s attention on responsibility, ethics, and shared benefits in AI development, steering the technology in a more responsible direction.

🤔 **Project background and goals**: “The OpenAI Files” aims to expose key problems in OpenAI’s development of artificial intelligence, particularly concerns about governance, leadership, and organizational culture. Its main goals are to raise public awareness and to propose a path forward for OpenAI and other AI leaders centered on responsible governance, ethical leadership, and shared benefits.

💰 **Investor pressure and structural reform**: OpenAI began as a nonprofit and had planned to cap investor profits at 100x, so that the proceeds from achieving AGI would benefit all of humanity. To appease investors, however, OpenAI reversed course and announced plans to remove the profit cap, raising questions about the company’s governance and priorities.

⚠️ **Safety evaluations and leadership concerns**: “The OpenAI Files” highlights shortcomings in OpenAI’s safety evaluation processes and a “culture of recklessness” inside the company. The files also point to potential conflicts of interest involving OpenAI board members and Sam Altman himself, as well as questions about Altman’s integrity, all of which raise concerns about OpenAI’s leadership and the transparency of its decision-making.

OpenAI CEO Sam Altman has said humanity is only years away from developing artificial general intelligence that could automate most human labor. If that’s true, then humanity also deserves to understand and have a say in the people and mechanics behind such an incredible and destabilizing force. 

That is the guiding purpose behind “The OpenAI Files,” an archival project from the Midas Project and the Tech Oversight Project, two nonprofit tech watchdog organizations. The Files are a “collection of documented concerns with governance practices, leadership integrity, and organizational culture at OpenAI.” Beyond raising awareness, the goal of the Files is to propose a path forward for OpenAI and other AI leaders that focuses on responsible governance, ethical leadership, and shared benefits.

“The governance structures and leadership integrity guiding a project as important as this must reflect the magnitude and severity of the mission,” reads the website’s Vision for Change. “The companies leading the race to AGI must be held to, and must hold themselves to, exceptionally high standards.”

So far, the race to dominance in AI has resulted in raw scaling — a growth-at-all-costs mindset that has led companies like OpenAI to hoover up content without consent for training purposes and build massive data centers that are causing power outages and increasing electricity costs for local consumers. The rush to commercialize has also led companies to ship products before putting in necessary safeguards, as pressure from investors to turn a profit mounts.

That investor pressure has shifted OpenAI’s core structure. The OpenAI Files detail how, in its early nonprofit days, OpenAI had initially capped investor profits at a maximum of 100x so that any proceeds from achieving AGI would go to humanity. The company has since announced plans to remove that cap, admitting that it has made such changes to appease investors who made funding conditional on structural reforms. 

The Files highlight issues like OpenAI’s rushed safety evaluation processes and “culture of recklessness,” as well as the potential conflicts of interest of OpenAI’s board members and Altman himself. They include a list of startups that might be in Altman’s own investment portfolio that also have overlapping businesses with OpenAI.

The Files also call into question Altman’s integrity, which has been a topic of speculation since senior employees tried to oust him in 2023 over “deceptive and chaotic behavior.” 

 “I don’t think Sam is the guy who should have the finger on the button for AGI,” Ilya Sutskever, OpenAI’s former chief scientist, reportedly said at the time.

The questions and solutions raised by the OpenAI Files remind us that enormous power rests in the hands of a few, with little transparency and limited oversight. The Files provide a glimpse into that black box and aim to shift the conversation from inevitability to accountability. 
