少点错误 (LessWrong) · September 15, 2024
How you can help pass important AI legislation with 10 minutes of effort

California Governor Gavin Newsom must decide by September 30 whether to sign SB 1047, a bill that would require developers of large AI models to test their models for potential harms and to adopt safety protocols to prevent misuse. The bill has sparked heated debate: supporters argue it would protect public safety, while opponents worry it would stifle AI innovation.

🤔 **What SB 1047 does:** The bill requires developers of large AI models costing more than $100 million to train to test whether their models could cause severe harm, such as cyberattacks on critical infrastructure or the creation of biological weapons. Developers must also adopt a safety and security protocol detailing how they will take reasonable care to prevent these harms, and publish a copy of that protocol. Companies that violate the act are liable for the resulting damage.

🚀 **Why SB 1047 matters:** The bill marks a significant shift in AI regulation because it holds developers of the largest AI models responsible for their models' safety. Until now, AI policy has relied mainly on government reporting requirements and voluntary commitments from AI developers. SB 1047 would also provide public cloud computing resources to academic researchers and startups, making AI research more accessible, and would establish whistleblower protections for employees at large AI companies.

🤝 **The controversy:** The bill is supported by many academic researchers, employees at major AI companies, and various organizations, but it is opposed by OpenAI, Google, Meta, venture capital firm A16z, and some other academic researchers and organizations. Some argue the bill is too strict and could stifle AI innovation; others believe it would protect public safety.

📢 **How the public can get involved:** If you support the bill, there are several ways to help it pass. You can write a letter to Governor Newsom expressing your support and encourage family and friends to do the same. You can also organize a letter-writing event to get more people involved.

Published on September 14, 2024 10:10 PM GMT

Posting something about a current issue that I think many people here would be interested in. See also the related EA Forum post.

California Governor Gavin Newsom has until September 30 to decide the fate of SB 1047 - one of the most hotly debated AI bills in the world. The Center for AI Safety Action Fund, where I work, is a co-sponsor of the bill. I’d like to share how you can help support the bill if you want to.

About SB 1047 and why it is important

SB 1047 is an AI bill in the state of California. SB 1047 would require the developers of the largest AI models, costing over $100 million to train, to test the models for the potential to cause or enable severe harm, such as cyberattacks on critical infrastructure or the creation of biological weapons resulting in mass casualties or $500 million in damages. AI developers must have a safety and security protocol that details how they will take reasonable care to prevent these harms and publish a copy of that protocol. Companies that fail to perform their duty under the act are liable for resulting harm. SB 1047 also lays the groundwork for a public cloud computing resource to make AI research more accessible to academic researchers and startups, and establishes whistleblower protections for employees at large AI companies.

So far, AI policy has relied on government reporting requirements and voluntary promises from AI developers to behave responsibly. But if you think voluntary commitments are insufficient, you will probably think we need a bill like SB 1047.

If SB 1047 is vetoed, it’s plausible that no comparable legal protection will exist in the next couple of years, as Congress does not appear likely to pass anything like this any time soon.

The bill’s text can be found here. A summary of the bill can be found here. Longer summaries can be found here and here, and a debate on the bill is here. SB 1047 is supported by many academic researchers (including Turing Award winners Yoshua Bengio and Geoffrey Hinton), employees at major AI companies, and organizations like Imbue and Notion. It is opposed by OpenAI, Google, Meta, and venture capital firm A16z, as well as some other academic researchers and organizations. After a recent round of amendments, Anthropic said “we believe its benefits likely outweigh its costs.”

SB 1047 recently passed the California legislature, and Governor Gavin Newsom has until September 30th to sign or veto it. Newsom has not yet said whether he will sign it, but he is being lobbied hard to veto it. The Governor needs to hear from you.

How you can help

If you want to help this bill pass, there are some pretty simple steps you can take to increase that probability, many of which are detailed on the SB 1047 website.

The most useful thing you can do is write a custom letter to Governor Newsom expressing your support for the bill.

Once you’ve written your own custom letter, think of 5 family members or friends who might also be willing to write one. Supporters from California are especially helpful, as are parents and people who don’t typically engage on tech issues. Then help them write theirs!

Organize an event! Consider hosting a letter-writing event to help get even more letters. Please email thomas@safe.ai if you are interested.


