a16z · February 19
Setting the Agenda for Global AI Leadership: Assessing the Roles of Congress and the States
DeepSeek’s release has prompted renewed American focus on developing globally competitive AI products and underscored the need for a national AI policy. This article examines the respective roles of state and federal governments in regulating AI, argues that the federal government should take the lead in setting policy while states play their traditional roles, and proposes a roadmap for governing AI.

DeepSeek’s release has prompted the US to prioritize developing competitive AI products and to set a national AI policy

The federal government should lead AI regulation, as a patchwork of state laws would interfere with creating a national AI strategy

State governments play an important role in policymaking, such as policing activity within their jurisdictions and experimenting with policy

A national AI policy is critical to American competitiveness and would benefit small technology companies

Governing AI requires both the federal government and state governments to play their parts, including lawmaking and enforcement

DeepSeek’s release has spurred calls for a renewed American focus on developing AI products that are globally competitive. Some policymakers have referred to the developments of the last week as AI’s new Sputnik moment that provides a “wake-up call,” emphasizing the need to “step up our game” to set a national AI policy that puts “American innovation” as the “north star.”

The question, of course, is how we get there. With state legislative sessions now in full swing and a new Congress and Presidential administration beginning in Washington, it is important to consider the respective roles that state and federal governments might play in regulating AI.

As with other new technologies, our federalist system dictates that states and the federal government each occupy an important role in governing the use of artificial intelligence. But a patchwork of state laws that regulate AI development will interfere with the creation of a national AI strategy to produce competitive AI products and establish the US as the clear global leader.

When we set out to win the space race, America’s space policy was not set by Texas or California. Faced with a similar technological challenge today, Congress and the Administration should take the lead in setting national AI policy, including any regulation of the design and development of AI systems. States should play their traditional roles in policymaking: serving as “laboratories” for policy experimentation while policing activity within their jurisdictions.

The role of the federal government

Traditionally, the federal government has taken the lead in establishing national policy in areas where a 50-state patchwork would be harmful for commerce and innovation.
For many technology products, separate governance regimes in New York and Texas, or in neighboring states like Virginia and North Carolina, would degrade the user experience and present challenges for companies trying to design, build, and operate these products across state lines.

Imagine if a messaging company had to offer one version of a product to a user in Florida and a different one to a user in California. Or, what if a Pennsylvania resident travels to Texas, Ohio, or New York? Would they open their phone to find a different messaging experience each time they cross a state border? People expect technology products to deliver information quickly and easily, regardless of where the user lives. State-by-state legal patchworks frustrate that objective. When it’s hard for users to have a consistent product experience across state lines, and hard for companies to offer them, product development stagnates and innovation slows.

To avoid these patchwork scenarios, the Constitution explicitly gives Congress the authority to regulate interstate commerce, and the federal government takes the lead in foreign relations and national security. In the past, when state borders threatened to undermine the adoption of new technologies with the potential to deliver massive economic and social benefits to the nation as a whole, Congress stepped in to establish a national governance regime. For instance, in the 1930s, Congress passed the Communications Act to establish a federal framework to govern telecommunications technology. It updated the law in 1996, with the Telecommunications Act, to account for the rise of the internet.
Creating a national market for telecommunications and information services helped to fuel generations of innovation that established the United States as a global technology leader.

To ensure that Congress has the power to regulate these types of national markets, the Constitution specifies that once Congress acts, federal law becomes the standard if any state law conflicts with it. Even when the federal government does not act, states may be prohibited from enacting laws that place a significant burden on interstate commerce.

The role of state governments

States also have an important role to play in policymaking. States retain the power to police activity within their own jurisdictions, and the Constitution specifies that any power not explicitly given to the federal government is reserved for the states. States have traditionally taken the lead in areas like education and public safety, and they have filed their own cases to address concerns about minors’ online safety.

In some areas of the law, states and the federal government both play a role. Each enforces its own criminal, antitrust, civil rights, and consumer protection laws. For example, state attorneys general joined with the FTC and the Department of Justice to bring antitrust cases against Big Tech companies. A recent legal advisory published by the California Attorney General emphasized that California’s existing law in areas like unfair business practices and civil rights can serve to protect consumers from the misuse of AI tools.

A national AI policy is critical for American competitiveness

Congress, the White House, and executive branch agencies are best positioned to take the lead in regulation that will define the AI model market. AI models are critical to America’s national security and competitiveness, to America’s geopolitical objectives, and to the future economic and social welfare of the nation.
If even a handful of states pass laws that establish divergent approaches to AI model governance, it may become difficult for companies to offer AI products across borders, and users may start to see the pace of model development slow.

Slowing innovation isn’t just bad for consumers; it also makes it harder for the United States to compete in AI with other countries. If American developers struggle to build competitive products because they must devote substantial resources to navigating a state regulatory patchwork, then users will simply get their AI products from other countries, potentially including foreign adversaries like China and Russia. The release of DeepSeek emphasizes that this risk is not simply a theoretical possibility. It is our current reality.

Historically, startups, or Little Tech, have been essential to American competitiveness, but they are likely to be hit the hardest by a state-by-state, patchwork approach to regulating AI model development. While large platforms might prefer to avoid dealing with a patchwork of state regulations because it is inconvenient, complicated, and costly, they typically have the resources and experience to manage it. They may have hundreds or even thousands of lawyers on their legal teams, so dedicating a percentage of their time to state legal compliance may not have a significant impact. They also have large engineering teams, so if they need to alter their products to manage changes in state law, they can assign engineers to the task without necessarily impacting core product development.

Startups don’t have these luxuries. They may have minimal legal resources to devote to compliance, and some startups don’t even have a full-time lawyer on staff. If they need to make changes to their product to comply with a new state law, they would need to pull valuable engineers away from working on baseline elements of product development and monetization.
Competition to gain market share – already daunting – becomes even more difficult. Patchworks of state laws may burden large tech platforms, but they have the power to cripple Little Tech and hinder American efforts to compete with AI development in other countries.

A roadmap for governing AI

To provide a consistent national standard that will make it possible for Little Tech to compete in the development and application of AI models, and that will strengthen America’s leadership globally, the federal government should assume the responsibility of enacting laws related to the design, construction, and performance of AI models. Congress should also take the lead in delineating content and intellectual property liability in AI products, as it did in the past in creating similar liability regimes for the internet. Federal agencies like the FTC should play the lead role in enforcing any laws that Congress passes in these areas.

Of course, states have an important role to play as well. They should enforce existing state law in areas like consumer protection, criminal law, civil rights, and antitrust, as they have when other new technologies were introduced. They should also continue to set the terms for important components of corporate law, such as business registration and insurance. Lawmaking in these areas will leave a meaningful footprint for states in AI regulation and will be important in shaping individuals’ experiences with this technology, even if the federal government assumes responsibility for regulating model design, construction, and performance.

States engage not only in the substance of new policy frameworks, but also in the process. States are often referred to as the laboratories of democracy, and they have a rich history of policy experimentation in areas of traditional state lawmaking.
Recently, for instance, several states have enacted regulatory sandboxes, which allow companies to test new products in short-term, modified regulatory settings. Some organizations have started proposing sandboxes as a way to incentivize experimentation in AI. Sandboxes have the potential to produce data that can inform future policymaking, just as clinical trials in medicine produce information that leads to healthier drug development. Given their traditional roles, states are well poised to take an experimental approach to policymaking. Of course, the federal government might also use experimental approaches to explore productive policy related to the design, construction, and performance of AI models.

An additional consideration is timing. A report by the Bipartisan House Task Force on Artificial Intelligence discussed the option of imposing a moratorium on state regulation for a specific period of time, during which policymakers and researchers could gather more data on the costs and benefits of both AI technology and AI regulation. This approach could have the benefit of establishing a national approach to AI while the technology is in a nascent phase, preventing the long-lasting, anticompetitive effects that a state-by-state patchwork is likely to have. At the same time, it leaves the door open to future state regulation once an initial learning period is complete.

Startups have always been the vanguard of American technological supremacy and innovation, from Edison and Ford to Tesla and Airbnb. They will be critical to maintaining our economic competitiveness and protecting our national security as AI rapidly accelerates. To preserve the potential of this new technology, we must focus regulation on its use, rather than its development, and look to both the federal government and state governments to play their respective traditional roles in shaping AI governance.
