a16z · February 19
AI and the Promise of Hardware Iteration at Software Speed

This article examines the application of machine learning (ML) to physics simulation, with the goal of accelerating the hardware development iteration cycle. Traditional hardware development relies on time-consuming simulation software, whereas ML learns a transfer function through pre-training, cutting simulation time dramatically, in some cases from days to seconds. Although startups building ML physics simulation are emerging, industry adoption still faces challenges, including engineers' limited awareness of ML tools, a lack of evaluation benchmarks, distrust of "black box" tools, and poor user experience. The article suggests that startups adopt strategies such as white-glove service, free access for university teams, and positioning the product as an early-stage design tool in order to overcome these obstacles and ultimately speed up hardware iteration.

🚀 ML physics simulation learns a transfer function through pre-training, significantly accelerating simulation and reducing the time required from days to seconds, a major gain in efficiency.

🧪 Existing analysis tools cannot take full advantage of GPUs because their solvers are generally incompatible and would need to be rewritten. This points to an industry lagging in its use of modern hardware, leaving room for innovation.

🎓 Startups can overcome the barriers to industry adoption of ML physics simulation through strategies such as offering white-glove service, giving the software to university teams for free, and positioning the product as an early-stage design tool.

🚗 The automotive industry is more willing to take on risk than aerospace, and R&D teams inside large organizations are less constrained by strict processes, making them more likely to adopt new technology that helps them work through conceptual designs.

Software simulations are central to developmental engineering in every hardware industry. In aerospace, we simulate the vibrations and forces of a rocket launch and the on-orbit thermal balances; in aviation, lift and drag over wings; in medicine, drug delivery through the bloodstream. Today, these simulations are based on fundamental physics equations, mathematically derived centuries ago by people like Newton and Bernoulli. They often involve nonlinear partial differential equations (PDEs) and frequently have no closed-form analytical solutions. As a result, these simulations use iterative numerical approximations and are incredibly time- and compute-intensive — and often inaccurate. This is all about to change. Machine learning methods applied to physics will speed up simulation times by orders of magnitude, have the potential to improve accuracy, and will revolutionize the engineering development process. Next-generation physics ML simulations are the first step toward enabling hardware iteration at software speeds.

The traditional engineering process

Hardware development today follows an iterative process from design to analysis (simulation), followed by prototype and test. A single iteration cycle can take weeks to months. And although there are improvements to be made across the entire product development flow, simulation software may be the first frontier, as there are promising early signals in the world of physics machine learning.

Current computational physics simulations, for example in fluid dynamics, work by discretizing the geometry of interest (say, an airplane wing) into a mesh of small polygonal elements and nodes. Numerical solvers then combine the fundamental equations describing fluid flow (the Navier-Stokes equations) with initial and boundary conditions and iterate toward an approximate solution. Models often have millions (or even billions) of nodes over which to evaluate solutions, and simulations can easily take hours (and sometimes days) to complete. For engineering teams, this process is incredibly time-consuming, as each design modification results in days to weeks of simulation time. As a result, only a limited number of iterations can take place before engineering teams are forced to accept their working design and move to prototype development.

In part, this is because the market for simulation software is dominated by incumbent juggernauts with multibillion-dollar market caps, such as Ansys and Siemens, whose legacy products have historically prioritized stability and incremental improvements over innovation. Many of these tools are written in ancient programming languages such as Fortran, lack intuitive UX, and fail to fully take advantage of modern hardware like GPUs. The field has been stagnant for decades and is ripe for disruption. But if simulations took seconds instead of days, engineers would be unshackled. They’d have the freedom to get more creative, explore design spaces, optimize in software, and, overall, carry out many more iterations before delivering a product.
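To make the cost concrete, here is a minimal sketch of the kind of iterative relaxation that underlies many numerical solvers. It is deliberately simplified (a toy 2D Laplace problem solved with Jacobi iteration, not a Navier-Stokes solver), and the grid size, tolerance, and boundary values are arbitrary illustrative choices, but it shows the basic pattern: sweep every node in the mesh, update it from its neighbors, and repeat until the solution stops changing.

```python
import numpy as np

def jacobi_laplace(nx=100, ny=100, tol=1e-5, max_iters=50_000):
    """Toy steady-state solver: Laplace's equation on a 2D grid via Jacobi iteration.

    Every outer iteration touches every interior node, and thousands of
    iterations are typically needed -- which is why real 3D meshes with
    millions of nodes can take hours or days.
    """
    u = np.zeros((ny, nx))
    u[0, :] = 1.0  # fixed boundary condition on one edge (illustrative)

    for it in range(max_iters):
        # Update each interior node from the average of its four neighbors.
        u_new = u.copy()
        u_new[1:-1, 1:-1] = 0.25 * (
            u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
        )
        # Stop once the largest per-node change falls below the tolerance.
        if np.max(np.abs(u_new - u)) < tol:
            return u_new, it
        u = u_new

    return u, max_iters

field, iterations = jacobi_laplace()
print(f"Stopped after {iterations} iterations on a {field.shape} grid")
```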
ML-based approaches to simulation

Machine learning approaches to physics simulations work broadly the same way as computational physics approaches: creating a 3D embedding of the model geometry (either with point clouds, graphs, or other similar techniques); encoding initial and boundary conditions; and outputting results in terms of pressure and velocity fields or stress plots.

The key difference is that machine learning approaches are pre-trained to learn a transfer function from initial conditions to outputs based on a trove of (generally) computational physics simulation data. At runtime, this means that machine learning simulations are only computing a forward pass through a neural network rather than a huge number of iterative computations at every node in the mesh. This can speed things up by many orders of magnitude. Simulations that previously took days can be solved in seconds. And simulations that previously were constrained to lower resolutions to enable reasonable compute time can be run at far greater resolutions, resulting in greater accuracy.
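As a loose illustration of the "transfer function" idea, the sketch below stands in for how a learned surrogate is used at inference time: the mesh (here just a point cloud of node coordinates) and the global flow conditions are encoded as tensors, and the predicted field comes out of a single forward pass. This is a hypothetical example using a small PyTorch MLP; the class name, input conventions, and condition parameters are invented for illustration, and production systems use far more sophisticated architectures (graph networks, neural operators) trained on large corpora of solver data, which is omitted here.

```python
import torch
import torch.nn as nn

class FieldSurrogate(nn.Module):
    """Hypothetical surrogate: maps (node position, flow conditions) -> predicted field values.

    A stand-in for learned simulators; real systems typically use graph neural
    networks or neural operators trained on large sets of solver runs.
    """
    def __init__(self, n_conditions: int = 3, hidden: int = 256, n_outputs: int = 4):
        super().__init__()
        # Input per node: 3D coordinates plus global conditions (e.g., inlet velocity,
        # angle of attack, fluid density -- illustrative choices).
        self.net = nn.Sequential(
            nn.Linear(3 + n_conditions, hidden),
            nn.GELU(),
            nn.Linear(hidden, hidden),
            nn.GELU(),
            nn.Linear(hidden, n_outputs),  # e.g., pressure + 3 velocity components
        )

    def forward(self, nodes: torch.Tensor, conditions: torch.Tensor) -> torch.Tensor:
        # Broadcast the global conditions to every node, then predict all nodes at once.
        cond = conditions.expand(nodes.shape[0], -1)
        return self.net(torch.cat([nodes, cond], dim=-1))

# Inference is a single forward pass over the whole mesh -- no per-node iteration.
model = FieldSurrogate()                       # in practice: load pre-trained weights
nodes = torch.rand(100_000, 3)                 # 100k mesh node coordinates (placeholder geometry)
conditions = torch.tensor([[30.0, 5.0, 1.2]])  # e.g., inlet speed, angle of attack, density
with torch.no_grad():
    field = model(nodes, conditions)           # shape: (100_000, 4)
print(field.shape)
```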
There is some precedent here. For decades, weather forecasting was performed in much the same way as engineering simulations — by discretizing the globe into small sections and computing fundamental, nonlinear partial differential equations of momentum, surface pressure, temperature, and more in a process that required some of the most advanced supercomputers in the world. In 2023, AI weather models separately released by Huawei and Google DeepMind, trained on decades of weather observations and simulations, improved compute time by five orders of magnitude while achieving accuracy comparable to the best existing models.

There are early signs of momentum toward disruption in this space, with a number of startups emerging to take on the incumbents. Approaches differ in terms of ML architecture, and broadly can be segmented into architectures that rely more on fundamental physics vs. those that correlate to simulation or experimental data, as well as approaches that focus on a single domain vs. those that provide general multiphysics simulation capabilities. NVIDIA has also released Modulus, an open-source physics ML platform that allows users to select ML architectures and train models. Here are some of the startups working on this technology.

Progress, adoption, and challenges

However, while many physics ML startups exist today with very impressive technology, these approaches are still nascent and lack significant utilization among engineers. One of the reasons for this is that startups tend to focus less on what we view as one of the key challenges: overcoming barriers to adoption. Thus far, for example, most improvements in simulation have come through transitioning compute from dedicated desktops to the cloud, rather than from novel improvements in software capabilities or ML models. Notably, existing analysis tools can’t benefit from the boom in GPU technology, as their solvers are generally incompatible and would need to be rewritten.

In addition to building great new technologies, we also hope to see more startups laser-focused on the small things that will help customers fully utilize their products. Here are some insights from dozens of conversations with potential customers of this technology. Our advice to founders is to do everything in their power — on the product front, as well as from an educational perspective — to help potential customers overcome these hurdles and biases:

Awareness and training

Hardware engineers tend to have a limited understanding of the current capabilities of different ML tools, and where they might perform better than traditional simulation tools. Engineers also generally lack the right training and skillset to use ML physics simulations in their current form, as nearly all have been trained only on traditional tools. Thus, the ROI on AI is not clear for engineering leaders who make software purchasing decisions.

Evaluation and trust

Without good benchmarks, ML physics simulation tools are tough for engineering leaders to evaluate, and these people often do not have the time or bandwidth to invest in determining what the right tool is. Engineers can also be skeptical of “black box” ML tools that don’t directly use physics and first principles to produce simulation results in a fully predictable manner. The reluctance to trust new tools is even greater if the design being simulated carries humans or has a lot of kinetic energy.

UX and speed

New platforms, while enabling significantly faster simulation times, are often unintuitive and present new users with a steep learning curve. Anyone without solid programming skills and/or a working knowledge of ML techniques can struggle to see the benefits early on and is at risk of abandoning a new system prematurely.

Go-to-market strategies

We’re already seeing startups experiment with a suite of go-to-market strategies to overcome some of these obstacles. Here are the strategies we believe to be most promising:

- Offering white-glove / full-service models initially (even going so far as embedding engineers), before transitioning to self-service. This provides companies with real-world feedback that can help them improve UX, and other issues, for when they’re ready to scale their sales motion.
- Getting software into the hands of every college rocket and racecar team for free, and letting them bring it to industry when they graduate. Creating evangelists who are appalled at the speeds of legacy industry tools helps accelerate adoption from the bottom up, much in the way that Benchling uses a free tier for academic researchers to drive product-led growth in the biotech industry. Additionally, startups that can find their way into university class syllabi (where engineers currently learn legacy tools) will likely be very effective in the long run.
- Marketing the product as an early-stage design tool that helps engineers rapidly assess initial designs, evaluate trade spaces, and respond quickly to shifting requirements. Once established in an enterprise, startups can continue to introduce features that address ever-greater parts of the product-development process. This also gives startups a chance to build trust with engineers around the accuracy of the simulations, working toward a long-term goal of being an end-to-end simulation tool.
- Focusing on the industries and groups within enterprises most likely to adopt. For example, the automotive sector tends to be more risk tolerant than the aerospace sector. R&D groups within larger organizations tend to be less tied to strict processes and could be more open to new technology that helps them progress through conceptual designs. After adopting, these groups could also help champion new tools to the broader organization.

Hardware iteration at software speeds

While simulations are a core part of engineering development, decades without innovation from incumbents have left product iteration cycles long, tedious, and stagnant. Emerging physics machine learning simulation technology has the potential to revolutionize hardware iteration, and to short-circuit the time it takes for products to go from design to production. We are certainly a long way away from hardware iteration at software speeds, and there are other pieces of the puzzle left to solve, including design and rapid mass-manufacturing.
Yet progress is happening quickly, and this future is beginning to look like an inevitability. We can’t wait to live in a world where ideas become reality almost as quickly as we can imagine them. If you are a hardware or simulation engineer, or a founder building in this space, please reach out.
