The Networking Nerd — July 5, 2024
Copilot Not Autopilot

Using autopilot and copilots in aviation as an analogy, this article examines the role AI plays in our work. The author argues that "copilot" is a better metaphor for AI than "autopilot" because it emphasizes human-machine collaboration rather than outright replacement. AI can handle tedious tasks, freeing people to focus on more critical work, and it provides an extra check against errors.

🤔 The "copilot" trend in AI applications: Many software products are adding AI features branded as "copilots." The author argues this trend is not mere brand borrowing; it reflects AI's supporting role in our work, much like a copilot on an aircraft.

✈️ Autopilot vs. copilot: Autopilot systems can handle simple tasks, such as landing a plane automatically, but humans still need to intervene at critical moments. A copilot, by contrast, plays an assisting and cross-checking role, helping the pilot through more complex operations.

✍️ The limits of AI writing: Using AI writing as an example, the author notes that while AI can generate content, its output quality is inconsistent and it lacks human creativity and judgment. AI-written text therefore needs human review and editing to ensure accuracy and quality.

🤝 The future of human-machine collaboration: AI should assist humans rather than replace them outright. Treating AI as a "copilot" lets people focus on more creative and strategic work, improving both efficiency and output.

🚀 Positive AI adoption: The author believes that framing AI as a "copilot" encourages adoption, makes people more willing to accept AI's help, and lets us build a better future together.

I’ve noticed a trend recently with a lot of AI-related features being added to software. They’re being branded as “copilot” solutions. Yes, Microsoft Copilot was the first to use the name and the rest are just trying to jump in on the brand recognition, much like using “GPT” last year. The word “copilot” is so generic that it’s unlikely to be trademarked without adding more, like the company name or some other unique term. That made me wonder if the goal of using that term was simply to cash in on brand recognition or if there was more to it.

No Hands

Did you know that an airplane can land entirely unassisted? It’s true. It’s a feature commonly called Auto Land and it does exactly what it says. It uses the airport’s Instrument Landing System (ILS) to land automatically. Pilots rarely use it for a variety of reasons, including the need for fine last-minute adjustments during a very stressful part of the flight as well as the equipment requirements, such as a fairly modern ILS system. That’s to say nothing of the fact that using Auto Land snarls airport traffic, since other planes must be held outside ILS range to ensure only one aircraft can use the system at a time.

The whole thing reminds me of how autopilot is used on most flights. Pilots usually take the controls during takeoff and landing, the two most critical phases of flight. For the rest, autopilot is used much of the time. Those are the boring stretches where you’re just flying a straight line between waypoints on your flight plan. That’s something automated controls excel at. Pilots can monitor but don’t need to have their full attention on the readings every second of the flight.

Pilots will tell you that taking the controls for the approach and landing is just smart for many reasons, chief among them that it’s something they’re trained to do. More importantly, it places the overall control of the landing in the hands of someone who can think creatively and isn’t just relying on a script and some instrument readings to land. Yes, that is what ILS was designed to do, but someone should always be there to ensure that what’s been sent is what should be followed.

Pilot to Copilot

As you can guess, the parallels in this process for using AI in your organization are a good match. AI may have great suggestions and may even come up with more novel ways of making you more productive but it’s not the only solution to your problems. I think the copilot metaphor is perfectly illustrated with the rush to have GPT chatbots write reports and articles last year.

People don’t like writing. At least, that’s the feeling that I got when I saw how many people were feeding prompts to OpenAI and having it do the heavy lifting. Not every output was good. Some of it was pretty terrible. Some of it was riddled with errors. And even the things that looked great still had that aura of something like the uncanny valley of writing. Almost right but somehow wrong.

Part of the reason for that was that people simply assumed the AI output was better than anything they could have come up with and did no further editing to the copy. I barely trust my own skills to publish something with minimal editing. Why would I trust a know-it-all computer algorithm? Especially with something that has technical content? Blindly accepting an LLM’s attempt at content creation is just as crazy as assuming there’s no need to double-check math calculations even when the result is outside of your expectations.

Copilot works for this analogy because copilots are there to help and to be a check against error. The old adage of “trust but verify” is absolutely the way they operate. No pilot would assume they were infallible and no copilot would assume everything the pilot said was right. Human intervention is still necessary to make sure the output matches the desired result. The biggest difference today is that when AI art generation or content creation fails to produce the desired result, the cost is wasted time. When an autopilot on an airliner makes a mistake during landing, the results are far more horrific.

People want to embrace AI to take away the drudgery of their jobs. It’s remarkably similar to how automation was going to take away our jobs before we realized it was really going to take away the boring, repetitive parts of what we do. Branding AI as “autopilot” will have negative consequences for adoption because people don’t like the idea of a computer or an algorithm doing everything for them. Copilots, however, are helpful and can take care of boring or menial tasks, leaving you free to concentrate on the critical parts of your job. It’s not going to replace us so much as help us.


Tom’s Take

Terminology matters. Autopilot is cold and restrictive. Having a copilot sounds like an adventure. Companies are wise not to encourage the assumption that AI is going to take over jobs and eliminate workers. The key is that people should see the solution as offering a way to offload tasks and ask for help when needed. It’s a better outcome for the people doing the job as well as the algorithms that are learning along the way.
