AI News · March 10
From punch cards to mind control: Human-computer interactions

This article reviews the evolution of human-computer interfaces, from early punch cards to keyboards and mice, and on to graphical user interfaces and touchscreens, with each advance greatly improving computers' accessibility and ease of use. As artificial intelligence converges with extended reality, voice recognition and AI chatbots have emerged to offer users more natural ways of interacting. In the future, emerging technologies such as brain-computer interfaces promise to let users control computers by thought alone, further blurring the boundary between digital and physical reality and ultimately delivering a seamless human-computer experience.

🧮 Early computers relied on punch cards for data input and binary computation, a process that was cumbersome and error-prone. ENIAC, configured by manually setting switches and plugging in patch cords, was an improvement but still far from convenient.

🖱️ The graphical user interface (GUI) and the mouse replaced text commands with icons, menus, and windows, dramatically lowering the barrier to using a computer and enabling its spread into homes and offices. Touchscreens simplified interaction further and drove the rise of the smartphone.

🤖 The convergence of artificial intelligence (AI) and extended reality (XR) enables more natural interaction through voice recognition and AI chatbots. AR overlays digital information onto the physical environment, while XR uses eye-tracking, gestures, and haptic feedback to let users interact with digital objects in physical space. Mawari Network is using XR to bring AI agents and chatbots into the real world, creating more meaningful interactions.

🧠 Brain-computer interface (BCI) technology uses electrodes placed on the scalp to pick up the electrical signals produced by the brain, promising control of computers by thought alone. Although still in its early stages, it points toward the ultimate form of human-computer interaction.

The way we interact with our computers and smart devices is very different from what it was in years past. Over the decades, human-computer interfaces have transformed, progressing from simple cardboard punch cards to keyboards and mice, and now to extended reality-based AI agents that can converse with us in the same way we do with friends.

With each advance in human-computer interfaces, we're getting closer to the goal of seamless interaction with machines, making computers more accessible and better integrated into our lives.

Where did it all begin?

Modern computers emerged in the first half of the 20th century and relied on punch cards to feed data into the system and enable binary computations. The cards had a series of punched holes, and light was shone at them. If the light passed through a hole and was detected by the machine, it represented a “one”. Otherwise, it was a “zero”. As you can imagine, it was extremely cumbersome, time-consuming, and error-prone.
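The hole-detection scheme described above can be sketched in a few lines of Python. The card format here (an 'O' for a punched hole where light passes through, '.' for solid card) is purely illustrative, not a real standard such as the 80-column IBM card:

```python
def read_row(row: str) -> list[int]:
    """Convert one card row into bits: hole (light detected) -> 1, no hole -> 0."""
    return [1 if ch == 'O' else 0 for ch in row]

def row_to_int(row: str) -> int:
    """Interpret a row's holes as a single binary number, most significant bit first."""
    value = 0
    for bit in read_row(row):
        value = (value << 1) | bit
    return value

card_row = "O.O.O"          # holes in positions 0, 2, and 4
print(read_row(card_row))   # [1, 0, 1, 0, 1]
print(row_to_int(card_row)) # 21 (binary 10101)
```

Every row of every card had to be punched by hand, so a single misplaced hole silently changed the number being read, which is why the process was so error-prone.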

That changed with the arrival of ENIAC, or Electronic Numerical Integrator and Computer, widely considered to be the first “Turing-complete” device that could solve a variety of numerical problems. Instead of punch cards, operating ENIAC involved manually setting a series of switches and plugging patch cords into a board to configure the computer for specific calculations, while data was inputted via a further series of switches and buttons. It was an improvement over punch cards, but not nearly as dramatic as the arrival of the modern QWERTY electronic keyboard in the early 1950s.

Keyboards, adapted from typewriters, were a game-changer, allowing users to input text-based commands more intuitively. But while they made programming faster, accessibility was still limited to those who knew the highly technical commands required to operate computers.

GUIs and touch

The most important development in terms of computer accessibility was the graphical user interface or GUI, which finally opened computing to the masses. The first GUIs appeared in the late 1960s and were later refined by companies like IBM, Apple, and Microsoft, replacing text-based commands with a visual display made up of icons, menus, and windows.

Alongside the GUI came the iconic “mouse”, which enabled users to “point-and-click” to interact with computers. Suddenly, these machines became easily navigable, allowing almost anyone to operate one. With the arrival of the internet a few years later, the GUI and the mouse helped pave the way for the computing revolution, with computers becoming commonplace in every home and office.

The next major milestone in human-computer interfaces was the touchscreen, which first appeared in the late 1990s and did away with the need for a mouse or a separate keyboard. Users could now interact with their computers by tapping icons on the screen directly, pinching to zoom, and swiping left and right. Touchscreens eventually paved the way for the smartphone revolution that started with the arrival of the Apple iPhone in 2007 and, later, Android devices.

With the rise of mobile computing, the variety of computing devices evolved further, and in the late 2000s and early 2010s, we witnessed the emergence of wearable devices like fitness trackers and smartwatches. Such devices are designed to integrate computers into our everyday lives, and it’s possible to interact with them in newer ways, like subtle gestures and biometric signals. Fitness trackers, for instance, use sensors to keep track of how many steps we take or how far we run, and can monitor a user’s pulse to measure heart rate.
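The step-counting idea mentioned above can be approximated with a simple threshold-crossing rule over accelerometer readings. The threshold and sample values below are illustrative only; real trackers use calibrated 3-axis data and adaptive filtering:

```python
def count_steps(samples: list[float], threshold: float = 1.2) -> int:
    """Count each upward crossing of the threshold as one step."""
    steps = 0
    above = False
    for magnitude in samples:
        if magnitude > threshold and not above:
            steps += 1       # rising edge: a new peak begins
            above = True
        elif magnitude <= threshold:
            above = False    # signal dropped back below the threshold
    return steps

# Simulated acceleration magnitudes (in g); peaks above 1.2 g mark foot strikes.
readings = [1.0, 1.3, 1.0, 0.9, 1.4, 1.1, 1.5, 1.0]
print(count_steps(readings))  # 3
```

Tracking the `above` flag ensures a single sustained peak counts as one step rather than once per sample, which is the core trick behind even this toy detector.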

Extended reality & AI avatars

In the last decade, we also saw the first mainstream AI voice assistants, with early examples being Apple’s Siri and Amazon’s Alexa. These AI chatbots use voice recognition technology to enable users to communicate with their devices using their voice.

As AI has advanced, these systems have become increasingly sophisticated and better able to understand complex instructions or questions, and can respond based on the context of the situation. With more advanced chatbots like ChatGPT, it’s possible to engage in lifelike conversations with machines, eliminating the need for any kind of physical input device.

AI is now being combined with emerging augmented reality and virtual reality technologies to further refine human-computer interactions. With AR, we can overlay digital information on top of our physical environment. This is enabled by headsets such as the Oculus Rift, Microsoft HoloLens, and Apple Vision Pro, and further pushes the boundaries of what’s possible.

So-called extended reality, or XR, is the latest take on the technology, replacing traditional input methods with eye-tracking and gestures, and adding haptic feedback, so users can interact with digital objects in physical environments. Instead of being restricted to flat, two-dimensional screens, our entire world becomes a computer through a blend of virtual and physical reality.

The convergence of XR and AI opens the doors to more possibilities. Mawari Network is bringing AI agents and chatbots into the real world through the use of XR technology. It’s creating more meaningful, lifelike interactions by streaming AI avatars directly into our physical environments. The possibilities are endless: imagine an AI-powered virtual assistant standing in your home, a digital concierge that meets you in the hotel lobby, or even an AI passenger that sits next to you in your car, directing you around the worst traffic jams. Through its decentralised physical infrastructure network (DePIN), it’s enabling AI agents to drop into our lives in real time.

The technology is nascent but it’s not fantasy. In Germany, tourists can call on an avatar called Emma to guide them to the best spots and eateries in dozens of German cities. Other examples include digital popstars like Naevis, which is pioneering the concept of virtual concerts that can be attended from anywhere.

In the coming years, we can expect to see this XR-based spatial computing combined with brain-computer interfaces (BCIs), which promise to let users control computers with their thoughts. BCIs use electrodes placed on the scalp to pick up the electrical signals generated by our brains. Although it’s still in its infancy, this technology promises to deliver the most effective human-computer interactions possible.
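As a very rough illustration of the idea, a BCI decision rule can be sketched as comparing the power of a scalp signal against a resting baseline. Everything below (the simulated oscillation, the power measure, and the 2x threshold) is a toy assumption; production BCIs rely on multi-channel EEG, careful filtering, and trained classifiers:

```python
import math

def band_power(signal: list[float]) -> float:
    """Mean squared amplitude: a crude proxy for signal power."""
    return sum(s * s for s in signal) / len(signal)

def detect_intent(signal: list[float], baseline: float) -> bool:
    """Fire a 'command' when power clearly exceeds the resting baseline."""
    return band_power(signal) > 2 * baseline

# Simulate a 10 Hz oscillation (roughly an alpha-band rhythm) whose
# amplitude grows when the user "concentrates".
rest = [0.5 * math.sin(2 * math.pi * 10 * t / 250) for t in range(250)]
focus = [1.5 * math.sin(2 * math.pi * 10 * t / 250) for t in range(250)]

baseline = band_power(rest)
print(detect_intent(rest, baseline))   # False
print(detect_intent(focus, baseline))  # True
```

Even real systems reduce to a version of this loop: measure a feature of the brain signal, compare it against a calibrated baseline, and map the result to a command.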

The future will be seamless

The story of the human-computer interface is still being written, and as our technological capabilities advance, the distinction between digital and physical reality will become increasingly blurred.

Perhaps one day soon, we’ll be living in a world where computers are omnipresent, integrated into every aspect of our lives, similar to Star Trek’s famed holodeck. Our physical realities will be merged with the digital world, and we’ll be able to communicate, find information, and perform actions using only our thoughts. This vision would have been considered fanciful only a few years ago, but the rapid pace of innovation suggests it’s not nearly so far-fetched. Rather, it’s something that the majority of us will live to see.

(Image source: Unsplash)

The post From punch cards to mind control: Human-computer interactions appeared first on AI News.
