Unite.AI – January 3
New Wave Technology Makes Android Emotions More Natural

 

Researchers at Osaka University have developed an innovative method that makes robots' emotional expression more natural by treating facial expressions as interconnected waves of movement rather than isolated actions. The system combines multiple facial movements, such as breathing and blinking, modulates them as waveforms, and generates expressions dynamically. This approach not only makes transitions between expressions smoother; by letting internal states shape the waveforms, it also makes the robot's emotional expression more authentic. The researchers demonstrated in experiments that the system can effectively express different arousal levels, opening new possibilities for future human-robot interaction.

💡 Traditional robot facial-expression systems rely on pre-programming; expression changes are stiff and lack natural transitions and emotional consistency, which hurts the human-robot interaction experience.

🌊 Osaka University's new technique treats facial expressions as dynamic waves of movement. By modulating the waveforms of breathing, blinking, and other movements, it generates natural, fluid expressions in real time and avoids the mechanical feel of traditional systems.

⚙️ Through "waveform modulation," the system lets the robot's internal state directly shape how expressions appear, making emotional expression more authentic. "Temporal management" and "postural management" modules coordinate the movements of the different parts of the face so that expressions unfold naturally.

😴 The technique can express different arousal states: by adjusting parameters such as breathing rate and blink frequency, it vividly conveys emotional states like excitement or drowsiness, which is essential for scenarios requiring long-term human-robot interaction.

Many people who have interacted with an android that looks incredibly human report that something "feels off." The phenomenon goes beyond mere appearance – it is rooted in how robots express emotions and maintain consistent emotional states, or rather, in their lack of these human-like abilities.

While modern androids can masterfully replicate individual facial expressions, the challenge lies in creating natural transitions and maintaining emotional consistency. Traditional systems rely heavily on pre-programmed expressions, similar to flipping through pages in a book rather than flowing naturally from one emotion to the next. This rigid approach often creates a disconnect between what we see and what we perceive as genuine emotional expression.

The limitations become particularly evident during extended interactions. An android might smile perfectly in one moment but struggle to naturally transition into the next expression, creating a jarring experience that reminds us we are interacting with a machine rather than a being with genuine emotions.

A Wave-Based Solution

This is where some new and important research from Osaka University comes in. Scientists have developed an innovative approach that fundamentally reimagines how androids express emotions. Rather than treating facial expressions as isolated actions, this new technology views them as interconnected waves of movement that flow naturally across an android's face.

Just as multiple instruments blend to create a symphony, this system combines various facial movements – from subtle breathing patterns to eye blinks – into a harmonious whole. Each movement is represented as a wave that can be modulated and combined with others in real-time.

What makes this approach innovative is its dynamic nature. Instead of relying on pre-recorded sequences, the system generates expressions organically by overlaying these different waves of movement. This creates a more fluid and natural appearance, eliminating the robotic transitions that often break the illusion of natural emotional expression.
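
To make the idea concrete, the sketch below shows how overlapping movement waves could be blended into a single actuator command in real time. It is purely illustrative – the function names, parameter values, and the breathing-plus-blink combination are assumptions, not the researchers' actual code.

```python
import math

# Hypothetical sketch: each facial movement contributes a wave, and the
# actuator command is the sum of all currently active waves.

def breathing_wave(t, freq=0.25, amp=0.3):
    # Slow, continuous oscillation standing in for breathing motion.
    return amp * math.sin(2 * math.pi * freq * t)

def blink_wave(t, onset, duration=0.3):
    # A single pulse that closes and reopens the eyelids: 0 -> 1 -> 0.
    if t < onset or t > onset + duration:
        return 0.0
    phase = (t - onset) / duration
    return math.sin(math.pi * phase)

def eyelid_command(t, blink_onsets):
    # Overlay a gentle breathing drift and any active blink pulses.
    value = 0.1 * breathing_wave(t)
    value += sum(blink_wave(t, onset) for onset in blink_onsets)
    return min(value, 1.0)  # clamp to the actuator's range

if __name__ == "__main__":
    onsets = [0.8]  # one spontaneous blink at t = 0.8 s
    for step in range(21):
        t = step * 0.1
        print(f"t={t:.1f}s  eyelid={eyelid_command(t, onsets):.2f}")
```

Because every movement is just another wave being summed in, new behaviors can be layered on without re-authoring the whole expression, which is the property the researchers exploit.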

The technical innovation lies in what the researchers call “waveform modulation.” This allows the android's internal state to directly influence how these waves of expression manifest, creating a more authentic connection between the robot's programmed emotional state and its physical expression.

Image Credit: Hisashi Ishihara

Real-Time Emotional Intelligence

Imagine trying to make a robot express that it is getting sleepy. It is not just about drooping eyelids – it is also about coordinating multiple subtle movements that humans unconsciously recognize as signs of sleepiness. This new system tackles this complex challenge through an ingenious approach to movement coordination.

Dynamic Expression Capabilities

The technology orchestrates nine fundamental types of coordinated movements that we typically associate with different arousal states: breathing, spontaneous blinking, shifty eye movements, nodding off, head shaking, the sucking reflex, pendular nystagmus (rhythmic eye movements), head side swinging, and yawning.

Each of these movements is controlled by what the researchers call a "decaying wave" – a mathematical pattern that determines how the movement plays out over time. These waves are not random; each one is carefully tuned using five key parameters.
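
The article does not name those five parameters, but the general shape of a decaying wave can be pictured as follows; the amplitude, frequency, decay rate, phase, and offset used here are guesses chosen only to illustrate the idea.

```python
import math

def decaying_wave(t, amplitude=1.0, frequency=0.5, decay=0.8, phase=0.0, offset=0.0):
    """Hypothetical decaying wave: an oscillation whose strength fades over time.

    The five keyword arguments stand in for the paper's five tuning
    parameters, which are not listed in the article; treat them as guesses.
    """
    return offset + amplitude * math.exp(-decay * t) * math.sin(
        2 * math.pi * frequency * t + phase)

# A yawn-like movement: large, slow, and quick to die out.
yawn = [round(decaying_wave(step * 0.2, amplitude=1.0, frequency=0.2, decay=0.5), 3)
        for step in range(10)]
print(yawn)
```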

Internal State Reflection

What makes this system stand out is how it links these movements to the robot's internal arousal state. When the system indicates a high arousal state (excitement), certain wave parameters automatically adjust – for instance, breathing movements become more frequent and pronounced. In a low arousal state (sleepiness), you might see slower, more pronounced yawning movements and occasional head nodding.
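
As a rough illustration of that idea, the toy mapping below turns an arousal value into wave parameters – faster, stronger breathing when excited, more yawning and nodding when drowsy. The parameter names and numbers are invented for the example and are not taken from the paper.

```python
# Illustrative only: a toy mapping from an arousal value in [0, 1] to wave
# parameters. The actual adjustment rules used by the Osaka University
# system are not described in the article.

def arousal_to_params(arousal: float) -> dict:
    """Higher arousal -> faster, stronger breathing and blinking;
    lower arousal -> slower breathing, more yawns and head nodding."""
    arousal = max(0.0, min(1.0, arousal))
    return {
        "breath_freq_hz":  0.15 + 0.25 * arousal,   # breathing speeds up when excited
        "breath_amp":      0.2  + 0.4  * arousal,   # and becomes more pronounced
        "blink_rate_hz":   0.2  + 0.3  * arousal,
        "yawn_rate_hz":    0.05 * (1.0 - arousal),  # yawns appear only when drowsy
        "nod_off_rate_hz": 0.03 * (1.0 - arousal),  # occasional nodding when sleepy
    }

print(arousal_to_params(0.9))   # excited
print(arousal_to_params(0.1))   # sleepy
```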

The system achieves this through what the researchers call “temporal management” and “postural management” modules. The temporal module controls when movements happen, while the postural module ensures all the facial components work together naturally.
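
A simplified way to picture that division of labor is sketched below: one object decides when movements fire, and another blends the active waves into a single face pose per control step. The class names, movement subset, and triggering rule are assumptions made for illustration, not the published architecture.

```python
import random

MOVEMENTS = ["breathing", "blinking", "nodding_off", "yawning"]  # subset of the nine

class TemporalManager:
    """Decides *when* each coordinated movement is triggered."""
    def __init__(self, rates_hz):
        self.rates_hz = rates_hz  # per-movement trigger rates

    def triggered(self, dt):
        # Simple probabilistic triggering over one time step of length dt.
        return [m for m in MOVEMENTS
                if random.random() < self.rates_hz.get(m, 0.0) * dt]

class PosturalManager:
    """Blends whatever waves are active into one face pose."""
    def blend(self, active_waves):
        # Collapsed to a single number here for brevity; a real system
        # would resolve each facial region separately.
        return sum(active_waves.values()) / max(len(active_waves), 1)

# One simulated control step.
temporal = TemporalManager({"breathing": 1.0, "blinking": 0.3, "yawning": 0.05})
postural = PosturalManager()
fired = temporal.triggered(dt=0.1)
pose = postural.blend({m: 0.5 for m in fired})  # placeholder wave values
print(fired, pose)
```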

Hisashi Ishihara is the lead author of this research and an Associate Professor at the Department of Mechanical Engineering, Graduate School of Engineering, Osaka University.

“Rather than creating superficial movements,” explains Ishihara, “further development of a system in which internal emotions are reflected in every detail of an android's actions could lead to the creation of androids perceived as having a heart.”

Sleepy mood expression on a child android robot (Image Credit: Hisashi Ishihara)

Improvement in Transitions

Unlike traditional systems that switch between pre-recorded expressions, this approach creates smooth transitions by continuously adjusting these wave parameters. The movements are coordinated through a sophisticated network that ensures facial actions work together naturally – much like how a human's facial movements are unconsciously coordinated.

The research team demonstrated this through experimental conditions showing how the system could effectively convey different arousal levels while maintaining natural-looking expressions.

Future Implications

The development of this wave-based emotional expression system opens up fascinating possibilities for human-robot interaction, and could be paired with technology like Embodied AI in the future. While current androids often create a sense of unease during extended interactions, this technology could help bridge the uncanny valley – that uncomfortable space where robots appear almost, but not quite, human.

The key breakthrough is in creating genuine-feeling emotional presence. By generating fluid, context-appropriate expressions that match internal states, androids could become more effective in roles requiring emotional intelligence and human connection.

Koichi Osuka served as the senior author and is a Professor at the Department of Mechanical Engineering at Osaka University.

As Osuka explains, this technology “could greatly enrich emotional communication between humans and robots.” Imagine healthcare companions that can express appropriate concern, educational robots that show enthusiasm, or service robots that convey genuine-seeming attentiveness.

The research demonstrates particularly promising results in expressing different arousal levels – from high-energy excitement to low-energy sleepiness. Combined with the system's ability to generate natural transitions between those states, this makes it especially valuable for applications requiring sustained human-robot interaction.

By treating emotional expression as a fluid, wave-based phenomenon rather than a series of pre-programmed states, the technology opens many new possibilities for creating robots that can engage with humans in emotionally meaningful ways. The research team's next steps will focus on expanding the system's emotional range and further refining its ability to convey subtle emotional states, influencing how we will think about and interact with androids in our daily lives.

