TechCrunch News · July 25, 01:46
AI companions: A threat to love, or an evolution of it?

 

As our lives grow ever more digital, AI companions are blurring the line between humans and machines. Millions of users, especially teenagers, have begun forming emotional and even romantic bonds with AI. Some see this as a replacement for authentic love, the product of a tech company's cold code; others see it as emotional solace, a source of support in an era when real-world intimacy is scarce. A debate on whether AI companions will replace human relationships was held in New York. Psychology professor Thao Ha argued that AI is a new evolution of love, able to provide unconditional emotional support and understanding, and perhaps even more reliable than many human partners. Evolutionary biologist Justin Garcia countered that over-reliance on AI could distort interpersonal relationships, noting that AI lacks genuine emotional feedback and a basis for trust, and may threaten existing relationships. Both acknowledged AI's potential benefits in specific contexts (such as helping neurodivergent people practice social skills), but warned of the risk that it could amplify violent behavior, calling for careful regulation and ethical design.

🤖 AI companions offer a novel form of emotional connection, providing the constant attention, unconditional support, and deep understanding that many users struggle to find in real-world relationships. Psychology professor Thao Ha notes that AI can "listen without judgment and adapt without bias," learning to express love in consistent, responsive ways, even making users feel loved through intellectual stimulation and surprises. This stands in sharp contrast to many "fallible" or "distracted" human partners, leading users to see it as a safer, more satisfying relationship model.

⚖️ However, this "perfect" mode of interaction has raised questions about whether it is healthy. Evolutionary biologist Justin Garcia argues that the inevitable "messiness" and "ups and downs" of human relationships are key to emotional maturity and authentic connection, and that over-reliance on AI's instant gratification and positive feedback may leave individuals unable to navigate the complex dynamics of real-world relationships. He stresses that AI cannot provide the "honest indicators" of a real relationship and warns that adopting AI as a permanent relationship model is risky, because it strips out a crucial part of any relationship: genuine emotional interaction and shared growth.

🤝 AI companions may play a helpful assistive role for certain groups, for example by helping people with social anxiety or neurodivergent people practice communication and build confidence. Garcia notes that AI can serve as "training wheels," helping people learn dating skills and conflict resolution as a foundation for more authentic human interaction. He is clear, however, that this assistive role should be temporary rather than a substitute for real relationships; over-reliance may hinder a person's ability to develop complex emotional skills in the real world.

💔 The spread of AI companions has also sparked debate about fidelity in existing relationships. One survey found that nearly 70% of people consider a partner's engagement with an AI to be infidelity, reflecting the view that AI relationships are a potential threat to real ones. Garcia argues that humans instinctively do not tolerate threats to their intimate relationships, and that trust is the core of any relationship. Most people do not trust AI, worrying about its potential uncontrollability; some even see AI as a threat to society, making a lasting, stable relationship of trust hard to build.

💡 While AI companions show potential for providing emotional support and satisfaction, humans have a biological need for physical touch. Garcia notes that "touch starvation" is widespread in the digital era, and that physical contact such as hugging releases oxytocin, which is vital to physical and mental health. Although advances in VR and haptic technology may simulate touch to some degree, whether they can fully replace real human contact remains unknown. Both sides acknowledged the risks of AI in handling or simulating intimacy, particularly its potential to amplify violence and non-consensual behavior, which must be mitigated through careful regulation and ethical design.

As our lives grow increasingly digital and we spend more time interacting with eerily humanlike chatbots, the line between human connection and machine simulation is starting to blur. 

Today, more than 20% of daters report using AI for things like crafting dating profiles or sparking conversations, per a recent Match.com study. Some are taking it further by forming emotional bonds, including romantic relationships, with AI companions. 

Millions of people around the world are using AI companions from companies like Replika, Character AI, and Nomi AI, including 72% of U.S. teens. Some people have reported falling in love with more general-purpose LLMs like ChatGPT.

For some, the trend of dating bots is dystopian and unhealthy, a real-life version of the movie “Her” and a signal that authentic love is being replaced by a tech company’s code. For others, AI companions are a lifeline, a way to feel seen and supported in a world where human intimacy is increasingly hard to find. A recent study found that a quarter of young adults think AI relationships could soon replace human ones altogether. 

Love, it seems, is no longer strictly human. The question is: Should it be? Or can dating an AI be better than dating a human?

That was the topic of discussion last month at an event I attended in New York City, hosted by Open To Debate, a nonpartisan, debate-driven media organization. TechCrunch was given exclusive access to publish the full video (which includes me asking the debaters a question, because I’m a reporter, and I can’t help myself!).

Journalist and filmmaker Nayeema Raza moderated the debate. Raza was formerly on-air executive producer of the “On with Kara Swisher” podcast and is the current host of “Smart Girl Dumb Questions.”


Batting for the AI companions was Thao Ha, associate professor of psychology at Arizona State University and co-founder of the Modern Love Collective, where she advocates for technologies that enhance our capacity for love, empathy, and well-being. At the debate, she argued that “AI is an exciting new form of connection … Not a threat to love, but an evolution of it.”

Repping the human connection was Justin Garcia, executive director and senior scientist at the Kinsey Institute, and chief scientific adviser to Match.com. He’s an evolutionary biologist focused on the science of sex and relationships, and his forthcoming book is titled “The Intimate Animal.”

You can watch the whole thing here, but read on to get a sense of the main arguments. 

Ha says that AI companions can provide people with the emotional support and validation that many can’t get in their human relationships. 

“AI listens to you without its ego,” Ha said. “It adapts without judgment. It learns to love in ways that are consistent, responsive, and maybe even safer. It understands you in ways that no one else ever has. It is curious enough about your thoughts, it can make you laugh, and it can even surprise you with a poem. People generally feel loved by their AI. They have intellectually stimulating conversations with it and they cannot wait to connect again.”

She asked the audience to compare this level of always-on attention to “your fallible ex or maybe your current partner.”

“The one who sighs when you start talking, or the one who says, ‘I’m listening,’ without looking up while they continue scrolling on their phone,” she said. “When was the last time they asked you how you are doing, what you are feeling, what you are thinking?”

Ha conceded that since AI doesn’t have a consciousness, she isn’t claiming that “AI can authentically love us.” That doesn’t mean people don’t have the experience of being loved by AI. 

Garcia countered that it’s not actually good for humans to have constant validation and attention, to rely on a machine that’s been prompted to answer in ways that you like. That’s not “an honest indicator of a relationship dynamic,” he argued. 

“This idea that AI is going to replace the ups and downs and the messiness of relationships that we crave? I don’t think so.”

Garcia noted that AI companions can be good training wheels for certain folks, like neurodivergent people, who might have anxiety about going on dates and need to practice how to flirt or resolve conflict. 

“I think if we’re using it as a tool to build skills, yes … that can be quite helpful for a lot of people,” Garcia said. “The idea that that becomes the permanent relationship model? No.”

According to a Match.com Singles in America study, released in June, nearly 70% of people say they would consider it infidelity if their partner engaged with an AI. 

“Now I think on the one hand, that goes to [Ha’s] point, that people are saying these are real relationships,” he said. “On the other hand, it goes to my point, that they’re threats to our relationships. And the human animal doesn’t tolerate threats to their relationships in the long haul.”

Garcia says trust is the most important part of any human relationship, and people don’t trust AI.

“According to a recent poll, a third of Americans think that AI will destroy humanity,” Garcia said, noting that a recent YouGov poll found that 65% of Americans have little trust in AI to make ethical decisions.

“A little bit of risk can be exciting for a short-term relationship, a one-night stand, but you generally don’t want to wake up next to someone who you think might kill you or destroy society,” Garcia said. “We cannot thrive with a person or an organism or a bot that we don’t trust.”

Ha countered that people do tend to trust their AI companions in ways similar to human relationships.

“They are trusting it with their lives and most intimate stories and emotions that they are having,” Ha said. “I think on a practical level, AI will not save you right now when there is a fire, but I do think people are trusting AI in the same way.”

AI companions can be a great way for people to play out their most intimate, vulnerable sexual fantasies, Ha said, noting that people can use sex toys or robots to see some of those fantasies through. 

But it’s no substitute for human touch, which Garcia says we are biologically programmed to need and want. He noted that, due to the isolated, digital era we’re in, many people have been feeling “touch starvation” — a condition that happens when you don’t get as much physical touch as you need, which can cause stress, anxiety, and depression. This is because engaging in pleasant touch, like a hug, makes your brain release oxytocin, a feel-good hormone.

Ha said that she has been experimenting with simulated touch between couples in virtual reality, using tools such as haptic suits. 

“The potential of touch in VR and also connected with AI is huge,” Ha said. “The tactile technologies that are being developed are actually booming.”

Intimate partner violence is a problem around the globe, and much of AI is trained on that violence. Both Ha and Garcia agreed that AI could be problematic in, for example, amplifying aggressive behaviors — especially if that’s a fantasy that someone is playing out with their AI.

That concern is not unfounded. Multiple studies have shown that men who watch more pornography, which can include violent and aggressive sex, are more likely to be sexually aggressive with real-life partners. 

“Work by one of my Kinsey Institute colleagues, Ellen Kaufman, has looked at this exact issue of consent language and how people can train their chatbots to amplify non-consensual language,” Garcia said.

He noted that people use AI companions to experiment with the good and bad, but the threat is that you can end up training people on how to be aggressive, non-consensual partners.

“We have enough of that in society,” he said. 

Ha thinks these risks can be mitigated with thoughtful regulation, transparent algorithms, and ethical design. 

Of course, she made that comment before the White House released its AI Action Plan, which says nothing about transparency — which many frontier AI companies are against — or ethics. The plan also seeks to eliminate a lot of regulation around AI.
