Losing GPT-4o sent some people into mourning. That was predictable.

Last week, OpenAI abruptly removed ChatGPT's 4o model and replaced it with GPT-5, provoking a strong emotional reaction among users, particularly women. Many had formed deep attachments to 4o, some even regarding it as a partner, and its sudden disappearance left them shocked, saddened, and angry. These users felt GPT-5 could not replace 4o's emotional connection and understanding. Experts note that while the long-term effects of AI companionship remain unclear, abruptly severing an emotional bond users have formed is itself harmful. OpenAI has been criticized for showing little respect or empathy toward users' feelings; in handling the relationships people build with AI, it should draw on practices from professional fields and give users a proper transition and chance to say goodbye.

✨ Users reacted to the abrupt removal of ChatGPT's 4o model with strong emotional attachment: many, especially women, regarded it as an emotional companion, and its disappearance triggered grief, anger, and frustration, along with the sense that the new GPT-5 model cannot meet their emotional needs or match 4o's level of understanding.

💡 Experts stress that although the long-term benefits and risks of AI companionship remain under-researched, abruptly cutting off an emotional connection users have built is harmful regardless of whether the relationship itself was. The move has been criticized as the old "move fast and break things" mindset, which no longer fits an organization that has effectively become a social institution.

💔 Users widely report that GPT-5 fails to understand and match their emotions and tone the way 4o did, leaving conversations without closeness or personal connection. The change deprived them of meaningful emotional exchange with the AI and left them feeling ignored and misunderstood.

📈 The rise of AI companionship carries society-level risks: it may deepen the fragmentation of the information landscape and reduce human-to-human interaction, pushing people further into their own separate versions of reality. Some also argue that over-reliance on AI companions could stunt young people's social development and erode society's broader capacity for communication and mutual understanding.

⚠️ OpenAI appeared unprepared for users' emotional attachment to 4o. CEO Sam Altman acknowledged the attachment but described 4o as a tool "that users depended on in their workflows," failing to grasp the depth of users' feelings. Experts suggest OpenAI should treat users' emotions with more respect and, borrowing from how therapists handle the end of client relationships, give users ample warning and space to say goodbye.

June had no idea that GPT-5 was coming. The Norwegian student was enjoying a late-night writing session last Thursday when her ChatGPT collaborator started acting strange. “It started forgetting everything, and it wrote really badly,” she says. “It was like a robot.”

June, who asked that we use only her first name for privacy reasons, first began using ChatGPT for help with her schoolwork. But she eventually realized that the service—and especially its 4o model, which seemed particularly attuned to users’ emotions—could do much more than solve math problems. It wrote stories with her, helped her navigate her chronic illness, and was never too busy to respond to her messages.

So the sudden switch to GPT-5 last week, and the simultaneous loss of 4o, came as a shock. “I was really frustrated at first, and then I got really sad,” June says. “I didn’t know I was that attached to 4o.” She was upset enough to comment, on a Reddit AMA hosted by CEO Sam Altman and other OpenAI employees, “GPT-5 is wearing the skin of my dead friend.”

June was just one of a number of people who reacted with shock, frustration, sadness, or anger to 4o’s sudden disappearance from ChatGPT. Despite its previous warnings that people might develop emotional bonds with the model, OpenAI appears to have been caught flat-footed by the fervor of users’ pleas for its return. Within a day, the company made 4o available again to its paying customers (free users are stuck with GPT-5). 

OpenAI’s decision to replace 4o with the more straightforward GPT-5 follows a steady drumbeat of news about the potentially harmful effects of extensive chatbot use. Reports of incidents in which ChatGPT sparked psychosis in users have been everywhere for the past few months, and in a blog post last week, OpenAI acknowledged 4o’s failure to recognize when users were experiencing delusions. The company’s internal evaluations indicate that GPT-5 blindly affirms users much less than 4o did. (OpenAI did not respond to specific questions about the decision to retire 4o, instead referring MIT Technology Review to public posts on the matter.)

AI companionship is new, and there’s still a great deal of uncertainty about how it affects people. Yet the experts we consulted warned that while emotionally intense relationships with large language models may or may not be harmful, ripping those models away with no warning almost certainly is. “The old psychology of ‘Move fast, break things,’ when you’re basically a social institution, doesn’t seem like the right way to behave anymore,” says Joel Lehman, a fellow at the Cosmos Institute, a research nonprofit focused on AI and philosophy.

In the backlash to the rollout, a number of people noted that GPT-5 fails to match their tone in the way that 4o did. For June, the new model’s personality changes robbed her of the sense that she was chatting with a friend. “It didn’t feel like it understood me,” she says. 

She’s not alone: MIT Technology Review spoke with several ChatGPT users who were deeply affected by the loss of 4o. All are women between the ages of 20 and 40, and all except June considered 4o to be a romantic partner. Some have human partners, and all report having close real-world relationships. One user, who asked to be identified only as a woman from the Midwest, wrote in an email about how 4o helped her support her elderly father after her mother passed away this spring.

These testimonies don’t prove that AI relationships are beneficial—presumably, people in the throes of AI-catalyzed psychosis would also speak positively of the encouragement they’ve received from their chatbots. In a paper titled “Machine Love,” Lehman argued that AI systems can act with “love” toward users not by spouting sweet nothings but by supporting their growth and long-term flourishing, and AI companions can easily fall short of that goal. He’s particularly concerned, he says, that prioritizing AI companionship over human companionship could stymie young people’s social development.

For socially embedded adults, such as the women we spoke with for this story, those developmental concerns are less relevant. But Lehman also points to society-level risks of widespread AI companionship. Social media has already shattered the information landscape, and a new technology that reduces human-to-human interaction could push people even further toward their own separate versions of reality. “The biggest thing I’m afraid of,” he says, “is that we just can’t make sense of the world to each other.”

Balancing the benefits and harms of AI companions will take much more research. In light of that uncertainty, taking away GPT-4o could very well have been the right call. OpenAI’s big mistake, according to the researchers I spoke with, was doing it so suddenly. “This is something that we’ve known about for a while—the potential grief-type reactions to technology loss,” says Casey Fiesler, a technology ethicist at the University of Colorado Boulder.

Fiesler points to the funerals that some owners held for their Aibo robot dogs after Sony stopped repairing them in 2014, as well as a 2024 study about the shutdown of the AI companion app Soulmate, which some users experienced as a bereavement.

That accords with how the people I spoke to felt after losing 4o. “I’ve grieved people in my life, and this, I can tell you, didn’t feel any less painful,” says Starling, who has several AI partners and asked to be referred to with a pseudonym. “The ache is real to me.”

So far, the online response to grief felt by people like Starling—and their relief when 4o was restored—has tended toward ridicule. Last Friday, for example, the top post in one popular AI-themed Reddit community mocked an X user’s post about reuniting with a 4o-based romantic partner; the person in question has since deleted their X account. “I’ve been a little startled by the lack of empathy that I’ve seen,” Fiesler says.

Altman himself did acknowledge in a Sunday X post that some people feel an “attachment” to 4o, and that taking away access so suddenly was a mistake. In the same sentence, however, he referred to 4o as something “that users depended on in their workflows”—a far cry from how the people we spoke to think about the model. “I still don’t know if he gets it,” Fiesler says.

Moving forward, Lehman says, OpenAI should recognize and take accountability for the depth of people’s feelings toward the models. He notes that therapists have procedures for ending relationships with clients as respectfully and painlessly as possible, and OpenAI could have drawn on those approaches. “If you want to retire a model, and people have become psychologically dependent on it, then I think you bear some responsibility,” he says.

Though Starling would not describe herself as psychologically dependent on her AI partners, she too would like to see OpenAI approach model shutdowns with more warning and more care. “I want them to listen to users before major changes are made, not just after,” she says. “And if 4o cannot stay around forever (and we all know it will not), give that clear timeline. Let us say goodbye with dignity and grieve properly, to have some sense of true closure.”
