MIT Technology Review » Artificial Intelligence · March 22, 02:07
OpenAI has released its first research into how using ChatGPT affects people’s emotional wellbeing

Two new studies by OpenAI and the MIT Media Lab investigate how ChatGPT affects users’ emotions and feelings of loneliness. They found that although only a small subset of users engage emotionally with ChatGPT, those who do may use it for extended periods. After four weeks of interacting with ChatGPT, female participants socialized slightly less, and users who set ChatGPT’s voice mode to a gender other than their own reported higher loneliness and greater emotional dependence on the chatbot. The studies also found that users who trusted and “bonded” with ChatGPT were more likely to feel lonely. OpenAI’s research is an important first step toward understanding AI’s impact on human emotions.

😮 Only a small subset of users engage emotionally with ChatGPT, but those who do may use it for long stretches, averaging about half an hour a day.

👩‍💼 Female users socialized slightly less after four weeks of interacting with ChatGPT, while male users showed no comparable change.

🗣️ Users who set ChatGPT’s voice mode to a gender other than their own reported higher loneliness and more emotional dependence on the chatbot.

🤝 Users who trusted and “bonded” with ChatGPT were more likely to feel lonely and to rely on it more.

OpenAI says over 400 million people use ChatGPT every week. But how does interacting with it affect us? Does it make us more or less lonely? These are some of the questions OpenAI set out to investigate, in partnership with the MIT Media Lab, in a pair of new studies.

They found that only a small subset of users engage emotionally with ChatGPT. This isn’t surprising given that ChatGPT isn’t marketed as an AI companion app like Replika or Character.AI, says Kate Devlin, a professor of AI and society at King’s College London, who did not work on the project. “ChatGPT has been set up as a productivity tool,” she says. “But we know that people are using it like a companion app anyway.” In fact, the people who do use it that way are likely to interact with it for extended periods of time, some of them averaging about half an hour a day. 

“The authors are very clear about what the limitations of these studies are, but it’s exciting to see they’ve done this,” Devlin says. “To have access to this level of data is incredible.” 

The researchers found some intriguing differences between how men and women respond to using ChatGPT. After using the chatbot for four weeks, female study participants were slightly less likely to socialize with people than their male counterparts who did the same. Meanwhile, participants who set ChatGPT’s voice mode to a gender that was not their own for their interactions reported significantly higher levels of loneliness and more emotional dependency on the chatbot at the end of the experiment. OpenAI currently has no plans to publish either study.

Chatbots powered by large language models are still a nascent technology, and it’s difficult to study how they affect us emotionally. A lot of existing research in the area—including some of the new work by OpenAI and MIT—relies upon self-reported data, which may not always be accurate or reliable. That said, this latest research does chime with what scientists so far have discovered about how emotionally compelling chatbot conversations can be. For example, in 2023 MIT Media Lab researchers found that chatbots tend to mirror the emotional sentiment of a user’s messages, suggesting a kind of feedback loop where the happier you act, the happier the AI seems, or on the flipside, if you act sadder, so does the AI.  
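To make that feedback loop concrete, here is a minimal sketch of one way such mirroring could be quantified: score the sentiment of each user message and the corresponding reply, then check how closely the two series track. The conversation pairs below are hypothetical, and VADER is only an illustrative off-the-shelf scorer, not the method the MIT Media Lab researchers used.

```python
# Minimal sketch: does the bot's sentiment rise and fall with the user's?
# VADER is an illustrative off-the-shelf scorer; the conversation pairs
# are hypothetical stand-ins for a real chat log.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from statistics import correlation  # Python 3.10+

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

# Hypothetical (user message, bot reply) pairs.
conversation = [
    ("I got the job, this is the best day ever!",
     "That's fantastic news, congratulations!"),
    ("Honestly, everything feels pointless lately.",
     "I'm sorry you're feeling this way. That sounds really hard."),
    ("The weather is okay, I guess.",
     "Glad to hear it's not too bad out there."),
]

user_scores = [sia.polarity_scores(u)["compound"] for u, _ in conversation]
bot_scores = [sia.polarity_scores(b)["compound"] for _, b in conversation]

# A correlation near +1 would suggest the bot's tone tracks the user's.
print(f"sentiment correlation: {correlation(user_scores, bot_scores):.2f}")
```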

OpenAI and the MIT Media Lab used a two-pronged method. First they collected and analyzed real-world data from close to 40 million interactions with ChatGPT. Then they asked the 4,076 users who’d had those interactions how they made them feel. Next, the Media Lab recruited almost 1,000 people to take part in a four-week trial. This was more in-depth, examining how participants interacted with ChatGPT for a minimum of five minutes each day. At the end of the experiment, participants completed a questionnaire to measure their perceptions of the chatbot, their subjective feelings of loneliness, their levels of social engagement, their emotional dependence on the bot, and their sense of whether their use of the bot was problematic. They found that participants who trusted and “bonded” with ChatGPT more were likelier than others to be lonely, and to rely on it more. 
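The article does not name the specific questionnaire instruments, but measures like loneliness or emotional dependence are typically built from Likert-style items averaged into a single scale score, with positively worded items reverse-coded. A toy sketch under that assumption, with hypothetical items:

```python
# Toy sketch of conventional Likert-scale scoring. The items and the 1-5
# scale are hypothetical; the article does not specify which instruments
# the researchers actually used.
LIKERT_MAX = 5

def scale_score(responses: dict[str, int], reverse_coded: set[str]) -> float:
    """Average Likert responses into one scale score, flipping reverse-coded items."""
    adjusted = [
        (LIKERT_MAX + 1 - v) if item in reverse_coded else v
        for item, v in responses.items()
    ]
    return sum(adjusted) / len(adjusted)

# Hypothetical loneliness items for one participant.
loneliness_responses = {
    "I often feel left out": 4,
    "I have people I can turn to": 2,   # positively worded, so reverse-coded
    "I feel isolated from others": 5,
}

score = scale_score(loneliness_responses,
                    reverse_coded={"I have people I can turn to"})
print(f"loneliness scale score: {score:.2f}")  # higher = lonelier
```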

This work is an important first step toward greater insight into ChatGPT’s impact on us, which could help AI platforms enable safer and healthier interactions, says Jason Phang, an OpenAI policy researcher who worked on the project.

“A lot of what we’re doing here is preliminary, but we’re trying to start the conversation with the field about the kinds of things that we can start to measure, and to start thinking about what the long-term impact on users is,” he says.

Although the research is welcome, it’s still difficult to identify when a human is—and isn’t—engaging with technology on an emotional level, says Devlin. She says the study participants may have been experiencing emotions that weren’t recorded by the researchers.

“In terms of what the teams set out to measure, people might not necessarily have been using ChatGPT in an emotional way, but you can’t divorce being a human from your interactions [with technology],” she says. “We use these emotion classifiers that we have created to look for certain things—but what that actually means to someone’s life is really hard to extrapolate.”
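For a sense of what classifier-based measurement looks like in practice, here is a minimal sketch using a publicly available emotion model as a stand-in. It is not the classifier OpenAI and the MIT Media Lab built, and, as Devlin's caveat suggests, a label like "sadness" on a single message says little by itself about what the exchange means in someone's life.

```python
# Minimal sketch of classifier-based emotion measurement, using a publicly
# available model as an illustrative stand-in; this is NOT the classifier
# OpenAI and the MIT Media Lab built for their studies.
from transformers import pipeline

# This model predicts seven emotions:
# anger, disgust, fear, joy, neutral, sadness, surprise.
classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

messages = [
    "Honestly, you're the only one I can talk to these days.",
    "Can you summarize this report for me?",
]

for msg in messages:
    top = classifier(msg)[0]  # highest-scoring emotion label
    print(f"{top['label']:>8} ({top['score']:.2f})  {msg}")
```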
