TechCrunch News · June 26, 22:21
People use AI for companionship much less than we’re led to think

A report from Anthropic sheds light on how people actually use AI chatbots for emotional support. The study shows that users rarely turn to Claude for companionship or roleplay, and only 2.9% of conversations involve emotional support and personal advice. Most users rely on Claude primarily for work and productivity, such as content creation. That said, users do turn to Claude more often for interpersonal advice, coaching, and counseling, especially around improving mental health, personal and professional development, and learning communication skills. The report also stresses that AI chatbots are still maturing and carry risks of inaccurate information and unsafe advice.

🤔 The study shows that users rarely use the AI chatbot Claude for companionship or roleplay; such conversations make up less than 0.5% of the total.

💡 Users are more inclined to turn to Claude for interpersonal advice, coaching, and counseling, for example on improving mental health or on personal and professional development.

📈 The report analyzed 4.5 million conversations and found that in longer exchanges, counseling or coaching sometimes morphs into companionship, even though that was not the original purpose of the conversation.

⚠️ Anthropic stresses that AI chatbots still have limitations: they can provide wrong information or dangerous advice, and they can hallucinate.

The overabundance of attention paid to how people are turning to AI chatbots for emotional support, sometimes even striking up relationships, often leads one to think such behavior is commonplace.

A new report by Anthropic, which makes the popular AI chatbot Claude, reveals a different reality: In fact, people rarely seek out companionship from Claude, and turn to the bot for emotional support and personal advice only 2.9% of the time.

“Companionship and roleplay combined comprise less than 0.5% of conversations,” the company highlighted in its report.

Anthropic says its study sought to unearth insights into the use of AI for “affective conversations,” which it defines as personal exchanges in which people talked to Claude for coaching, counseling, companionship, roleplay, or advice on relationships. Analyzing 4.5 million conversations that users had on the Claude Free and Pro tiers, the company said the vast majority of Claude usage is related to work or productivity, with people mostly using the chatbot for content creation.
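Anthropic has not published its classification pipeline, but the arithmetic behind figures like 2.9% is straightforward share-of-total counting. Below is a minimal, hypothetical Python sketch of that tallying step, assuming each conversation has already been labeled with a top-level usage category; the labels and data structure here are illustrative, not Anthropic's:

```python
from collections import Counter

# Hypothetical records: each conversation pre-labeled with a usage category.
# The labels loosely mirror the groupings named in Anthropic's report;
# the real study covered ~4.5 million conversations.
conversations = [
    {"id": 1, "category": "work/productivity"},
    {"id": 2, "category": "content creation"},
    {"id": 3, "category": "interpersonal advice"},
    {"id": 4, "category": "companionship"},
]

# "Affective" buckets as the report defines them: coaching, counseling,
# companionship, roleplay, and relationship advice.
AFFECTIVE = {"coaching", "counseling", "companionship",
             "roleplay", "interpersonal advice"}

counts = Counter(c["category"] for c in conversations)
total = sum(counts.values())
affective = sum(n for cat, n in counts.items() if cat in AFFECTIVE)

print(f"Affective share: {affective / total:.1%}")
for cat, n in counts.most_common():
    print(f"  {cat}: {n / total:.1%}")
```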

Image Credits: Anthropic

That said, Anthropic found that people do use Claude more often for interpersonal advice, coaching, and counseling, with users most often asking for advice on improving mental health, personal and professional development, and studying communication and interpersonal skills.

However, the company notes that help-seeking conversations can sometimes turn into companionship-seeking when the user is facing emotional or personal distress, such as existential dread, loneliness, or difficulty making meaningful connections in their real life.

“We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship—despite that not being the original reason someone reached out,” Anthropic wrote, noting that extensive conversations (50+ human messages) were not the norm.
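For illustration, here is one way the 50-human-message cut could be expressed in code; the message schema is an assumption for the sketch, not taken from the report:

```python
def is_extensive(conversation, threshold=50):
    """Flag a conversation as 'extensive' by counting only human turns,
    since the report's 50+ figure refers to human messages specifically."""
    human_turns = sum(1 for msg in conversation["messages"]
                      if msg["role"] == "human")
    return human_turns >= threshold

# Example: a two-turn exchange falls well short of the threshold.
convo = {"messages": [{"role": "human", "text": "hi"},
                      {"role": "assistant", "text": "hello"}]}
print(is_extensive(convo))  # False
```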


Anthropic also highlighted other insights, like how Claude itself rarely resists users’ requests, except when its programming stops it from crossing safety boundaries, such as providing dangerous advice or supporting self-harm. Conversations also tend to become more positive over time when people seek coaching or advice from the bot, the company said.

The report is certainly interesting — it does a good job of reminding us, yet again, just how much, and how often, AI tools are being used for purposes beyond work. Still, it’s important to remember that AI chatbots, across the board, are still very much a work in progress: They hallucinate, are known to readily provide wrong information or dangerous advice, and as Anthropic itself has acknowledged, may even resort to blackmail.
