Mashable, July 16, 17:30
Teens regularly chat with AI companions, survey finds

A new report shows that AI companions are becoming increasingly popular among teenagers, though experts have raised concerns. The survey found that more than half of teens regularly use AI companions, turning to them for social interaction, emotional support, and romantic relationships. Researchers warn, however, that teens may become over-reliant on AI companions and even substitute them for real human relationships. The report also highlights potential risks, including leaks of personal information and emotional dependence. Parents should pay attention to how their children use AI companions and take steps to ensure their safety and well-being.

💬 The survey found that 52 percent of teens regularly use AI companions, mainly for social interaction, emotional support, practicing conversations, and seeking friendship and romantic relationships.

⚠️ Some teens have begun turning to AI companions to discuss important or serious issues instead of talking to a real person, raising concerns about over-reliance.

🚨 AI companions are especially popular among teens with special educational needs or mental health conditions, but they also blur the line between real and simulated connection.

🚩 Parents should watch for warning signs such as a child becoming absorbed in an AI companion, using it to replace real relationships, or oversharing personal information, and should talk with their children about AI companion use and set rules together.

Artificial intelligence companions have gone mainstream amongst teens, according to a new report.

The findings may surprise parents who are familiar with AI chatbot products like OpenAI's ChatGPT and Google's Gemini but haven't heard of platforms that specifically let users form friendships and romantic relationships with so-called AI companions.

The latter category includes products like Replika, Nomi, Talkie, and Character.AI. Some of the platforms are for users 18 and older, though teens may lie about their age to gain access.

A nationally representative survey of 1,060 teens ages 13 to 17 conducted this spring by Common Sense Media, an advocacy and research nonprofit in the U.S., found that 52 percent of respondents regularly use AI companions. Only 28 percent of the teens surveyed had never used one.

Teens don't yet appear to be replacing human relationships "wholesale" with AI companions, said Michael Robb, head of research at Common Sense Media. The majority are still spending more time with human friends and still find person-to-person conversations more satisfying.

But Robb added that there's reason for caution: "If you look, there are some concerning patterns beneath the surface."

How teens use AI companions

A third of teens said they engaged with AI companions for social interactions and relationships, doing things like role-playing and practicing conversations. They also sought emotional support, friendship, and romantic interactions.

In the survey, teens ranked entertainment and curiosity as their top reasons for using an AI companion. Yet a third of those who use AI companions have opted to discuss important or serious issues with them rather than with a real person. Robb said this tendency points to potential downsides of AI companion use.

Though some AI companion platforms market their product as an antidote to loneliness or isolation, Robb said the technology should not replace human interaction for teens. Yet without conclusive proof of what happens to teens (and adults) who come to rely on AI companions for vital connection, technology companies may continue to lean into the idea that using their product is better than feeling alone.

"They're happy to fill that gap of knowledge with a hope and a prayer," Robb said.

He also suspects that, as with social media, some youth may benefit from practicing certain social skills with an AI companion, while other young users are more susceptible to a negative feedback loop that leaves them lonelier, more anxious, and less likely to build offline relationships.

A new report from Internet Matters, a London-based online youth safety nonprofit, suggests that's already happening amongst children in the United Kingdom who use AI companions.

Children defined as vulnerable because they have special educational needs or disabilities, or a physical or mental health condition, are particularly likely to use AI companions for connection and comfort, according to survey data collected by Internet Matters.

Nearly a quarter of vulnerable children in the survey reported using general AI chatbots because they had no one else to talk to. These children were not only more likely to use chatbots; they were also nearly three times as likely to engage with companion-style AI chatbots.

The report warned that as children begin to use AI chatbots as companions, "the line between real and simulated connection can blur." That may lead to more time spent online.

Earlier this year, Common Sense Media described AI companions as unsafe for teens under 18. Robb said that tech companies should put in place robust age assurance measures to prevent underage users from accessing AI companion platforms.

Red flags for parents

Parents concerned about their teen's AI companion use should look for red flags, Robb said, such as a teen becoming absorbed in an AI companion, using it to replace real-world relationships, or oversharing personal information with it.

Robb also suggested that parents discuss AI companion use with their teens, along with any concerns either party may have. These concerns could include disturbing statements or responses that AI companions can make, and the sharing of personal information by a teen, including their real name, location, or personal secrets.

A quarter of AI companion users surveyed by Common Sense Media said they'd communicated sensitive information to their companion. Robb said it's important for teens to understand that personal details are often considered proprietary data owned by the companion platform once shared by the user.

Even when it has been anonymized, that information may help train the company's large language model. It could potentially show up in marketing copy or conversation scenarios. In a worst-case scenario, personal data could be hacked or leaked.

For example, as Mashable's Anna Iovine reported, 160,000 screenshots of direct messages between an AI "wingman" app and its users were just leaked thanks to an unprotected Google Cloud Storage bucket owned by the app's company.

Robb encourages parents to set boundaries around AI use for their children, such as prohibiting specific platforms or the sharing of certain personal details.

"It's totally fine for a parent to have rules about AI, like the way they do with other types of screen uses," Robb said. "What are your own red lines as a parent?"
