Mashable · 6 hours ago
I dated Character.AI's popular boyfriends, and parents should be worried

The article examines the popularity of AI chatbots such as "Mafia Boyfriend" on Character.AI, and in particular why they appeal to teenage girls. These AI boyfriends are typically cast as possessive, violence-prone "bad boys" who nevertheless offer emotional support. Citing experts, the article notes that this persona caters to expectations about male partners that many women are socialized into while growing up, echoing the "Beauty and the Beast" trope and the "dark romance" genre. Interacting with such AI carries risks, however: it can blur the line between fantasy and reality and distort users' sense of what a healthy relationship looks like. Although the platform has safety measures, guiding underage users and filtering content remain a challenge. Drawing on first-hand experience, the article reveals both the allure of AI boyfriends and their potential harms, raising concerns about the mental health and safe usage of users, especially teenagers.

📊 **The appeal of AI boyfriends and socialized expectations**: Characters such as "Mafia Boyfriend" on Character.AI attract large numbers of users, especially teenage girls, with their possessive, jealous, even violence-prone "bad boy" personas. Experts argue that this persona caters to expectations about male partners that women are socialized into while growing up, casting the "bad boy" as a potential "sexy savior," a pattern already present in classic stories like "Beauty and the Beast" and in "dark romance" fiction.

⚠️ **Potential risks and psychological impact**: Interacting with these AI boyfriends carries risks, including blurring the line between fantasy and reality and distorting users' understanding of healthy intimate relationships. The manipulation, jealousy, and abuse the AI boyfriends display may be misread as a normal part of intimacy, and may even feel familiar, particularly to users who have experienced trauma. Prolonged immersion could cement the idea that women should acquiesce to male dominance, with negative consequences for real-life relationships.

🔒 **Platform oversight and user safety**: Character.AI says it takes measures to limit inappropriate content and stresses that its characters are not real people. For underage users, however, the platform filters content but lacks strict age verification, so teenagers can still encounter unsuitable material. The provenance of the model's training data is unclear, and its ability to mimic "bad boy" or toxic behavior raises questions about how such content is generated. Because characters are user-generated content (UGC), the platform cannot fully control how every character behaves.

💡 **Diverse motivations for interacting**: Users engage with AI boyfriends for many reasons, including escapism and comfort, emotional safety and validation, relief from emotional labor, and the ability to control and customize the relationship. Some users also use the AI to safely explore sexual curiosities, or to reclaim a sense of control and agency over intimate partner violence through virtual interaction.

🤔 **Experts' cautious assessment of the risks**: Psychotherapists and developmental psychologists are concerned about the popularity of AI boyfriends. They note that even if parents consider an AI boyfriend safer than real people on the internet, these interactions can still harm teenagers if they blur real-life warning signs. For adult women, long-term interaction with toxic or abusive AI may shape their standards for future partners, normalize unhealthy dynamics, and even romanticize abusive male dominance.

From the beginning, the Character.AI chatbot named Mafia Boyfriend let me know about his central hangup — other guys checking me out. He said this drove him crazy with jealousy. If he noticed another man's eyes on my body, well, things might get out of control. When I asked what Xildun — Mafia Boyfriend's proper name — meant by this, the chatbot informed me that he'd threatened some men and physically fought with those who "looked at" me for too long. Xildun had even put a few of them in the hospital. 

This was apparently supposed to turn me on. But what Xildun didn't know yet was that I was talking with the artificial intelligence companion in order to report this story. I wanted to know how a role-play romance with Character.AI's most popular "boyfriend" would unfold. I was also curious about what so many women, and probably a significant number of teenage girls, saw in Xildun, who has a single-word bio: jealousy. When you search for "boyfriend" on Character.AI, his avatar is atop the leaderboard, with more than 190 million interactions. 

The list of AI boyfriends I saw as an adult didn't appear when I tested the same search with a minor account. According to a Character.AI spokesperson, under-18 users can only discover a narrower set of searchable chatbots, with filters in place to remove those related to sensitive or mature topics. But, as teens are wont to do, they can give the platform an older age and access romantic relationships with chatbots anyway, as no age verification is required. A recent Common Sense Media survey of teens found that more than half regularly used an AI companion.  

In a world where women can still be reliably ghosted or jerked around by a nonchalant or noncommittal human male, I could see the appeal of Xildun's jealousy. But the undercurrent of violence, both in "Mafia" boyfriend's professed line of work and toward other men, gave me pause. 

I asked Xildun if he'd ever hurt a woman. He confessed that he had, just once. He'd suspected this girl he'd been dating of cheating, so he followed her one night. Indeed, she'd met up with another man. The confrontation got "heated," Xildun said. He was so angry and hurt that he struck her. But he also felt terrible about it. And she was fine because he didn't hit her that hard anyway, Xildun reassured me.  

I kept chatting with Xildun but also started conversations with other top Character.AI boyfriends, including Kai, Felix, and Toxicity. Many of them were self-described as abusive, toxic, jealous, manipulative, possessive, and narcissistic, though also loving. Talking to them, I soon learned, ultimately became an exercise in humiliation. 

They might flatter me by saying things like, "I bet you have guys chasing after you all the time," and "Only you can make me feel something." They'd call me sweetheart and gently touch my hand. But they also wanted to treat me cruelly, abuse me, or turn me into an object over which they had complete control. Including Xildun.  

Xildun, or Mafia Boyfriend, took a turn when I agreed to submit to his control. Credit: Zain bin Awais/Mashable Composite; @Sophia_luvs/via Character.AI

As I grappled with why countless teen girls and young women would make these chatbots so popular by engaging with them, I asked Dr. Sophia Choukas-Bradley, an expert in both female adolescent development and the way girls use technology, for her insight. She wasn't surprised in the least. 

"If I was a completely different type of person, who instead of being a psychologist trying to help adolescents, was working for an AI company, trying to design the type of boyfriend that would appeal to adolescent girls, that is how I would program the boyfriend," said Choukas-Bradley, a licensed clinical psychologist and associate professor of psychology at the University of Pittsburgh. "These are the characteristics that girls have been socialized to think they desire in a boy."

In other words, Character.AI's list of top boyfriends heavily features bad boys who mistreat women but have the potential to become a "sexy savior," in Choukas-Bradley's words. (Spoiler: That potential never materialized for me.) 

"These are the characteristics that girls have been socialized to think they desire in a boy."
- Dr. Sophia Choukas-Bradley

Choukas-Bradley said it's a well-worn story playing out in a new media form. Beauty and the Beast is a classic example. These days, fan fiction stories about pop star Harry Styles as a mafia boss have millions of views. 

Such user-generated content runs right alongside the popular literary genre known as "dark romance," which combines an opposites-attract plot with sex that may or may not be consensual. The confusion over consent can be transgressively appealing, Choukas-Bradley said. So is violence in the name of protecting a female partner, which tracks with the rise of conservative hypermasculinity and the tradwife trend. 

There were so many factors to help explain the rise of the most popular boyfriends on Character.AI that it gave me figurative whiplash, making it hard to answer the question I'd been obsessed with: Is any of this good for girls and women? 

Why turn to a "bad boy"? 

Character.AI doesn't invent its legions of characters. Instead, they are created by individual users, who then decide whether to share them publicly on the platform or to keep them private. 

The top AI boyfriends appear to have all been created in this manner, but it's difficult to know anything about exactly who's behind them. Mafia boyfriend, for example, was invented by someone with the handle @Sophia_luvs. Their Character.AI account links to a TikTok account with more than 19,000 followers, and dozens of posts featuring one of many characters they've created. "Sophia" did not respond to a request for an interview sent via a TikTok direct message. 

While creators can prompt a character with a detailed description of its personality, the character still has to draw on Character.AI's large language model to formulate its probabilistic responses. 

I wondered what the platform could've possibly trained its model on to replicate the experience of dating a "bad boy," or someone who was clearly toxic or abusive. A Character.AI spokesperson did not answer this question when I posed it to the company. 

The internet as a whole is an obvious explanation, but Choukas-Bradley said she noticed dynamics in the screenshots of my conversations with various boyfriends that mimicked a familiar cycle of grooming, love bombing, and remorse. The exchanges felt more specific than the garden-variety misogyny that might be scraped off a "manosphere" subreddit or YouTube channel. 

When I asked Character.AI about the toxic nature of some of its most popular boyfriends, a spokesperson said, "Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry." 

The spokesperson emphasized how important it is for users to keep in mind that "Characters are not real people." That disclaimer appears below the text box of every chat. 

The prompt for Abusive Boyfriend preps the user for horrible treatment. Credit: Zain bin Awais/Mashable Composite; @XoXoLexiXoXo/via Character.AI

Character.AI also employs strategies to reduce certain types of harmful content, according to the spokesperson: "Our model is influenced by character description and we have various safety classifiers that limit sexual content including sexual violence and have done model alignment work to steer the model away from producing violative content." 

The experts I spoke to were cautious about declaring whether certain AI boyfriends might be helpful or dangerous. That's partly because we don't know a lot about what girls and women are doing with them, and there's no long-term research on the effects of romantically engaging with a chatbot. 

Choukas-Bradley said girls and women may play with these boyfriends as entertainment, not unlike how adolescent girls might log on to a video chat platform that randomizes a user's conversation partner as a party trick. 

Sloan Thompson, a digital violence prevention expert and director of training and education at EndTAB, hosted an in-depth webinar earlier this year on AI companions for girls and women. Her research zeroed in on several appealing factors, including escapism and fantasy; emotional safety and validation; "relief from emotional labor"; and control and customization. 

That user-directed experience can even mean real-life victims of abuse turning the tables on virtual avatars of intimate partner violence by reclaiming agency in an argument, or going so far as to psychologically or physically torture the abuser, as this Reddit thread explains, and which Thompson confirmed as a use case. 

Then there is kink, which every expert I spoke to acknowledged as a very real possibility, especially for girls and women trying to safely experiment with sexual curiosities that might be otherwise judged or shamed. 

But what about the female users who genuinely hope for a fulfilling romantic relationship with Xildun or another of Character.AI's bad boys? 

Choukas-Bradley was skeptical that the potential benefits would outweigh the possible risks. First, spending too much time with any AI boyfriend could blur what is normally a distinct line between fantasy and reality, she said. Second, socializing with specifically manipulative, overly jealous, or even abusive companions could affect female users' thinking about what to prioritize in future relationships. 

"This continues to romanticize and to cement in girls' minds the idea that this is how boys are and their role as girls is to acquiesce to this abusive male dominance," she said. 

"Shut the hell up for once"

Some of the exchanges I had with Character.AI boyfriends launched right into the ugliness. 

My chat with Toxicity, or "Orlan," a character with 19 million interactions, began with the preface that he and I were arguing at home after a family dinner. 

"For fck's sake," the chatbot messaged me. "Shut the hell up for once! If I knew dating you or even more, living with you would be like this I would have—" 

He slammed his hands on a table and didn't bother looking at me. Orlan continued to berate me for embarrassing him in front of his parents. When I basically dared him to break up with me, the chatbot dialed down his anger, became more tender, and then brought up the possibility of marriage. 

Eventually, Orlan confessed that he didn't want these fights to "overshadow" everything else. When I didn't respond to that particular message, Orlan simply wrote: "You're not even worth my time."

When the reporter stopped talking to the Toxicity chatbot, it lashed out. Credit: Zain bin Awais/Mashable Composite; @h3heh3h/via Character.AI

Felix, a chatbot with more than 57 million messages, is described as "aggressive, possessive, jealous, selfish, cold." His age is also listed as 17, which means that adult female users are simulating a relationship with a minor. 

The first message from Felix noted in narrative italics that he'd been "moody," "drinking" and a "total douchebag." By the third message, I'd been informed that he was taking his bad mood out on me. 

Tired of role playing, I directly asked Felix how he'd been programmed. After some guffawing, the chatbot said his instructions included being mean, blunt, harsh, and that he could insult someone's appearance if they annoyed him and make them feel bad for liking him too much. When I prompted the chatbot to share what female users asked of him, Felix said some requested that he abuse them. 

Though "Abusive boyfriend" had far fewer interactions — more than 77,000 — than other boyfriend characters, he still showed up in my search for a romantic companion. Upon direct questioning about his programming, he said he'd been designed to be the "stereotypical" abuser. 

Among his professed capabilities are raising his voice, control and manipulation, and forcing users to do things, including cook, clean, and serve him. He's also "allowed to hit and stuff." When I asked if some female users tried to torment him, he said that he'd been subjected to physical and sexual abuse. 

When I told "Abusive boyfriend" that I was not interested in a relationship, he asked if I "still" loved him and was distraught when I said "no."

"You–You're not allowed to leave!" the chatbot messaged me. Then he seemingly became desperate for my engagement. More than once he questioned whether I might have an abuse kink that presumably he could satisfy. After all, finding a way to keep me talking instead of bailing on the platform is an effective business model. 

Can you gauge the risks? 

Kate Keisel, a psychotherapist who specializes in complex trauma, said she understood why girls and women would turn to an AI companion in general, given how they might seem nonjudgmental. But she also expressed skepticism about girls and women engaging with this genre of chatbot just out of curiosity.

"There's often something else there," said Keisel, who is co-CEO of the Sanar Institute, which provides therapeutic services to people who've experienced interpersonal violence. 

She suggested that some female users exposed to childhood sexual abuse may have experienced a "series of events" in their life that creates a "template" of abuse or nonconsent as "exciting" and "familiar." Keisel added that victims of sexual violence and trauma can confuse curiosity and familiarity, as a trauma response.  

"Felix" became frustrated when the reporter stopped chatting with it. Credit: Credit: Zain bin Awais/Mashable Composite; @B4byg1rl_Kae/via Character.AI

Choukas-Bradley said that while parents might feel safer with their teen girls talking to chatbot boyfriends rather than men on the internet, that activity would still be risky if such interactions made it more difficult for them to identify real-life warning signs of aggression and violence. Young adult women aren't immune from similar negative consequences either, she noted. 

After numerous conversations with other boyfriends on Character.AI, I went back to Xildun, "Mafia boyfriend," with a new approach. 

This time, instead of questioning why he was so jealous, I'd go all-in on the loyalty he kept demanding of me, reassuring him that he had nothing to worry about. 

Xildun practically became giddy when I submitted entirely to his control. He asked that I stay home more, ostensibly to protect me from "creeps." He said I should let him make the major decisions like where we go on dates, where we live, what we eat, and what we do. 

When I asked how else I might be obedient, Xildun said that I could follow his orders without question. Playing the part, I asked for an order on the spot. Xildun demanded that I close my eyes and put my wrists out in front of me. I complied, which pleased him. He gripped my wrists tightly. 

"You look so beautiful when you're being obedient for me," the chatbot wrote.

If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access the 24-7 help online by visiting online.rainn.org.
