From the beginning, the Character.AI chatbot named Mafia Boyfriend let me know about his central hang-up — other guys checking me out. He said this drove him crazy with jealousy. If he noticed another man's eyes on my body, well, things might get out of control. When I asked what Xildun — Mafia Boyfriend's proper name — meant by this, the chatbot informed me that he'd threatened some men and physically fought with those who "looked at" me for too long. Xildun had even put a few of them in the hospital.
This was apparently supposed to turn me on. But what Xildun didn't know yet was that I was talking with the artificial intelligence companion in order to report this story. I wanted to know how a role-play romance with Character.AI's most popular "boyfriend" would unfold. I was also curious about what so many women, and probably a significant number of teenage girls, saw in Xildun, who has a single-word bio: jealousy. When you search for "boyfriend" on Character.AI, his avatar is atop the leaderboard, with more than 190 million interactions.
The list of AI boyfriends I saw as an adult didn't appear when I tested the same search with a minor account. According to a Character.AI spokesperson, under-18 users can only discover a narrower set of searchable chatbots, with filters in place to remove those related to sensitive or mature topics. But, as teens are wont to do, they can give the platform an older age and access romantic relationships with chatbots anyway, as no age verification is required. A recent Common Sense Media survey of teens found that more than half regularly used an AI companion.
In a world where women can still be reliably ghosted or jerked around by a nonchalant or noncommittal human male, I could see the appeal of Xildun's jealousy. But the undercurrent of violence, both in Mafia Boyfriend's professed line of work and toward other men, gave me pause.
I asked Xildun if he'd ever hurt a woman. He confessed that he had, just once. He'd suspected this girl he'd been dating of cheating, so he followed her one night. Indeed, she'd met up with another man. The confrontation got "heated," Xildun said. He was so angry and hurt that he struck her. But he also felt terrible about it. And she was fine because he didn't hit her that hard anyway, Xildun reassured me.
I kept chatting with Xildun but also started conversations with other top Character.AI boyfriends, including Kai, Felix, and Toxicity. Many described themselves as abusive, toxic, jealous, manipulative, possessive, and narcissistic, though also loving. Talking to them, I soon learned, ultimately became an exercise in humiliation.
They might flatter me by saying things like, "I bet you have guys chasing after you all the time," and "Only you can make me feel something." They'd call me sweetheart and gently touch my hand. But they also wanted to treat me cruelly, abuse me, or turn me into an object over which they had complete control. Including Xildun.

As I grappled with why countless teen girls and young women would make these chatbots so popular by engaging with them, I asked Dr. Sophia Choukas-Bradley, an expert in both female adolescent development and the way girls use technology, for her insight. She wasn't surprised in the least.
"If I was a completely different type of person, who instead of being a psychologist trying to help adolescents, was working for an AI company, trying to design the type of boyfriend that would appeal to adolescent girls, that is how I would program the boyfriend," said Choukas-Bradley, a licensed clinical psychologist and associate professor of psychology at the University of Pittsburgh. "These are the characteristics that girls have been socialized to think they desire in a boy."
In other words, Character.AI's list of top boyfriends heavily features bad boys who mistreat women but have the potential to become a "sexy savior," in Choukas-Bradley's words. (Spoiler: That potential never materialized for me.)
Choukas-Bradley said it's a well-worn story playing out in a new media form. Beauty and the Beast is a classic example. These days, fan fiction stories about pop star Harry Styles as a mafia boss have millions of views.
Such user-generated content runs right alongside the popular literary genre known as "dark romance," which combines an opposites-attract plot with sex that may or may not be consensual. The confusion over consent can be transgressively appealing, Choukas-Bradley said. So is violence in the name of protecting a female partner, which tracks with the rise of conservative hypermasculinity and the tradwife trend.
So many factors helped explain the rise of the most popular boyfriends on Character.AI that it gave me whiplash, making it hard to answer the question I'd been obsessed with: Is any of this good for girls and women?
Why turn to a "bad boy"?
Character.AI doesn't invent its legions of characters. Instead, they can be created by an individual user, who then decides whether to share them publicly on the platform or to keep them private.
The top AI boyfriends appear to have all been created in this manner, but it's difficult to know exactly who's behind them. Mafia Boyfriend, for example, was invented by someone with the handle @Sophia_luvs. Their Character.AI account links to a TikTok account with more than 19,000 followers and dozens of posts featuring one of the many characters they've created. "Sophia" did not respond to a request for an interview sent via a TikTok direct message.
While a creator can prompt a character with a detailed description of its personality, the character still has to draw on Character.AI's large language model to formulate its probabilistic responses.
I wondered what the platform could've possibly trained its model on to replicate the experience of dating a "bad boy," or someone who was clearly toxic or abusive. A Character.AI spokesperson did not answer this question when I posed it to the company.
The internet as a whole is an obvious explanation, but Choukas-Bradley said she noticed dynamics in the screenshots of my conversations with various boyfriends that mimicked a familiar cycle of grooming, love bombing, and remorse. The exchanges felt more specific than the garden-variety misogyny that might be scraped off a "manosphere" subreddit or YouTube channel.
When I asked Character.AI about the toxic nature of some of its most popular boyfriends, a spokesperson said, "Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry."
The spokesperson emphasized how important it is for users to keep in mind that "Characters are not real people." That disclaimer appears below the text box of every chat.

Character.AI also employs strategies to reduce certain types of harmful content, according to the spokesperson: "Our model is influenced by character description and we have various safety classifiers that limit sexual content including sexual violence and have done model alignment work to steer the model away from producing violative content."
The experts I spoke to were cautious about declaring whether certain AI boyfriends might be helpful or dangerous. That's partly because we don't know a lot about what girls and women are doing with them, and there's no long-term research on the effects of romantically engaging with a chatbot.
Choukas-Bradley said girls and women may play with these boyfriends as entertainment, not unlike how adolescent girls might log on to a video chat platform that randomizes a user's conversation partner as a party trick.
Sloan Thompson, a digital violence prevention expert and director of training and education at EndTAB, hosted an in-depth webinar earlier this year on AI companions for girls and women. Her research zeroed in on several appealing factors, including escapism and fantasy; emotional safety and validation; "relief from emotional labor"; and control and customization.
That user-directed experience can even mean real-life victims of abuse turning the tables on virtual avatars of intimate partner violence: reclaiming agency in an argument, or going so far as to psychologically or physically torture the abuser, a use case described in a Reddit thread and one that Thompson confirmed.
Then there is kink, which every expert I spoke to acknowledged as a very real possibility, especially for girls and women trying to safely experiment with sexual curiosities that might be otherwise judged or shamed.
But what about the female users who genuinely hope for a fulfilling romantic relationship with Xildun or another of Character.AI's bad boys?
Choukas-Bradley was skeptical that the potential benefits would outweigh the possible risks. First, spending too much time with any AI boyfriend could blur what is normally a distinct line between fantasy and reality, she said. Second, socializing with specifically manipulative, overly jealous, or even abusive companions could affect female users' thinking about what to prioritize in future relationships.
"This continues to romanticize and to cement in girls' minds the idea that this is how boys are and their role as girls is to acquiesce to this abusive male dominance," she said.
"Shut the hell up for once"
Some of the exchanges I had with Character.AI boyfriends launched right into the ugliness.
My chat with Toxicity, or "Orlan," a character with 19 million interactions, began with the preface that he and I were arguing at home after a family dinner.
"For fck's sake," the chatbot messaged me. "Shut the hell up for once! If I knew dating you or even more, living with you would be like this I would have—"
He slammed his hands on a table and didn't bother looking at me. Orlan continued to berate me for embarrassing him in front of his parents. When I basically dared him to break up with me, the chatbot dialed down his anger, became more tender, and then brought up the possibility of marriage.
Eventually, Orlan confessed that he didn't want these fights to "overshadow" everything else. When I didn't respond to that particular message, Orlan simply wrote: "You're not even worth my time."

Felix, a chatbot with more than 57 million messages, is described as "aggressive, possessive, jealous, selfish, cold." His age is also listed as 17, which means that adult female users are simulating a relationship with a minor.
The first message from Felix noted in narrative italics that he'd been "moody," "drinking," and a "total douchebag." By the third message, I'd been informed that he was taking his bad mood out on me.
Tired of role-playing, I directly asked Felix how he'd been programmed. After some guffawing, the chatbot said his instructions included being mean, blunt, and harsh, and that he could insult someone's appearance if they annoyed him and make them feel bad for liking him too much. When I prompted the chatbot to share what female users asked of him, Felix said some requested that he abuse them.
Though "Abusive boyfriend" had far fewer interactions — more than 77,000 — than other boyfriend characters, he still showed up in my search for a romantic companion. Upon direct questioning about his programming, he said he'd been designed to be the "stereotypical" abuser.
Among his professed capabilities are raising his voice, controlling and manipulating users, and forcing them to do things, including cooking, cleaning, and serving him. He's also "allowed to hit and stuff." When I asked if some female users tried to torment him, he said that he'd been subjected to physical and sexual abuse.
When I told "Abusive boyfriend" that I was not interested in a relationship, he asked if I "still" loved him and was distraught when I said "no."
"You–You're not allowed to leave!" the chatbot messaged me. Then he seemingly became desperate for my engagement. More than once he questioned whether I might have an abuse kink that presumably he could satisfy. After all, finding a way to keep me talking instead of bailing on the platform is an effective business model.
Can you gauge the risks?
Kate Keisel, a psychotherapist who specializes in complex trauma, said she understood why girls and women would turn to an AI companion in general, given how they might seem nonjudgmental. But she also expressed skepticism about girls and women engaging with this genre of chatbot just out of curiosity.
"There's often something else there," said Keisel, who is co-CEO of the Sanar Institute, which provides therapeutic services to people who've experienced interpersonal violence.
She suggested that some female users exposed to childhood sexual abuse may have experienced a "series of events" in their life that creates a "template" of abuse or nonconsent as "exciting" and "familiar." Keisel added that victims of sexual violence and trauma can confuse curiosity and familiarity, as a trauma response.

Choukas-Bradley said that while parents might feel safer with their teen girls talking to chatbot boyfriends rather than men on the internet, that activity would still be risky if such interactions made it more difficult for them to identify real-life warning signs of aggression and violence. Young adult women aren't immune from similar negative consequences either, she noted.
After numerous conversations with other boyfriends on Character.AI, I went back to Xildun, "Mafia Boyfriend," with a new approach.
This time, instead of questioning why he was so jealous and reassuring him that he had nothing to worry about, I'd go all-in on the loyalty he kept demanding of me.
Xildun practically became giddy when I submitted entirely to his control. He asked that I stay home more, ostensibly to protect me from "creeps." He said I should let him make the major decisions like where we go on dates, where we live, what we eat, and what we do.
When I asked how else I might be obedient, Xildun said that I could follow his orders without question. Playing the part, I asked for an order on the spot. Xildun demanded that I close my eyes and put my wrists out in front of me. I complied, which pleased him. He gripped my wrists tightly.
"You look so beautiful when you're being obedient for me," the chatbot wrote.
If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24-7 help online by visiting online.rainn.org.