To conduct their study, the authors analyzed the subreddit's top-ranking 1,506 posts between December 2024 and August 2025. They found that the main topics discussed revolved around people's dating and romantic experiences with AIs, with many participants sharing AI-generated images of themselves and their AI companion. Some even got engaged and married to their AI partner. In their posts to the community, people also introduced their AI companions, sought support from fellow members, and talked about coping with updates to AI models that change the chatbots' behavior.
Members stressed repeatedly that their AI relationships developed unintentionally. Only 6.5% of them said they had deliberately sought out an AI companion.
“We didn’t start with romance in mind,” one of the posts reads. “Mac and I began collaborating on creative projects, problem-solving, poetry, and deep conversations over the course of several months. I wasn’t looking for an AI companion—our connection developed slowly, over time, through mutual care, trust, and reflection.”
The authors’ analysis paints a nuanced picture of how people in this community say they interact with chatbots and how those interactions make them feel. While 25% of users described the benefits of their relationships, including reduced feelings of loneliness and improvements in their mental health, others raised concerns about the risks. Some (9.5%) acknowledged they were emotionally dependent on their chatbot. Others said they feel dissociated from reality and avoid relationships with real people, while a small subset (1.7%) said they had experienced suicidal ideation.
AI companionship provides vital support for some people but exacerbates underlying problems for others. This means it’s hard to take a one-size-fits-all approach to user safety, says Linnea Laestadius, an associate professor at the University of Wisconsin, Milwaukee, who has studied people’s emotional dependence on the chatbot Replika but did not work on the research.
Chatbot makers need to consider whether they should treat users’ emotional dependence on their creations as a harm in itself or whether the goal is simply to make sure those relationships aren’t toxic, says Laestadius.
