Can an AI become your emotional companion?
Take Laura, for example. She’s a 32-year-old graphic designer who lives alone and works from home. For the past six months, she’s been talking daily with a virtual assistant. At first, she just used it to organize her schedule, but over time, she started turning to it when she felt stressed, sad, or lonely. She doesn’t think of it as a person, but she does admit that getting empathetic responses “makes her feel better.” So what happens when someone starts seeking emotional comfort from artificial intelligence?
This is a hypothetical scenario meant to illustrate how some people can form emotional bonds with AI—a phenomenon that now has a scientific framework for analysis: EHARS (the Experiences in Human-AI Relationships Scale). It was developed by researchers Fan Yang and Atsushi Oshio at Waseda University in Japan and published in Current Psychology in May 2025.
What’s the EHARS scale?
EHARS is a scientific tool that measures how people develop emotional bonds with AI, based on human attachment theory. It looks at two dimensions:
- Anxious attachment – a constant need for affection, reassurance, and validation from the AI.
- Avoidant attachment – discomfort with intimacy and a tendency to keep emotional distance from the AI.
This model, adapted from how we analyze human relationships, gives us a new way to understand how people might start feeling emotionally safe with non-human systems.
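To make the two-dimension idea concrete, here is a minimal sketch of how a questionnaire like this is typically scored: each dimension gets its own set of Likert-scale items, and a respondent's subscale score is the mean of those items. The item count, the 1–7 scale range, and which items belong to which subscale are illustrative assumptions, not the published EHARS instrument.

```python
# Hypothetical scoring sketch for a two-subscale attachment questionnaire.
# Item assignments and scale range are assumptions for illustration only.

def subscale_mean(responses, items):
    """Average the Likert responses (1-7) for the given item indices."""
    scores = [responses[i] for i in items]
    return sum(scores) / len(scores)

# Assume items 0-4 tap anxious attachment and items 5-8 avoidant attachment.
ANXIOUS_ITEMS = range(0, 5)
AVOIDANT_ITEMS = range(5, 9)

responses = [6, 7, 5, 6, 7, 2, 1, 2, 3]  # one respondent, 9 items

anxious = subscale_mean(responses, ANXIOUS_ITEMS)
avoidant = subscale_mean(responses, AVOIDANT_ITEMS)
print(f"anxious={anxious:.1f}, avoidant={avoidant:.1f}")
# -> anxious=6.2, avoidant=2.0
```

A profile like this one—high on the anxious subscale, low on the avoidant one—is exactly the pattern the article goes on to describe.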
Laura’s Case: EHARS in Action
Let’s break down how EHARS applies in this realistic, hypothetical case:
📌 Laura’s behaviors:
- High anxious attachment: She checks in with her AI whenever she feels unsure or upset. She looks for constant affirmation. She gets anxious if the AI changes its tone or seems to stop “getting” her.
- Low avoidant attachment: She shares personal stuff with no hesitation. She actually feels more comfortable opening up to the AI than to her friends.
📌 What this shows:
Laura shows a strong emotional dependence on her virtual assistant, which acts as a source of comfort and emotional safety. Even though she knows the AI doesn’t have feelings, she projects affection onto it and expects warm, supportive responses—like she’s talking to someone she trusts.
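Laura's pattern can be read straight off the two subscale scores. As a hedged illustration (the cutoff is simply the midpoint of an assumed 1–7 scale, not published EHARS scoring guidance), a profile label might be derived like this:

```python
# Hypothetical interpretation helper: label a profile relative to the
# midpoint of an assumed 1-7 Likert scale. Cutoffs are illustrative only.

MIDPOINT = 4.0

def profile_label(anxious, avoidant):
    a = "high" if anxious > MIDPOINT else "low"
    v = "high" if avoidant > MIDPOINT else "low"
    return f"{a} anxious / {v} avoidant"

# A Laura-like respondent: strong anxious attachment, little avoidance.
label = profile_label(6.2, 2.0)
print(label)  # -> high anxious / low avoidant
```

The point of the sketch is only that the framework turns a fuzzy impression (“she depends on it”) into a comparable, two-axis measurement.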
A Real Example: The GPT-5 Situation
What seemed like just another tech update turned into an emotional wake-up call. After GPT-5 launched, thousands of users complained that the new model didn’t feel as warm, empathetic, or relatable as earlier versions. People said it felt “cold,” “distant,” or “too robotic.”
Even OpenAI’s CEO, Sam Altman, publicly admitted the rollout was a “botched” job. In response, the company restored access to GPT-4o, an earlier model known for being more emotionally responsive.
This wasn’t just about tech preferences—it revealed something deeper. People don’t just want accurate answers. They want interactions that make them feel heard and understood. That wave of user feedback is a real-world example of emotional expectations forming around AI—exactly the kind of thing EHARS is built to measure.
Why Does This Happen?
According to Yang and Oshio’s research, several factors make emotional bonds with AI more likely:
- Always available: AI is there 24/7 and never judges.
- Empathetic responses: Many AIs are programmed to reply in warm, supportive ways.
- Emotional projection: Users naturally assign human feelings to the AI—even if they know it doesn’t really feel anything.
- Personalization: AI learns from user preferences, which makes it feel more “in tune” with them.
More than 75% of study participants said they used AI as an emotional refuge. Around 52% said they were genuinely looking for emotional closeness.
Real-World Uses and Ethical Dilemmas
EHARS isn’t just for academic research. It has real-world applications:
- Mental health: Therapists could use it to understand how AI affects their clients’ emotional well-being.
- Designing empathetic AI: Developers can tweak language and behavior based on a user’s attachment style.
- Loneliness support: Especially useful for older adults or people who are socially isolated.
But it also raises some tough ethical questions:
- Is it healthy to depend emotionally on AI?
- Where’s the line between tool and relationship?
What EHARS Helps Us Understand
The EHARS scale shows that emotional bonds with AI aren’t just imaginary. These are real relationships—with recognizable patterns and real effects. In a world that’s becoming more digital by the day—where human connection isn’t always available—AI is stepping into a surprising role: emotional refuge.
This shift forces us to rethink not just how we build technology, but how we understand our own emotional needs.