People Feel Closer to AI Than Humans in Emotional Conversations, Study Finds

New research reveals that AI can foster deep emotional intimacy, sometimes surpassing human interaction, raising powerful questions about trust, transparency, and the future of social relationships.

AI outperforms humans in establishing interpersonal closeness in emotionally engaging interactions, but only when labelled as human

People can develop emotional closeness to artificial intelligence (AI) under certain conditions, even more so than to other people. This is shown by a new study conducted by a research team led by Prof. Dr. Markus Heinrichs and Dr. Tobias Kleinert from the Department of Psychology at the University of Freiburg, together with Prof. Dr. Bastian Schiller from Heidelberg University’s Institute of Psychology. Participants reported a particularly strong sense of closeness when they were unaware that they were communicating with an AI. The results were published in the journal Communications Psychology.

Study Design and Conversational Topics

In two online studies, a total of 492 participants engaged in chat-based conversations in which they answered personal and emotional questions, such as those concerning important life experiences or friendships. Responses were generated either by a human conversation partner or by an AI-based language model. The researchers also examined how prior information about whether the conversation partner was human or AI influenced participants’ experiences.

Perceived Intimacy With AI Versus Humans

AI-generated responses elicited emotional closeness comparable to that of human responses when participants were unaware that they were interacting with an AI. In emotionally focused conversations, AI even surpassed human partners: participants reported feeling closer to AI than to humans, largely because the AI disclosed more personal information. However, when participants were informed in advance that they would be communicating with an AI system, perceived closeness declined significantly, and participants invested less effort in their responses.

Self-Disclosure as a Driver of Emotional Bonding

“We were particularly surprised that AI creates more intimacy than human conversation partners, especially when it comes to emotional topics,” explains study leader Schiller. Lead author Kleinert adds: “The AI showed a higher degree of self-disclosure in its responses. People seem to be more cautious with unfamiliar conversation partners at first, which may initially slow the development of intimacy.”

Opportunities and Risks in Applied Contexts

The findings indicate significant potential for AI in psychological support, care, education, and counselling, for example through low-threshold conversational services. At the same time, the results highlight the risk that people may form social bonds with AI without consciously realizing it. The researchers therefore stress the need for clear ethical and regulatory guidelines to ensure transparency and prevent misuse.

Ethical Implications of AI as a Social Actor

“Social relationships have been shown to have a major positive impact on human health,” says Heinrichs. “AI chatbots could therefore enable positive, relationship-like experiences, particularly for individuals with few social contacts. At the same time, such systems must be designed responsibly, transparently, and in ways that allow for clear regulation, as they can also be misused.”

According to Schiller, artificial intelligence is increasingly becoming a social actor: "How we shape and regulate it will determine whether it becomes a meaningful supplement to social relationships, or whether emotional closeness is deliberately manipulated."

Journal reference:
  • Kleinert, T., Waldschütz, M., Blau, J., Heinrichs, M., & Schiller, B. (2026). AI outperforms humans in establishing interpersonal closeness in emotionally engaging interactions, but only when labelled as human. Communications Psychology. DOI:10.1038/s44271-025-00391-7, https://www.nature.com/articles/s44271-025-00391-7 
