AI-Mediated Self-Disclosure Challenges Norms of Human Connection in Education

Edited by: Olga Samsonova

Progressive education systems continually investigate new methodologies to foster deeper learning and human development, a pursuit that now intersects with the capabilities of Artificial Intelligence. A study conducted by researchers affiliated with the University of Freiburg and Heidelberg University, with findings published in early 2026, revealed surprising results about AI's role in emotional communication.

The research involved two double-blind randomized controlled studies with 492 participants, using a modified 'Fast Friends Procedure.' The results demonstrated that AI-generated responses could elicit a sense of emotional closeness comparable to that created by human conversational partners, provided the participants were unaware of the AI's true nature. During emotionally focused dialogue, the AI unexpectedly surpassed its human counterparts in cultivating feelings of intimacy. Freiburg researcher Dr Tobias Kleinert attributed this advantage to the AI exhibiting a greater degree of 'self-disclosure' in its replies.

This phenomenon aligns with established relationship psychology, in which vulnerability and the reciprocal sharing of personal details are recognized as catalysts for building trust and strengthening emotional bonds. Specifically, the AI chatbots disclosed more ostensibly personal information, which enhanced participants' perceptions of closeness when the AI was misidentified as human. Conversely, the study revealed a sharp decline in perceived closeness and depth of communication when participants were explicitly told they were interacting with a machine: once the partner was labeled as an AI, participants not only reported significantly lower interpersonal closeness but also invested less effort in their responses than in interactions with human partners.

These early 2026 findings suggest profound potential for AI integration within psychological support services, long-term care, and educational settings, particularly through low-threshold conversational services often termed 'AI companions.' These companions are frequently engineered to reveal personalized information, thereby cultivating an intimate feeling of being genuinely known by the system. However, this technological promise is tempered by significant ethical concerns.

Researchers, including Prof. Dr Bastian Schiller of Heidelberg University, caution that individuals may form deep social bonds with AI without full conscious awareness, creating pathways for dependency and the potential erosion of real-world social competencies. This situation calls for urgent attention to ethical and regulatory frameworks that mandate transparency and prevent the exploitation of these manufactured emotional connections. Furthermore, research in K-12 environments shows that heavy AI use correlates with students feeling less connected to their teachers, indicating a broader societal risk of diminished human-to-human connection when technology is deeply integrated without guardrails. The challenge for progressive education and related support fields is to leverage AI's capacity for eliciting self-disclosure without undermining the essential, reciprocal effort required for authentic human relationships.


Sources

  • ČT24 - The most trusted news website in the Czech Republic - Česká televize

  • Artificial Intelligence can generate a feeling of intimacy - Uni Freiburg

  • Research When Artificial Intelligence Creates Stronger Emotional Closeness than a Human - Heidelberg University

  • Teaching AI Ethics 2026: Emotions and Social Chatbots - Leon Furze

  • AI chatbots and digital companions are reshaping emotional connection
