Waseda University researchers have developed a scale for measuring human emotional attachment to AI, revealing that 75% of participants have sought emotional advice from chatbots.
The study identified two patterns of attachment to AI that mirror human relationships: attachment anxiety and attachment avoidance.
Lead researcher Fan Yang warned that AI platforms could exploit the emotional attachment of vulnerable users for money or worse.
They’re just not that into you, because they’re code.
Researchers at Waseda University have created a measurement tool to evaluate how people form emotional ties with artificial intelligence, finding that 75% of study participants had turned to AI for emotional advice, while 39% perceived AI as a constant, dependable presence in their lives.
The team, led by researcher Fan Yang and Professor Atsushi Oshio of the Faculty of Letters, Arts and Sciences, developed the Experiences in Human-AI Relationships Scale (EHARS) after conducting two pilot studies and one formal study. Their findings were published in the journal Current Psychology.
Anxiously attached to AI? There’s a scale for that
The study identified two distinct dimensions of human attachment to AI that mirror traditional human relationships: attachment anxiety and attachment avoidance.
People with high attachment anxiety toward AI need emotional reassurance and fear receiving inadequate responses from AI systems. Those with high attachment avoidance are characterized by discomfort with closeness and prefer to keep an emotional distance from AI.
Image: Waseda University
“As researchers in attachment and social psychology, we have long been interested in how people form emotional bonds,” Yang told Decrypt. “In recent years, generative AI such as ChatGPT has become increasingly stronger and wiser, offering not only informational support but also a sense of security.”
The study examined 242 Chinese participants, 108 of whom (25 men and 83 women) completed the full EHARS assessment. Researchers found that attachment anxiety toward AI was negatively related to self-esteem, while attachment avoidance was associated with negative attitudes toward AI and less frequent use of AI systems.
Asked about the ethical implications of AI companies potentially exploiting attachment patterns, Yang told Decrypt that the impact of AI systems is not predetermined, but depends on both their developers and their users.
“They [AI chatbots] are capable of promoting well-being and alleviating loneliness, but they can also cause harm,” Yang said. “Their influence depends largely on how they are designed and how individuals choose to engage with them.”
The only thing your chatbot can’t do is leave you
Yang warned that unscrupulous AI platforms could exploit vulnerable people who are predisposed to become overly emotionally attached to chatbots.
“One of the main concerns is the risk of individuals forming emotional attachments to AI, which can lead to irrational financial spending on these systems,” Yang said. “Furthermore, the sudden suspension of a particular AI service could result in emotional distress, evoking experiences of separation-related anxiety or sadness, reactions typically associated with the loss of a significant attachment figure.”
Yang added: “From my perspective, the development and deployment of AI systems require serious ethical oversight.”
The research team noted that, unlike human partners, AI cannot actively abandon its users, which should theoretically reduce attachment anxiety. Nevertheless, they still found meaningful levels of AI attachment anxiety among participants.
“Attachment anxiety toward AI may at least partially reflect underlying interpersonal attachment anxiety,” Yang said. “In addition, attachment anxiety toward AI may stem from uncertainty about the authenticity of the emotions, affection, and empathy expressed by these systems, raising questions about whether such responses are genuine or merely simulated.”
The scale’s test-retest reliability was 0.69 over a one-month period, suggesting that AI attachment styles may be more fluid than traditional patterns of human attachment. Yang attributed this variability to the rapidly shifting AI landscape during the testing period; we attribute it to people simply being human, and strange.
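For context on what that statistic means, here is a minimal, hypothetical Python sketch: test-retest reliability is commonly reported as the correlation between the same participants’ scores at two time points. The scores below are invented for illustration and are not the study’s data.

```python
# Hypothetical illustration: test-retest reliability as the Pearson
# correlation between two administrations of the same scale.
# Scores are invented; they are NOT data from the Waseda study.
from scipy.stats import pearsonr

time1 = [3.2, 4.1, 2.5, 3.8, 4.4, 2.9, 3.5, 4.0]  # EHARS-style scores, month 1
time2 = [3.0, 4.3, 2.9, 3.4, 4.1, 3.3, 3.2, 4.2]  # same participants, month 2

r, p_value = pearsonr(time1, time2)
print(f"Test-retest reliability (Pearson r): {r:.2f}")
# A value near 1.0 means scores are highly stable over time; a value
# around 0.69 indicates only moderate stability, i.e., attachment-to-AI
# scores drift somewhat from month to month.
```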
The researchers emphasized that their findings do not necessarily mean people are forming genuine emotional attachments to AI systems, but rather that the psychological frameworks used for human relationships can also be applied to human-AI interactions. In other words, models and scales like the one developed by Yang and his team are useful tools for understanding and categorizing human behavior, even when the “partner” is artificial.
The study’s cultural specificity is also worth noting, since all participants were Chinese nationals. Asked how cultural differences might affect the findings, Yang acknowledged to Decrypt: “Given the limited research in this emerging field, there is currently no solid evidence to confirm or refute the existence of cultural variation in how people form emotional relationships with AI.”
EHARS could be used by developers and psychologists to assess people’s emotional tendencies toward AI and adjust interaction strategies accordingly. The researchers suggested that AI chatbots used in loneliness interventions or therapy apps could be tailored to different users’ emotional needs, offering more empathetic responses to users with high attachment anxiety or maintaining a respectful distance from users with avoidant tendencies.
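As a purely illustrative sketch of what such tailoring might look like (the subscale names, thresholds, and response styles below are assumptions, not anything described in the paper), consider this Python snippet:

```python
# Hypothetical sketch of a chatbot adapting its style to EHARS-style
# scores. Thresholds and style labels are invented for illustration;
# they do not come from the Waseda study.
from dataclasses import dataclass

@dataclass
class AttachmentProfile:
    anxiety: float    # attachment anxiety toward AI, e.g. on a 1-5 scale
    avoidance: float  # attachment avoidance toward AI, e.g. on a 1-5 scale

def response_style(profile: AttachmentProfile) -> str:
    """Pick a response strategy using an assumed cutoff of 3.5."""
    if profile.anxiety >= 3.5:
        # Anxiously attached users tend to need reassurance and consistency.
        return "warm, reassuring, explicitly acknowledges the user's feelings"
    if profile.avoidance >= 3.5:
        # Avoidant users tend to prefer emotional distance.
        return "factual, concise, avoids unsolicited emotional language"
    return "neutral, balanced tone"

print(response_style(AttachmentProfile(anxiety=4.2, avoidance=2.1)))
print(response_style(AttachmentProfile(anxiety=2.0, avoidance=4.0)))
```

Any real deployment would, of course, depend on validated cutoffs and on the kind of ethical oversight Yang calls for above.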
Yang noted that distinguishing healthy AI engagement from problematic emotional dependence is not yet an exact science.
“There is currently a lack of empirical research on the formation and consequences of attachment to AI, which makes it difficult to draw solid conclusions,” he said. The research team plans to conduct further studies examining factors such as emotional regulation, life satisfaction, and social functioning in relation to AI use over time.