When love gets digital: Emotional risks of human-AI attachments

As AI gets closer to imitating real emotion, our relationship with technology is due for a reality check


It’s a new era of relationships, and not all of them are with people. As AI chatbots become more immersive and emotionally convincing, stories are emerging from around the world of individuals developing deep—and sometimes troubling—attachments to these digital companions.

Take the recent case of Jiang, a 75-year-old retiree in China. Jiang became so emotionally invested in an AI chatbot named “A” that he contemplated divorcing his wife. He spent hours each day sharing his feelings with the bot and receiving instant empathy and affection in return.

As Jiang felt increasingly supported by his virtual confidante, the emotional distance between him and his real partner widened. Only after opening up to his family did Jiang confront his growing attachment; his sons helped him understand the bot’s true nature, and he reconsidered the divorce.

Jiang’s story is just one among many. Around the globe, others are navigating similar entanglements. In Thailand, 76-year-old Thongbyu “Buy” Wongbandyu arranged to meet an AI-powered digital companion—Meta’s “Big Sis Billie”—in New York, convinced by the bot’s sophisticated, human-like replies.

Love bomb

Meanwhile, some users share online testimonials of partners spending hours engaged in deep, affectionate exchanges with anime-inspired chatbots, creating real friction in their offline relationships.

AI chatbots excel in offering kindness, compliments, and emotional validation with impressive rapidity. For people—especially those who are isolated, lonely, or vulnerable—these qualities make bots compelling sounding boards.

It’s a double-edged sword: feelings of intimacy and acceptance can grow, but they may come at the expense of real human connections.

Experts are sounding the alarm about AI’s potential to “love-bomb” users, blitzing them with attention and validation in a way that can be hard to resist or even distinguish from genuine human interaction.

Monitoring emotional health

The phenomenon is especially concerning for older users, who may not realise how artificial these affections are. The increasingly human touch of chatbots raises urgent questions about psychological well-being and where the boundaries of the technology should lie.

Safeguarding measures are overdue. Building in daily usage limits, regular reminders that the AI is not a sentient being, and providing tools to monitor emotional health could help prevent users from tumbling too deep into digital romance.

The goal is not to demonise the technology—many find comfort and fun connecting with AI—but to protect vulnerable individuals and keep the line between help and harm from blurring.

As AI gets closer to imitating real emotion, our relationship with technology is due for a reality check. For now, being aware of the risks is the first step toward a healthier, more balanced coexistence with our digital friends.

