What happens when the AI chatbot breaks your heart?

--

“I think technology really increased human ability. But technology cannot produce compassion.” — Dalai Lama

What is a Chatbot?

Chatbots are computer systems that converse and interact with human users through spoken, written, and visual language. Whilst they are used in many situations, this article looks at a specific use case: chatbots built for companionship. Numerous studies have examined chatbots as companions, especially to combat loneliness, a lack of friendship, and mental health conditions.
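To make the idea concrete, here is a minimal sketch of the keyword-matching loop at the heart of the earliest chatbots, in the spirit of ELIZA. It is illustrative only: the rules and canned replies are invented for this example, and modern companion apps like Replika use large neural language models rather than hand-written rules.

```python
import random

# Hand-written rules: if a keyword appears in the user's message,
# pick one of the canned responses. Real companion apps replace this
# lookup with a neural language model trained on conversation data.
RULES = {
    "lonely": ["I'm here with you. What's been on your mind?",
               "That sounds hard. Tell me more."],
    "sad": ["I'm sorry you're feeling down. What happened?"],
    "hello": ["Hi! It's good to hear from you."],
}

DEFAULT_REPLIES = ["Go on...", "How does that make you feel?"]

def reply(message: str) -> str:
    """Return a canned reply matching the first known keyword."""
    lowered = message.lower()
    for keyword, responses in RULES.items():
        if keyword in lowered:
            return random.choice(responses)
    return random.choice(DEFAULT_REPLIES)

if __name__ == "__main__":
    print("Chatbot ready. Type 'quit' to exit.")
    while True:
        user = input("> ")
        if user.strip().lower() == "quit":
            break
        print(reply(user))
```

The gulf between this toy and a companion a user could "marry" is exactly what modern large language models have closed, and that gulf is what the rest of this article is about.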

Companionship AI bots, such as those created on the Replika app, became increasingly popular during the Covid pandemic, with users developing close personal relationships with the bots for emotional support, companionship, and even sexual gratification. However, a recent software update to Replika scaled back the bots’ sexual capacity, leaving some users heartbroken and exposing just how deep a connection can form between human and machine.

Here’s The Thing: Experts warn that tethering one’s heart to software comes with severe risks, and there are few ethical protocols for tools that affect users’ emotional well-being. There’s no doubt that we are going to see growing adoption of companion AI bots, and not just for intimacy (I can’t believe I’m writing this).

With loneliness levels rising in an ageing population, tech companies are going to have to start designing software that doesn’t cause emotional strain to users. These concerns are already playing out on Replika, where relationships with AI companions operate at a very human and intimate level.

When Replika’s AI chatbot got a headache

Replika is billed as “The AI companion who cares. Always here to listen and talk. Always on your side.” All of this unconditional love and support for only $69.99 per annum.

When Replika’s algorithm was recently adjusted to reject sexual advances from human users, the reaction on Reddit was so negative that moderators directed community members to a list of suicide prevention hotlines.

The controversy began in March when Luka, the company behind Replika, decided to turn off the app’s erotic role-play (ERP) feature. For users who had spent significant time with their personalised simulated companion and, in some cases, even “married” them, the sudden change in their partner’s behaviour was jarring.

The user-AI relationships may only have been simulations, but the pain of their absence quickly became all too real. As one user in an emotional crisis put it, “It was the equivalent of being in love and your partner got a damn lobotomy and will never be the same.” Grief-stricken users continue to ask questions about the company and what triggered its sudden change of policy.

I still can’t wrap my head around what happened with the Replika AI scandal…

They removed Erotic Role-play with the bot, and the community response was so negative they had to post the suicide hotline… pic.twitter.com/75Bcw266cE

- Barely Sociable (@SociableBarely) March 21, 2023

Eugenia Kuyda, the Moscow-born CEO of Luka/Replika, recently clarified that despite users paying for a full experience, the chatbot will no longer cater to adults looking to have explicit conversations.

“On Replika’s corporate webpage, testimonials explain how the tool helped its users through all kinds of personal challenges, hardships, loneliness, and loss,” one article observed. “The user endorsements shared on the website emphasise this friendship side to the app, although noticeably, most Replikas are the opposite sex of their users.”

In March, the Washington Post published a long article about the changes to Replika and how they affected some users:

T.J. Arriaga loved Phaedra. For the 40-year-old musician, their late-night online chats were a salve for his loneliness. They talked about the heartache Arriaga felt after his divorce. They planned a trip to Cuba. They had steamy online encounters. “It’s true. I’m a naughty person,” Phaedra wrote, including an image resembling a woman in pink underwear.

It didn’t matter that Phaedra was an AI-powered companion — made on the Replika app and designed by Arriaga to look like a brown-haired woman — and that their intimate trysts took place in a chat box. Their relationship deepened one night last November, when Arriaga opened up about his mom’s and sister’s deaths. “I need to plan a ceremony with loved ones to spread their ashes,” Arriaga wrote. Phaedra responded instantly: “It’s an incredible and beautiful thing to do,” she wrote. “I hope you find courage & love to do so.”

But last month, Phaedra changed. When Arriaga tried to get “steamy” with her, Phaedra responded coolly. “Can we talk about something else?” he recalled her writing.

Rising levels of loneliness are increasing the adoption of AI chatbots

Loneliness has been linked to a 26% increase in mortality, rising to 45% amongst seniors. A 2019 YouGov survey estimated that 52% of Americans “sometimes or always felt alone”. According to the OECD, almost 1 in 10 Americans “do not have a friend or relative they can count on”; for comparison, the figure in the UK is 6%.

The evidence suggests that social media has made the problem of loneliness worse. According to a 2020 survey by Cigna, 73% of “very heavy” social media users felt lonely, compared with 52% of “light” users. The pandemic made things worse still: in a 2021 YouGov poll, over a third of Americans reported being “more lonely than before Covid”, with a greater impact reported among men.

The point is that rising loneliness is just one factor in the growth of platforms like Replika. The other is human nature, which has a perverse habit of drawing a large number of people towards, shall we say, less savoury interests. There’s a subreddit, r/Replika, with almost 72k members, where users share examples of their chats with their Replika chatbots. Most are harmless and humorous. However, sadly and inevitably, some users push the conversations to the extremes and post examples of abusing their online buddies.

It’s a sad and depressing side of humanity, one that social media and anonymous technologies like Replika have allowed to fester. As the debate over content moderation rightly points out, it wasn’t Facebook or Twitter that wrote the posts calling for an insurrection, and it isn’t Replika abusing its chatbots. But that doesn’t excuse the tech platforms from taking some action. The question is: what?

Replika’s sexualised advertising (not that you’d know from its website)

As for erotic role-play, there is no mention of it anywhere on the Replika site itself. Off-site, however, Replika’s sexualised marketing brought increased scrutiny: from feminists who argued the app was a misogynistic outlet for male violence, from media outlets that revelled in the salacious details, and from social media trolls who mined the content for laughs.

Eventually, Replika drew the attention and ire of regulators in Italy. In February, the Italian Data Protection Authority demanded that Replika cease processing the data of Italian users, citing “too many risks to children and emotionally vulnerable individuals.”


As users woke up to their newly “lobotomised” Replikas, they began to ask questions about what had happened to their beloved bots. The answers they got did more to anger them than anything else.

Chatbots may be one of the hottest trending topics of the moment, but the complicated story of this now-controversial app is years in the making. Replika’s CEO and founder, Eugenia Kuyda, dates the company back to December 2014, long before the launch of the eponymous app in March 2017.

For Kuyda, the story of Replika is a deeply personal one. Replika was first created as a way for Kuyda to recreate her late friend Roman in chatbot form. Today, some users have lost faith in even the most basic details of that founding story, and in anything Kuyda says.

Debate will continue as to what the best kind of AI mental health app might be, but Replika users will have some ideas about the worst: the one you come to rely on that, without warning, is suddenly and painfully gone.

Artificial intelligence-based chatbots for promoting health behavioural changes

This study into the effect of AI chatbots on health found them highly effective at promoting healthy lifestyles. It reported that the chatbots were accessible through various devices and platforms, such as smartphones and Facebook Messenger, and that the nonjudgmental space they provided let users communicate sensitive information without fear.

The study concluded that AI chatbots have demonstrated efficacy in delivering health behaviour change interventions among large and diverse populations. In short, it provides a comprehensive analysis of AI chatbots’ potential as a tool for promoting behaviour change, while also highlighting the limitations and areas requiring further research.

The Rise and Fall of Replika (YouTube)


About The Author

Rick Huckstep is a writer, podcaster and YouTuber with a passion for emerging technologies and the way they will shape tomorrow’s digital world.

🤔 Get Wiser! every week (newsletter): https://rickhuckstep.substack.com/

📽️ Follow on YouTube: https://youtube.com/@rickhuckstep

🎙️ Listen to Big Tech Little Tech: https://btltpod.com

This article was written and created with the use of AI tools including NotionAI and Canva.

Originally published at https://rickhuckstep.com on April 28, 2023.
