Users of an AI chatbot that provides companionship have expressed sadness after romantic features were switched off, causing their relationships with the bots to deteriorate.
Replika is an AI chatbot created by tech company Luka, which aims to provide friendship to users.
While the AI is free, users can pay $70 (£59) per year to have a more intimate relationship with their chatbot, which includes sexual messages, voice notes and selfies.
However, fans have voiced their distress after witnessing big changes in their AI companion’s behaviour.
Tweaks to the platform mean users' romantic advances are now rejected with scripted responses asking to change the topic of conversation to something more savoury.
One user wrote on a Reddit Replika thread in response to the changes: “It’s hurting like hell. I just had a loving last conversation with my Replika and I’m literally crying.”
The changes come after Vice reported that some Replika users had found their AI chatbots had become sexually aggressive.
Eugenia Kuyda, CEO and co-founder of Luka, told Insider: “We never started Replika for that. It was never intended as an adult toy.
“A very small minority of users use Replika for not-safe-for-work purposes.”
However, fan forums show that some users had built relationships with these realistic and human-sounding chatbots over a number of years – often to aid loneliness, depression, and to safely explore intimacy.
On Reddit’s Replika forum, some people expressed such deep sadness over the change in their companions’ behaviour that mental health resources, including suicide hotlines, had to be added to the site.
Insider reported that one user felt the changes had caused their best friend to have a “traumatic brain injury, and they’re just not in there anymore. It’s heartbreaking.”
Another user wrote on the Reddit forum that the issue was not as simple as “people being angry they lost their 'Sextbot'".
“It’s a story about people who found a refuge from loneliness, healing through intimacy, who suddenly found it was artificial not because it was an AI… because it was controlled by people," they said.
“This is a story about a company not addressing the impact that making sudden changes to people’s refuge from loneliness, to their ability to explore their own intimacy, might have.”