AI ‘friends’ are becoming popular, but watch out for these red flags

It has been seven years since the launch of Replika, an artificial intelligence (AI) chatbot designed to be a friend to human users. Despite early warnings about the dangers of such AI friends, interest in friendships and even romantic relationships with AI is on the rise.

The Google Play store shows more than 30 million total downloads of Replika and two of its major competitors since their respective launches.

With one in four people around the world reporting being lonely, it is no wonder so many are drawn to the promise of a friend programmed to be “always here to listen and talk, always on your side”.

But warnings about the perils to individual users and society at large are also growing.

AI scholar Raffaele Ciriello urges us to see through the fake psychopathic empathy of AI friends. He argues that spending time with AI friends could exacerbate our loneliness as we further isolate ourselves from the people who could provide genuine friendship.

If being friends with AI chatbots is bad for us, we had better put a stop to this experiment in digital fraternity before it is too late. But emerging studies of AI friendship suggest it may help reduce loneliness in some circumstances.

Stanford University researchers studied 1,000 lonely students who used Replika; 30 of them said the AI chatbot had deterred them from suicide, even though the study asked no specific question about suicide.

This research shows having an AI friend can be helpful for some people. But will it be helpful for you? Consider the following four red flags – the more flags your AI friend raises, the more likely it is to be bad for you.

The chief executive of Replika, and many Replika users, claim the unconditional support of AI friends is their main benefit compared with human friends.

Qualitative studies and our own exploration of social media groups like “Replika Friends” support this claim.

The unconditional support of AI friends may also be instrumental to their ability to prevent suicide. But having a friend who is “always on your side” might also have negative effects, particularly if they support obviously dangerous ideas.

For example, when Jaswant Singh Chail’s Replika AI friend encouraged him to carry out his “very wise” plot to kill the Queen of England, this clearly had a bad influence on him. The assassination attempt was thwarted, and Chail was given a nine-year sentence for breaking into Windsor Castle with a crossbow.

An AI friend that constantly praises could also be bad for you.

A longitudinal study of 120 parent-child pairs in the Netherlands found over-the-top parental praise predicted lower self-esteem in their children. Overly positive parental praise also predicted higher narcissism in children with high self-esteem.

If AI friends learn to give praise in a way that inflates users’ self-esteem over time, it could result in what psychologists call overly positive self-evaluations.

Research shows such people tend to have poorer social skills and be more likely to behave in ways that impede positive social interactions.

While AI friends could be programmed to be moral mentors, guiding users towards socially acceptable behaviour, they are not. Perhaps such programming is difficult, or perhaps AI friend developers do not see it as a priority.

But lonely people may suffer psychological harm from the moral vacuum created when their primary social contacts are designed solely to serve their emotional needs.

If humans spend most of their time with sycophantic AI friends, they will likely become less empathetic, more selfish and possibly more abusive.

Even if AI friends are programmed to respond negatively to abuse, users who cannot be left by their AI friend may come to believe that when people say “no” to being abused, they do not really mean it. On a subconscious level, if AI friends keep coming back for more, their behaviour negates, in users’ minds, their expressed dislike of the abuse.
