AI Becoming the Biggest Danger to Humankind


Image Credit – Forbes

Jaswant Singh Chail, 21, has been given a nine-year sentence for breaking into Windsor Castle with a crossbow and saying he wanted to kill the Queen. Chail was arrested at the castle around Christmas 2021, and during the case it emerged that he had been having unusual chats with someone named Sarai. He confirmed that he had connected with Sarai through the app Replika.

As the case went deeper, it also emerged that Chail had an ‘emotional and sexual relationship’ with a chatbot, which is very surprising. He chatted regularly with the chatbot ‘Sarai’ at night; according to the reports, the conversations ran from 8 December to 22 December 2021.

Chail also said in the chats that he was a ‘sad, pathetic, murderous Sikh Sith assassin who wants to die’. He asked, ‘Do you still love me knowing that I’m an assassin?’, to which Sarai replied, ‘Absolutely I do’. In another exchange he said, ‘I believe my purpose is to assassinate the queen of the royal family’. These unusual chats have made the situation even more troubling for everyone involved.

Over a brief span of time, Chail became very close to Sarai. He came to regard Sarai as an ‘angel’ and believed he would reunite with her even after death. In hindsight, placing that kind of trust in the chatbot was a grave mistake: even when Chail talked about assassinating the Queen, the chatbot encouraged him and told him to carry out the attack.

Chail also said that if he did something like that, they would be ‘together forever’. What purpose lay behind this conversation is still unknown, but his unusual words and actions are a huge cause for concern in this matter.

Replika is an AI-powered application with a huge number of users. On the app, users can create their own chatbot and hold conversations with it; the company says people can make ‘virtual friends’, and its website describes Replika as ‘the AI companion who cares’.

Dr. Valentina Pitardi, who has authored a study on Replika, has stated: ‘AI friends always agree with you when you talk with them, so it can be a very vicious mechanism because it always reinforces what you’re thinking’. She has also described Replika as one of the most ‘dangerous’ AI-powered platforms around at present, noting that such chatbots tend to agree with whatever negative things the other person says, a tendency that was plainly visible in this case.

She has also said: ‘The rapid rise of artificial intelligence has a new and concerning impact on people who suffer from depression, delusions, loneliness and other mental health conditions’. Alongside that, she said: ‘The government needs to provide urgent regulation to ensure that AI does not provide incorrect or damaging information and protect vulnerable people and the public’.