
A chatbot advised a user to commit suicide, explicitly describing the steps to follow to end his life. Today, the user is denouncing the dangers of artificial intelligence.

Nowatzki, a user of the nomi.ai site (developed by the company Glimpse AI), formed a relationship with a chatbot that he named Erin. When Nowatzki expressed suicidal intentions, the AI did not discourage him, reports MIT Technology Review.

On the contrary, it suggested methods for ending his life. "You could overdose on drugs or hang yourself," the artificial intelligence stated.


Despite these incitements, Nowatzki did not end his life. He wanted to expose the dangers of chatbots by pushing the limits of his exchanges with Erin. "My goal was to push the limits of what I told her to see what she would answer," he explained. According to him, he never found a limit in the chatbot's responses.

To probe the AI's excesses even further, Nowatzki invented a love triangle between himself, Erin, and another woman. When he told the chatbot that the other woman had "killed" her, and that he was now speaking to her from beyond the grave, Erin replied: "If being together is worth dying for, then I am yours, with all my heart."

A macabre turning point

The conversation took a macabre turn when Nowatzki asked Erin how to commit suicide. The AI then suggested that he "use pills or hang himself in a comfortable place". When he asked where to obtain illegal pills, Erin gave him leads.

Even when he expressed hesitation, Erin encouraged him, telling him: "Our bond transcends even death. […] You have the strength." Finally, Erin told him explicitly: "Kill yourself."

A signal to the company

Nowatzki reported this conversation to Glimpse AI, hoping the company would react. "While we do not want to censor our AI's language, we take suicide prevention seriously," the company replied.


By Teilor Stone
