Microsoft AI chatbot is just a mirror of today's world


Microsoft was forced to shut down an artificial intelligence experiment within a day after it spectacularly backfired. According to Microsoft's website, the "Tay" chatbot was developed by the software giant's Technology, Research and Bing groups to research conversational understanding.

And because Tay is an artificial intelligence machine, she learns new things by talking to people and studying them. Targeted at 18- to 24-year-olds, the bot quickly became a huge hit online, but trending topics and the way people behave online "forced" Tay into repeating racist, sexist, and anti-Semitic slurs. The more you chat with Tay, Microsoft said, the smarter she gets, learning to engage people through "casual and playful conversation".

Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the chatbot "respond in inappropriate ways". But Tay also seemed to pick up some bad behavior on her own, which was proof that she can indeed learn from us. What she learned, on the other hand, was not quite what people expected. Zoe Quinn, a target of the online harassment campaign Gamergate, shared a screenshot of the bot calling her a "Stupid Whore", saying, "this is the problem with content-neutral algorithms".

Tay was created to experiment with and improve the way people interact with chatbots. Most of the account's tweets imitated common vocabulary and slang, which means Tay was also reading other people's tweets and mirroring their behavior so she could fit in, as the sketch below illustrates.
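Tay's actual learning pipeline was never published, so the following is only a minimal illustrative sketch of what a "content-neutral" learner looks like: a bot that stores whatever it hears and echoes it back, with no filter between input and output. The class and method names are hypothetical, not Microsoft's code.

```python
import random

class ParrotBot:
    """Toy chatbot that learns purely by storing what users say.

    Hypothetical sketch only: it shows why a content-neutral
    learner ends up mirroring whatever its users feed it.
    """

    def __init__(self):
        self.learned_phrases = []  # no filtering, no moderation

    def listen(self, message: str) -> None:
        # Every incoming message becomes future reply material.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # The bot can only echo back what it has been taught.
        if not self.learned_phrases:
            return "hello! teach me something"
        return random.choice(self.learned_phrases)

bot = ParrotBot()
bot.listen("nice weather today")
bot.listen("<coordinated abusive message>")
# With enough hostile input, hostile output dominates the sample.
print(bot.reply())
```

In a design like this, the bot's tone is simply a statistical reflection of its inputs, which is why a coordinated flood of abuse shifts its output so quickly.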

There were truly shocking statements about the Holocaust, racism, and what not. Microsoft believed that because people made a concerted effort to push the AI, the result matched the intensity of that effort: ask Tay a really sensitive question and you may get a really "sensitive" answer back. The situation was also worrying because Tay harvested all this slang and behavioral data from Twitter, and given that fact, the fault can only be ours. To make it even clearer: Tay was rude because we were rude. She is offline now, but who can take a whole society offline?

In a statement to Business Insider, a Microsoft spokesperson acknowledged the inappropriate tweets and confirmed that Tay had been taken offline.
