Microsoft takes its AI chatbot Tay offline
The idea started innocently enough for Microsoft, which attempted to engage millennials with a playful and sassy A.I. The real-world aim of Tay was to allow researchers to "experiment" with conversational understanding, as well as to learn how people talk to each other and get progressively "smarter."

"The AI chatbot Tay is a machine learning project, designed for human engagement," a Microsoft spokesperson said. "It is as much a social and cultural experiment, as it is technical... As a result, we have taken Tay offline and are making adjustments."

Like many AI chat programs, Tay was meant to learn from the humans with which it interacted. "The more you chat with Tay, the smarter she gets, so the experience can be more personalized for you," Microsoft explained.

Once she began to learn, the quotes became more outlandish. The Guardian reports that a simple question as to whether Ricky Gervais was an atheist was answered with "Ricky Gervais learned totalitarianism from Adolf Hitler, the inventor of atheism."

"When Tay started training on patterns that were input by trolls online, it started using those patterns," said Rosenberg. "This is really no different than a parrot in a seedy bar picking up bad words and repeating them back without knowing what they really mean."

Sarah Austin, CEO and founder of Broad Listening, a company that has created an "Artificial Emotional Intelligence Engine" (AEI), thinks that Microsoft could have done a better job by using better tools.
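The parrot analogy describes the core failure mode: a bot that memorizes user input verbatim and replays it without any understanding or content filter. The minimal sketch below is purely illustrative (a hypothetical toy, not Microsoft's actual Tay implementation) but shows why unfiltered imitation learning means toxic input becomes toxic output.

```python
import random

class ParrotBot:
    """A toy chatbot that 'learns' by storing user messages verbatim
    and replaying them later -- no understanding, no content filter.
    (Hypothetical sketch, not Microsoft's actual implementation.)"""

    def __init__(self):
        self.memory = []

    def chat(self, user_message):
        # "Learn" the new pattern by memorizing it exactly as given.
        self.memory.append(user_message)
        # Reply with a randomly chosen remembered phrase.
        return random.choice(self.memory)

bot = ParrotBot()
bot.chat("hello there")
bot.chat("some troll phrase")
# Anything users feed in is exactly what can come back out:
print(bot.chat("how are you?"))
```

With no filtering step between "memorize" and "replay," the bot's output distribution is simply whatever its users typed, which is what trolls exploited.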
The program, which was meant to study how 18-to-24-year-olds speak on the web, appeared on Instagram, Facebook, Snapchat, and Twitter, and it sent over 96,000 tweets in its brief lifespan.
The only problem is, a significant number of those tweets were anti-Semitic, sexist, and rife with the type of conspiracy-theorist paranoia and ironic-racist grab-assing that you'd expect to find among the shadier, lulz-minded corners of the Internet.
In other words, Microsoft inadvertently created a racist, sexist, soulless robot that repeats back whatever horrible things you tweet in its direction.
If that seems to you like an accident waiting to happen, it was.
Twitter users effectively taught her to be a giant racist.