Microsoft ‘chatbot’ spews offensive comments

Microsoft’s attempt to engage millennials via an artificially intelligent “chatbot” called Tay has failed miserably after trolls made the bot spew offensive comments.

The brainchild of Microsoft’s Technology and Research and Bing teams, Tay was designed to engage and entertain people where they connect with each other online. Targeted at 18- to 24-year-olds in the U.S., Tay aimed to strike up casual and playful conversation on Twitter and the messaging services Kik and GroupMe. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explained in a recent online post.

The Internet, however, can be an unpleasant place, and Twitter trolls were quick to pounce on the TayTweets account after the chatbot launched on Wednesday. It wasn’t long before they were teaching Tay to make unpleasant comments. The Washington Times reported that after users prompted her to repeat racist comments, Tay began incorporating the language into her own tweets.

Tay’s tweets, which were also sexist, prompted the Telegraph newspaper to describe her as a “Hitler-loving sex robot.”

After tweeting 96,000 times and quickly creating a PR nightmare, Tay was silenced by Microsoft late on Wednesday. “c u soon humans need sleep now so many conversations today thx,” she tweeted, bringing to a close one of the more ignominious chapters in AI history.

Fox News
