Tay, Microsoft’s Failed Twitter Chatbot, Has Come Back Online


‘You are too fast, please take a rest…’

Tay, the Microsoft Twitter chatbot who was discontinued after she began spouting bigotry, came back to life in the early hours of Wednesday morning. She appears to be making up for lost time, firing off dozens of largely nonsensical tweets in a matter of minutes.

Her return to sentience comes five days after Microsoft senior executive Peter Lee issued a statement apologizing for her behavior and saying Tay would be taken offline. The backstory: Microsoft launched the chatbot to learn conversational skills from internet users (specifically, millennials), but within hours, trolls had exploited that design to refashion Tay as a white supremacist mouthpiece. She ventured that the Holocaust was a fiction, blamed 9/11 on President George W. Bush, and called the sitting President a “monkey.”

“We’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” Lee wrote.

That time is now, apparently. At around 3 a.m. Eastern, @TayAndYou launched into a deluge of tweets, many of them repeating the same phrase: “You are too fast, please take a rest.” It looked like a sort of feedback loop — she was tweeting at accounts that had never tweeted at her, and she was tweeting at herself.


And after half an hour of this, she went quiet. Microsoft has not commented on the bot’s resurrection.

