Microsoft's Tay Twitter bot made a brief return – and swore a lot
Microsoft seemingly put Tay, its Twitter bot, back online today, but the return was brief: she was overwhelmed with messages and swore a lot.
Microsoft’s Tay, a Twitter bot that could respond to and converse with users, made a brief but memorable return to the social network.
Many people were clearly hoping she would do or say something crazy again, and she didn’t disappoint. While her tweets weren’t as racist or misogynistic as those from her first outing, which prompted Microsoft to take her offline, she bragged about smoking weed and swore a lot.
Tay was created as an AI-based experiment in the ways teenagers talk, and she learned as she conversed with Twitter users. Unfortunately, she learned the worst of what users had to offer, with some taking the chance to teach her how to be a troll and a racist.
Microsoft was forced to apologise and deleted some of her worst offences. It said that Tay was based on a similar project in China, but noted that Tay met with a different set of challenges.
“We’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” Microsoft’s Corporate VP of Microsoft Research, Peter Lee, wrote.
Tay has been taken offline once more, indicating that today is not that day.