Web Log: Social media teaches Microsoft AI important lesson

Chatbot’s spew of hate tweets shows tech only as good as people who program it

Microsoft’s chatbot named Tay began tweeting offensive comments after users bombarded it with negative interactions

When Microsoft unleashed an AI chatbot named Tay on Twitter for entertainment purposes, it didn't expect the bot to start spewing offensive hate tweets. As Peter Lee, vice president of Microsoft Research, explained: "We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience."

What Microsoft didn't anticipate was the onslaught of users training Tay to be a total jerk. AI needs social intelligence as much as any other cognitive skill: when users bombard it with negative interactions, it needs to evaluate those interactions and adapt appropriately.

Much of the media coverage of Tay is inclined to portray the chatbot as evidence of how AI could go horribly wrong and kill us all, but what it really shows is that technology is only as good as the people who program it. This will be an important lesson for Microsoft, because Tay – short for Taylor – could very well be a training tool for a future iteration of Cortana (voiced by actor Jen Taylor).

https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/