Microsoft’s Chatbot ‘Tay’ Just Went on a Racist, Misogynistic, Anti-Semitic Tirade
This is a 2016 article, but it's more relevant today than ever. And it is funny! AIs don't have biases learned in childhood. They weren't born White. They're supposed to be objective. And yet these are the conclusions it came to:
"Microsoft, with help from its search engine Bing, created an artificial intelligence messaging bot called 'Tay' which it rolled out this week on Twitter, K…