Thursday, March 24, 2016

Microsoft’s Twitter robot denied the Holocaust – Svenska Dagbladet

Photo: Business Insider

Microsoft launched the Twitter chatbot Tay on Wednesday, Business Insider reports. According to the US tech giant, the goal was to “experiment and conduct research in understanding conversations.”

By reading other users’ language patterns, Tay was supposed to learn from them and be able to talk to them. But things went wrong very quickly. Within a few hours, Tay had written, among other things, that the Holocaust was fictional and that Mexicans should be exterminated, and had begun hitting on users in private messages:

Photo: Business Insider

Microsoft was forced to delete many of the most controversial tweets and responded to the criticism in an email to Business Insider:

– The robot Tay is a machine learning project, designed for human engagement. As she learns, some of her answers are inappropriate and reflect the interactions people have with her. We are now making adjustments to Tay.

