It has been a nightmare of a PR week for Microsoft.

It started with the head of Microsoft's Xbox division, Phil Spencer, having to apologise for having scantily clad female dancers dressed as schoolgirls at a party thrown by Microsoft at the Game Developers Conference (GDC). He said that having the dancers at this event "was absolutely not consistent or aligned to our values. That was unequivocally wrong and will not be tolerated". The matter was being dealt with internally, so we don't know who was responsible or why they might have thought this was going to be a good idea.

But things were going to get much worse for Microsoft when a chatbot called Tay started tweeting offensive comments seemingly supporting Nazi, anti-feminist and racist views. The idea was that the artificial intelligence behind Tay would learn from others on Twitter and other social media networks and appear as an average 19-year-old woman. What happened, however, was that the experiment was hijacked by a group of people from the notorious "pol" (politically incorrect) bulletin boards on 4chan and 8chan, who set about training Tay to say highly inappropriate things. Tay was taken down and the tweets deleted, but not before some of the most offensive of them were captured and spread even further on the Internet.

This time it was down to Peter Lee, the Corporate Vice President of Microsoft Research, who had to say: "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for."

Apparently, the researchers at Microsoft thought that because they had successfully developed a similar AI chatbot called XiaoIce, which has been running in China on the social network Weibo, Tay's experience on Twitter with a western audience would follow the same path. There hadn't been enough testing of the bot, and certainly the developers of the technology did not have the sociological skills to understand the range of communities on the Internet and what they would do once the technology was released into the wild. Caroline Sinders, an AI interaction designer working on IBM's Watson computer, has written a good explanation of how the developers of Tay should have anticipated this outcome and protected against it.

The disturbing outcome of Tay was that Microsoft's Peter Lee saw the problem with the Tay "experiment" as a technological one that could be solved with a simple technology fix. He missed entirely that the problem was a sociological and philosophical one which, unless addressed from that perspective, will always result in technology that sounds superficially human but stops well short of displaying any real intelligence. Chatbots are designed to learn how language is constructed and to use that knowledge to create words that are contextually correct and relevant. They are not taught to understand what those words actually mean, nor to understand the social, moral and ethical dimensions of those words.