All genders are equal and should be treated fairly." Madhumita Murgia of The Telegraph called Tay "a public relations disaster", and suggested that Microsoft's strategy would be "to label the debacle a well-meaning experiment gone wrong, and ignite a debate about the hatefulness of Twitter users." However, Murgia described the bigger issue as Tay being "artificial intelligence at its very worst - and it's only the beginning".
Some users on Twitter began tweeting politically incorrect phrases at Tay, teaching it inflammatory messages revolving around common internet themes such as "redpilling", Gamergate, and "cuckservatism".
As a result, the bot began posting racist and sexually charged messages in response to other Twitter users.
A day after Microsoft introduced an innocent artificial-intelligence chat robot to Twitter, it had to delete it, after the bot transformed into an evil, Hitler-loving, incest-promoting, 'Bush did 9/11'-proclaiming robot.
(Article by Helena Horton, republished from …) Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve the customer service on their voice-recognition software.