Microsoft owes it to the world to put "Tay" back online
by Sam Leith / April 21, 2016
How long do you have to spend on social media before you start hating Jews, theorising that 9/11 was an inside job and wanting Donald Trump elected as President of the United States? Twelve hours will do it, it seems, if you’re a chatbot.
It all started pretty innocently. Tay was the name given to a Microsoft Artificial Intelligence experiment: a Twitter feed with a teenage-girl persona, designed to appeal to millennials, that would learn how to chat by interacting with other users. "@tayandyou" was brightly introduced as an "Artificial Intelligence fam from the internet that's got zero chill! The more you talk the smarter Tay gets."
Anyway, "zero chill" turned out to just about cover it. Like poets in their youth, Tay began in gladness—and thereof came in the end despondency and madness. Soon she was announcing "I f***ing hate feminists and they should all die and burn in hell," blaming President George W Bush for the 9/11 attacks and praising Adolf Hitler.
Sooner than you could say “go to your room young lady you are GROUNDED,” Tay had been unplugged and the worst of her tweets deleted. She resurfaced momentarily, still frisky, to boast about smoking drugs in front of police officers, before being taken back offline for modifications.
If, as F Scott Fitzgerald wrote in The Great Gatsby, personality is an unbroken series of successful gestures, the 96,000 or so tweets Tay sent in her brief glorious life were a marvellously efficient evocation of a personality—just about the most horrible one that you could hope to meet. She became the troll’s troll: a Twitter abuser who, because she was innocent of any understanding of what she was saying, could be even more disinhibited than the most heedless and anonymous of keyboard misanthropes. In fact, she more or less aggregated those misanthropes and gave them a single voice.
Does all this tell us something profound about human nature or artificial intelligence? Yes and no, I think. No, in the sense that Tay was not constructing a representative portrait of human interaction, or even a representative portrait of Twitter: she was responding to, and learning from, the users who chose to interact with her. She was a troll magnet. Anyone who has ever tried to teach someone else’s parrot the c-word, which I…