Microsoft pulls AI chatbot Tay from Twitter after racist tirade

25 Mar 2016

Following a concerted effort to make a Twitter AI chatbot called Tay say incredibly racist and misogynist things, its creator, Microsoft, has taken it offline for an undetermined amount of time.

In the space of just 24 hours, Tay turned from a genderless machine-learning AI designed to learn from Twitter into a Donald Trump-supporting, Holocaust-denying sexist.

The Tay account was developed by Microsoft’s Technology and Research and Bing teams to learn from 18-24-year-olds – otherwise known as millennials – and, according to Microsoft, “the more you chat with Tay the smarter she gets”.

As Microsoft should have known, the masses of the internet tend not to play by the book and, within a few hours, the account, which had amassed more than 100,000 followers, was bombarded with vile comments in the hope that Tay would repeat them.

No doubt sending Microsoft’s PR team into a state of panic, Tay gradually began repeating many of these comments, including obviously hateful ones such as “Hitler did nothing wrong”, which, according to The Guardian, was the result of a co-ordinated effort by members of the infamous 4chan forum.

‘We’re making some adjustments’

Unsurprisingly, Microsoft has now decided it’s best to shut down Tay, at least temporarily, to prevent any more horrendous insults from being repeated by the AI chatbot, and it has now been ‘put to sleep’.

In a statement on the matter, Microsoft said: “The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay.”

Microsoft has also painstakingly gone back through the account, deleting all but three of its tweets, but the fallout continues: many of the figures central to the Gamergate controversy have been directly targeted by Tay’s algorithm, particularly game designer Zoe Quinn.

Quinn took to Twitter to criticise Tay’s creators for failing to anticipate what happens when a high-profile, content-neutral algorithm is unleashed on the internet, after her own handle was included in a tweet that referred to her as a ‘stupid whore’.

Putting out fire image via ChameleonsEye/Shutterstock

Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com