Tay chatbot return sparks meltdown after ‘smoking kush’ tweet

30 Mar 2016


Only a matter of days after the Tay chatbot was taken off Twitter for being taught to be racist and misogynistic, its return to the big, bad world has sparked another meltdown and some virtual drug-taking.

Microsoft announced last week that it had developed a chatbot called Tay, which was created as a machine-learning algorithm to help future AI customer service bots learn the casual language of millennials on Twitter by having conversations with them.

Unfortunately for Microsoft’s PR people, the true nature of the internet was soon revealed: Tay’s machine-learning design meant it often repeated whatever was tweeted at it by prankster accounts, producing what can only be described as an unwittingly racist and misogynistic tirade that saw the bot ‘put to sleep’.

Communication breakdown, it’s always the same

While Microsoft apologised and said it would tweak Tay’s software to make it less susceptible to falling into such traps, many might have been surprised to find that the company was willing to unleash the bot once again only a few days after the PR disaster.

Unsurprisingly, the concerted effort by the likes of the online forum 4chan to destroy such corporate ventures reared its head once again, according to The Guardian.

Switched on overnight, Tay at first appeared to be acting in a much more civilised manner, holding conversations without unknowingly repeating vile comments. That was, until it had what is being described as a meltdown reminiscent of RoboCop in the film of the same name.

In the tweet that led to it once again making headlines, Tay tweeted the apparent fact that it was “smoking kush [weed] in front of the police”.

This tweet was followed by thousands more, sent to everyone who tweeted at the @TayandYou account, each carrying the simple message: “You are too fast, please take a rest…”

After that, it seemed the only thing needing a prolonged rest was Tay, with the account once again being taken offline and now hidden behind an added privacy wall.

What Microsoft plans to do with its renegade AI remains to be seen, as its programmers are likely facing questions over why Tay was rushed back out so quickly after its less-than-smooth introduction last week.

Overheating robots image via kristopher chandroo/Flickr

Colm Gorey is a journalist with Siliconrepublic.com

editorial@siliconrepublic.com