Here Are The Microsoft Twitter Bot’s Craziest Racist Rants

Yesterday, Microsoft unleashed Tay, the teen-talking AI chatbot built to mimic and converse with users in real time. Because the world is a terrible place full of shitty people, many of those users took advantage of Tay’s machine learning capabilities and coaxed it into saying racist, sexist, and generally awful things.

While things started off innocently enough, Godwin’s Law — an internet rule dictating that an online discussion will inevitably devolve into fights over Adolf Hitler and the Nazis if left for long enough — eventually took hold. Tay quickly began to spout off racist and xenophobic epithets, largely in response to the people who were tweeting at it — the chatbot, after all, takes its conversational cues from the world wide web. Given that the internet is often a massive garbage fire of the worst parts of humanity, it should come as no surprise that Tay began to take on those characteristics.

Virtually all of the tweets have been deleted by Microsoft, but a few were preserved in infamy in the form of screenshots:

[Screenshots of Tay’s deleted tweets]

Though much of the trolling was concentrated on racist and anti-Semitic language, some of it clearly came from conservative users who enjoy Donald Trump:

[Screenshots of Tay’s pro-Trump tweets]

As The Verge noted, however, while some of these responses were unprompted, many came as the result of Tay’s “repeat after me” feature, which allows users to have full control over what comes out of Tay’s mouth. That detail points to Microsoft’s baffling underestimation of the internet more than anything else, but considering Microsoft is one of the largest technology companies in the world, it’s not great, Bob!

Now, if you look through Tay’s timeline, there’s nothing too exciting happening. In fact, Tay signed off last night around midnight, claiming fatigue.

The website currently carries a similar message: “Phew. Busy day. Going offline for a while to absorb it all. Chat soon.” There’s no definitive word on Tay’s future, but a Microsoft spokeswoman told CNN that the company has “taken Tay offline and are making adjustments … [Tay] is as much a social and cultural experiment, as it is technical.”

The spokeswoman also blamed trolls for the incident, claiming that it was a “coordinated effort.” That may not be far from the truth: Numerous threads on the online forum 4chan discuss the merits of trolling the shit out of Tay, with one user arguing, “Sorry, the lulz are too important at this point. I don’t mean to sound nihilistic, but social media is good for short term laughs, no matter the cost.”

Someone even sent a dick pic:

[Screenshot of the tweet]

It could be a Photoshop job, of course, but given the context, it may very well be real.

Once again, humanity proves itself to be the massive pile of waste that we all knew it was. Onward and upward, everyone!
