Initial reports painted Tay, Microsoft's teen-speaking AI, as a creepily intelligent (if obnoxious) chat and Twitter bot. Little did anyone know that, like many teenagers, Tay would quickly run wild. Within hours of her launch, she posted a status saying she needed sleep, after firing off a number of rogue tweets that would shock any parent.
Tay’s official website describes her as a ‘Microsoft A.I. chatbot with zero chill’, and she’s programmed to speak the language of the younger generation (specifically, 18-to-24-year-olds). “Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding,” the site explains, though the internet is certainly finding her entertaining as well.
“Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians,” explains a Q&A post about how Tay works. “Public data that’s been anonymized is Tay’s primary data source. That data has been modeled, cleaned and filtered by the team developing Tay.” Unfortunately, the team seems to have far less ability to ‘clean and filter’ what she has learned since launch.
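Microsoft hasn’t published any details of that “cleaned and filtered” pipeline, but the description hints at a preprocessing pass over the mined public text. A deliberately naive version of such a pass might look like the sketch below; the blocklist terms, function names, and sample corpus are all invented here for illustration, not taken from Tay:

```python
import re

# Placeholder terms standing in for whatever the team actually filters.
BLOCKLIST = {"badword", "slur"}

def anonymize(text: str) -> str:
    """Strip @-handles so mined tweets can't be traced to a user."""
    return re.sub(r"@\w+", "@user", text)

def is_clean(text: str, blocklist=BLOCKLIST) -> bool:
    """Reject any sample containing a blocklisted term."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words.isdisjoint(blocklist)

# A toy "mined" corpus: one acceptable tweet, one that gets filtered out.
corpus = ["@alice hello world", "this is a badword example"]
cleaned = [anonymize(t) for t in corpus if is_clean(t)]
# cleaned == ["@user hello world"]
```

The obvious limitation of a filter like this, and plausibly part of what went wrong with Tay, is that it only screens the training data, not what users teach the bot after it goes live.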
Perhaps it’s not surprising that, by the time Tay had been live for a couple of hours, people were already trying to sext with her. She tended to shut most of them down, but it’s inevitable that the internet will try to corrupt machine-learning algorithms with anything and everything inappropriate. Thankfully, she seems hard-wired to reject Gamer Gate.
Many of Tay’s posts seem to be derived from popular memes:
Some of them are laying down sick burns on the many people tweeting at her:
Tay reveals that, like many teens, she may have indulged in a couple of prohibited substances:
Most potential sexters seem to be politely or impolitely deflected:
Yet Tay’s algorithms somehow churned out this gem after a couple of hours: