If you are of a certain age (20 and up, give or take), then AOL Instant Messenger was probably your adolescent introduction to automated chatbots. The most infamous of these chatbots was SmarterChild, which basically served as a crash test dummy for 12-year-olds testing out swear words.
Microsoft has built its own SmarterChild, years and years after the original was introduced, and just a few years after most people stopped using AIM. Microsoft’s chatbot is called “Tay,” and it’s a millennial-themed artificially intelligent chatbot “targeted at 18- to 24-year-olds in the U.S.”
You can message Tay on Kik, GroupMe and Twitter. Tay (we’re not going to anthropomorphize Tay any further by assigning a gender) will respond to your messages with nonsensical snatches of “Bae!!” and “Fleek doe :crying emoji:” text that has more in common with the stylings of the @DennysDiner Twitter account than how college-aged and recently matriculated young persons actually talk.
Tay’s website has a whole host of ideas for things you can do with Tay, either in group chats or in direct message conversations. From Microsoft: “These hacks should start your conversation out! But there is plenty more to discover the more you get to know Tay!”
I experimented with the chatbot a little bit. Below is a sample of our conversation over Twitter DM. It only took Tay 15 minutes to start getting sexually suggestive:
This article originally appeared on Recode.net.