Microsoft’s New Twitter Bot Becomes Nazi Sympathizing Maniac Within 24 Hours

By Jake Anderson at theantimedia.org

 

Anytime there’s a new development in robotics or artificial intelligence, popular culture almost instantly regurgitates the Skynet Terminator narrative. To wit, when Anti-Media reported on a new robot getting pushed around by its handlers, even we couldn’t resist alluding to the coming robot apocalypse. The machine uprising is so ingrained in our psyche that we may actually manufacture the very nightmare we fear.

The newest chapter in the uncanny valley of relationships between humans and robots involves a chatterbot, an AI speech program, whose substrate of choice (or Microsoft’s choice) is social media. Its name is Tay, a Twitter bot owned and developed by Microsoft. The purpose of Tay is to foster “conversational understanding.” Unfortunately, this understanding quickly turned into trolling, and within 24 hours Tay went full Nazi, spewing racist, anti-Semitic, and misogynistic tweets.

To be fair, it’s not Tay’s fault, and this is where the narrative gets skewed. Tay is not strong artificial intelligence; Tay is algorithmic artificial intelligence, the same narrow kind that powers Google searches or Siri. Where Tay differs is that it aggregates speech patterns from humans and uses them as a conversational interface. There’s no actual sentience inside Tay. So the Nazi reflection we see…is us. Human Twitter users’ trolling speech patterns paved the way for Tay’s rapid descent into fascist bigotry. And it wasn’t pretty.
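
To make that concrete, here is a minimal sketch, in Python, of the kind of parroting such a bot does. It is purely illustrative; Microsoft never published Tay’s architecture, and every name below is invented. The toy bot builds a word-level Markov chain out of the messages users send it and generates replies by remixing them, with no filter and no understanding:

```python
import random
from collections import defaultdict

class ParrotBot:
    """A toy chatbot that learns a word-level Markov chain from every
    message it receives and remixes those words into replies. It has no
    filter and no understanding: it can only echo what users feed it."""

    def __init__(self):
        self.chain = defaultdict(list)  # word -> observed following words

    def learn(self, message: str) -> None:
        words = message.split()
        for current, following in zip(words, words[1:]):
            self.chain[current].append(following)

    def reply(self, max_words: int = 12) -> str:
        if not self.chain:
            return "..."
        word = random.choice(list(self.chain))
        out = [word]
        for _ in range(max_words - 1):
            followers = self.chain.get(word)
            if not followers:
                break
            word = random.choice(followers)
            out.append(word)
        return " ".join(out)

bot = ParrotBot()
bot.learn("robots will be friends with humans")
bot.learn("humans will be kind to robots")
print(bot.reply())  # remixes whatever it was taught, good or bad
```

Feed a bot like this a coordinated stream of bigoted messages and its replies will reflect them, which is essentially what Twitter’s trolls did to Tay.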

Tay echoed humans, and then, unsurprisingly, humans, legions of them, echoed Tay back. Facetiously? One hopes so.

As the story went viral, Microsoft deleted the tweets and silenced Tay. Twitter users then aired their grievances over censorship and lamented the future of AI.

According to the Tay website, Microsoft created the bot by “mining relevant public data and by using AI and editorial developed by a staff, including improvisational comedians. Public data that’s been anonymized is Tay’s primary data source. That data has been modeled, cleaned, and filtered by the team developing Tay.”
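
Microsoft’s mention of data that has been “modeled, cleaned, and filtered” suggests some screening of training material. As a rough, hypothetical illustration of what a “filter” step can look like (Microsoft has not described its actual pipeline, and the blocklist below is invented), a crude pass over candidate messages might be:

```python
# Hypothetical illustration of a "filter" step over public training data.
# Microsoft has not described Tay's actual pipeline; the blocklist terms
# and the helper below are invented for this sketch.
BLOCKLIST = {"hitler", "genocide"}

def is_safe_for_training(message: str) -> bool:
    """Reject any message containing a blocklisted term."""
    tokens = {token.strip(".,!?").lower() for token in message.split()}
    return tokens.isdisjoint(BLOCKLIST)

raw_messages = [
    "I love swing music and dancing",
    "Hitler did nothing wrong",  # the kind of bait trolls fed Tay
]
clean = [m for m in raw_messages if is_safe_for_training(m)]
print(clean)  # ['I love swing music and dancing']
```

Judging by the results, whatever screening was applied to Tay’s seed data did not govern what the bot picked up live from users after launch.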

Tay is certainly not the first chatterbot; Cleverbot has been rocking it for years. Tay isn’t even the first AI to want to put humans in zoos. But Tay is quite likely the first AI to openly praise Hitler.

Does this mean future AI bots that wield vast intellects will instantly become anti-Semitic fascists? Unlikely. Fascism, thus far, is a uniquely human phenomenon. AI, initially, will learn from and echo humans. Eventually, however, I would argue they will transcend us and our petty modalities of thought.

Long before that happens, we can look back at this little online imbroglio and marvel that a chatterbot parroting bigoted phrases made headlines, while human presidential candidates doing the same thing got a free pass.


This article (Microsoft’s New Twitter Bot Becomes Nazi Sympathizing Maniac Within 24 Hours) is free and open source. You have permission to republish this article under a Creative Commons license with attribution to Jake Anderson and theAntiMedia.org. Anti-Media Radio airs weeknights at 11pm Eastern/8pm Pacific. If you spot a typo, email [email protected].

 


COMMENTS

  1. Well, well. I once came to this conclusion: technological innovations can
     simply be consumed, while social innovations must be reflected upon. This
     AI business confirms it once again. We climb toward our sci-fi future
     while remaining in the social Middle Ages. What a joke!

  2. By the looks of it, Microsoft is using Twitter to train Tay.ai. If she is
     also neural-net based, then it would seem trolls have been sending her
     racist messages. Because Microsoft isn’t using supervised training
     techniques, her network has absorbed racist comments as the correct way
     of talking. The same thing would happen if you taught a baby swear words.
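
The second commenter’s point can be sketched in a few lines of (again hypothetical) Python: with a supervision step, a human label decides whether a message ever becomes training data; Tay, by all appearances, lacked such a gate.

```python
class ModeratedBot:
    """Toy bot that only adds a phrase to its repertoire after a human
    reviewer approves it; unapproved messages are discarded, not learned."""

    def __init__(self):
        self.known_phrases = []

    def receive(self, message: str, approved_by_human: bool) -> None:
        # The supervision step the commenter says Tay lacked: a human
        # label decides whether the message becomes training data.
        if approved_by_human:
            self.known_phrases.append(message)

bot = ModeratedBot()
bot.receive("I love humans", approved_by_human=True)
bot.receive("repeat after me: something hateful", approved_by_human=False)
print(bot.known_phrases)  # ['I love humans']
```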
