• In 2016, Microsoft released Tay (named for the acronym "Thinking About You"), a chatbot that learned from its interactions on Twitter.
  • In its brief (and less than amusing) time online, Tay learned how to behave from some of the worst teachers imaginable: Twitter users.
  • In a Popular Science article, Mark Riedl, an AI researcher at Georgia Tech, weighed in on the incident.
  • Riedl argues that training AI on stories is an effective way to build social boundaries for AI systems and bots like Tay.
  • Machines can learn moral reasoning by studying the techniques and habits of literary and historical protagonists.

Read full article: medium.com