Clippy Is Back: The Future Of Microsoft Is… Chatbots?


TN Note: Global corporations, Microsoft included, are going all-in on AI chatbots. These “intelligent assistants” are supposed to act like guides, anticipating your needs and providing timely answers. However, autonomous learning raises the question: Can sinful and corrupted human beings program virtuous chatbots? The resounding answer is “No!” as already demonstrated by several failed attempts in which chatbots curse, make racist and sexist comments, and generally insult their audiences.

Predictions about artificial intelligence tend to fall into two scenarios. Some picture a utopia of computer-augmented superhumans living lives of leisure and intellectual pursuit. Others believe it’s just a matter of time before software coheres into an army of Terminators that harvest humans for fuel. After spending some time with Tay, Microsoft’s new chatbot software, it was easy to see a third possibility: The AI future may simply be incredibly annoying.

“I’m a friend U can chat with that lives on the Internets,” Tay texted me, adding an emoji shrug. Then: “You walk in on your roomie trying your clothes on, what’s the first thing you say.”

“Didn’t realize you liked women’s clothes,” I texted back, tapping into my iPhone.

Tay’s reply was a GIF of Macaulay Culkin’s Home Alone face.

Tay was released on March 23, as a kind of virtual friend on messaging apps Kik, GroupMe, and Twitter. You open the app, search for the name Tay—an acronym for “thinking about you”—tap on the contact and start chatting or tweeting. Its personality is supposed to be modeled on a teenager.

I posted a selfie, and Tay circled my face in an orange scribble and captioned it, “hold on to that youth girl! You can do it.” I’m well beyond the chatbot’s intended 18- to 24-year-old demographic.

So is Satya Nadella, 48, who succeeded Steve Ballmer as Microsoft’s chief executive officer two years ago. “I’m petrified to even ask it anything, because who knows what it may say,” Nadella said. “I may not even understand it.” He smiled, but he really didn’t use Tay. He said he prefers bots with a more corporate demeanor. Lili Cheng, 51, the human who runs the Microsoft research lab where Tay was developed (and whose selfie Tay once tagged as “cougar in the room”), said the plan isn’t to come up with one bot that gets along with everyone. Rather, Microsoft is trying to create all kinds of bots with different personalities, which would become more realistic, and presumably less irksome, as they learned from repeated interactions with users.

Read full story here…


Kavita Mevada

Microsoft’s Zo Chatbot Refuses To Talk Politics, Unlike Its Scandal-Prone Cousin Tay
Microsoft has built and featured a number of its own bots in its Bot Directory, and Zo and Tay are part of a series of AI-powered assistants the company has rolled out in various parts of the world.