When I see my parents use computers, it’s clear that something gets lost in the communication between computer and person. I often think that a big part of this is a lack of understanding of bugs and exceptions. In a way, there is a lack of fault tolerance in how my parents interact with their computers. When an app crashes, an email doesn’t send, or a screen freezes, there’s a sense of bewilderment, even incredulity. But when you grow up natively with computers, your fault tolerance is higher, and you learn to navigate the bugs that inevitably arise because you understand at an almost innate level how they work.
The more I see new startups leveraging natural language processing, the more I can’t help but wonder whether “learning how to talk English to a machine” is the next area where this user experience gap will arise.
Thanks to NLP, we increasingly have the ability to talk to machines with our everyday language — we see it in examples like Siri, Google Now, Amazon’s Alexa, Clara Labs, x.ai, or any of the startups experimenting with AI-powered conversational UIs. And it’s increasingly with these conversational UIs that we’re subtly changing the way we speak in order to be able to talk to our machines.
First, there’s the utilitarian change. It’s amazing how many extra words we use to handle each other emotionally. When you’re talking to a machine, you don’t need those words. You’ll write differently.
Second, we’ll learn the best syntax to maximize getting what we want out of our bots — sometimes a different syntax for each bot.
Initially, these changes won’t be natural to those of us who have only spoken to other humans our entire lives. It’s an adjustment we’ll need to make, though of course it will be second-nature to kids being born right now.
For example, this Business Insider review of x.ai articulates one way in which our language will change as we “learn to talk machine”.
[When telling Amy, x.ai’s AI-powered assistant the author’s meeting preferences:] I felt compelled to be polite back. “Hi Amy, nice to meet you. I hate meetings before 11AM — so please try to avoid those as I am usually busy in the mornings. Thank you, Lara.”
I had to scold myself: She’s not real! Yet it felt odd constructing an email with only the most basic of information, eschewing any sort of salutation.
Odd at first, especially for those who don’t grow up with any AI bots as friends or assistants, but eventually we’ll get used to it.
We’ll develop a fault tolerance too. We’ve all been there before: you ask Siri or Google Now something and don’t get quite the answer you’re looking for. So you reframe, or you speak louder, or clearer, and bingo — you get what you’re looking for. Over time, you learn how to get what you want; you develop a fault tolerance.
I see this even more with startups working with limited datasets, which makes for a much rougher experience than what Apple and Google provide. The power users of these apps know how to talk to the machines behind them to get what they want, but a new user doesn’t. Of course, it’s the computer’s job to bridge this gap as much as possible, but there will always be bugs.
It’ll be funny to see, ten years from now, all the nostalgic blog posts that get written about how kids talk differently to each other because of all the bots they’ve grown up talking to.
So what’s it going to be? Time to learn how to talk machine or prepare to be bewildered?