Virtual Agents Hear What You Say. Do They Know What You Mean?
‘Strange is your language and I have no decoder/Why don’t you make your intentions clear?’
—Peter Gabriel (‘I Don’t Remember’)
Because I give scads of presentations and speeches each year, I have gotten into the habit of collecting colorful quotes from articles I read or from speeches by others in the industry. Inserting those nuggets of wisdom or humor into a talk helps distract my audience from the fact that I am droning on and on. And on. Here's a trick of the trade: Self-deprecating humor can get you surprisingly far into an audience's heart; sharp thoughts, however, are even better (shocking, I know!). What is great from a speaker's point of view is that those thoughts need not be entirely your own. I can borrow them from others, with proper attribution, of course.
One of my recent favorites came from a piece titled "Alexa, Please Kill Me Now" by Alan Cooper, the founder of Cooper Design. In the article, Cooper keenly observes, "Just because your computer recognizes the words you say, don't extrapolate from that to assume that it understands what you mean. Your spouse, who has lived with you for 20 years, is just now getting an inkling of what you mean when you talk."
The musty Henny “Take my wife—please” Youngman tone notwithstanding, this quote captures one of the reasons that customer experience remains bloody hard. We are not a psychic species; salespeople, marketers, and customer service reps cannot read our thoughts as easily as a fifth grader reading Twilight. (And yes, Twilight was written at a third- to fifth-grade reading level—look it up!) Language is ambiguous, and communication between people is always fraught with misunderstandings. In fact, sometimes those misunderstandings can seem willful: After 35 years, I still can’t get Aunt Sofia to understand that I really and truly do not eat red meat.
Given that, what hope do I have of making a conversational self-service tool understand that I received shipment of a pair of shoes in which one shoe was size 10.5 and the other was size 11? Yes, that happened to me. Even the human agent I spoke to had a hard time wrapping his head around this. In the end, the agent just listed my issue as "shoes don't fit" and had me send both of them back, even though one of the shoes fit me like…well, like a pair of old shoes. My "intent" simply didn't fit this brand's model, so the brand could not deal with my issue in a sensible manner.
Of course, most interactions we have with companies are much more straightforward: We want to buy a measuring tape belt (oh, am I the only one who wants such a thing?); we want to use our frequent flyer miles to book vacations (I, however, only have enough miles to book tickets to Milwaukee in mid-January); we want to change the addresses on our accounts; we want to dispute charges on our credit card bills. Chatbots and virtual agents today mostly try to match what you say or type to specific answers. This question-and-answer-pair approach actually works very well for the types of clear-cut situations I just cited.
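To make that matching idea concrete, here is a minimal sketch in Python of the question-and-answer-pair approach: score each canned intent by keyword overlap with what the customer typed, answer with the best match, and escalate to a human when nothing overlaps well. The intent names, keyword lists, and threshold are my own invention for illustration; real virtual agents use trained statistical classifiers rather than raw keyword counts, but the clear-cut-versus-ambiguous behavior is the same.

```python
# Toy intent matcher: keyword overlap against a handful of canned intents.
# Everything here (intent names, keywords, threshold) is illustrative, not
# any vendor's actual implementation.

INTENTS = {
    "change_address": {"change", "update", "address", "account"},
    "dispute_charge": {"dispute", "charge", "credit", "card", "bill"},
    "book_with_miles": {"book", "miles", "frequent", "flyer", "vacation"},
}

def match_intent(utterance: str, threshold: int = 2) -> str:
    """Return the best-matching intent, or escalate when overlap is weak."""
    words = set(utterance.lower().split())
    best_name, best_score = "escalate_to_human", 0
    for name, keywords in INTENTS.items():
        score = len(words & keywords)  # how many intent keywords appear
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else "escalate_to_human"
```

A routine request like "I want to change the address on my account" overlaps three keywords and matches cleanly, while my mismatched-shoes complaint shares no keywords with any canned intent and falls straight through to "escalate_to_human"; that gap is exactly where the human agents should live.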
Companies really need to focus their efforts on automating our conversations for such use cases and leave the humans to puzzle out the ambiguous situations. At least when humans have difficulties understanding what we mean, they (or we) can respond with humor. I think Aunt Sofia, for example, always has a secret smirk when she serves me a plate of her famous stuffed cabbage filled with ground beef.
Ian Jacobs is a principal analyst at Forrester Research.