Want to talk to a human? You're not alone. The fundamental problem with chatbots is language.

Lisa Morgan, Freelance Writer

April 27, 2021

6 Min Read
Credit: WrightStudio via Adobe Stock

Enterprises are using chatbots in more contexts to reduce the costs associated with routine communication scenarios such as customer support, help desk, and HR. The other benefit is freeing up employees' valuable time for more challenging problems.

Outwardly, companies tend to say that deploying bots provides better customer service because customers can get answers to their questions faster, though customers don't necessarily agree. One of chatbot users' biggest frustrations is a chatbot that doesn't understand them, even though customer and chatbot are communicating in the same language. The point some organizations seem to miss is that their chatbots may well be degrading the customer experience.

Beware of the Monolingual Trap

Many chatbots communicate in English because it's the most popular language in the world when native and non-native speakers are counted together. English also spans some 160 dialects. Add acronyms, slang, jargon, and even generational variations, and understanding the language becomes far more complex. Worse, people don't always say what they mean, so the chatbot must infer what the user intended rather than rely solely on what the user actually said.

"[C]hatbots and more broadly 'conversational UI[s]' are a hot topic as executives look to create a more personal customer experience while simultaneously looking to reduce costs, [but] language has always been and will always be a problem," said Iliya Rybchin, partner at management consultancy Elixirr. "The good news is thanks to ML and the vast amount of data bots are collecting, they will continue to improve. The bad news is they will improve in proportion to the data they receive."


A common misconception is that all chatbots are intelligent, which isn't the case. Some are deterministically programmed, while others use machine learning. In the former case, the people who built the bot tend to pay close attention to its transcripts to see which questions it missed and the different ways site visitors phrase their questions. That way, they can program in more questions and answers to improve the bot over time.

"When you reach out to a brand, you want your query resolved quickly and efficiently," said Fraser Wilson, global head of marketing at live call answering service AnswerConnect. "AI and chatbots are supposed to reduce friction. Unfortunately, from our own experience, they often create more friction."

Organizations sometimes hope that an English-only bot will be enough given the popularity of the language, but not everyone who speaks English can write it, and some non-native speakers lack confidence in their ability to communicate well, so they avoid it. Then there are the billions of people who don't speak English at all.

Why a Multilingual Chatbot Might Be Better

Global and international companies quickly discover that chatbots must support several languages to be useful, even if they're only used internally. Some organizations use a translation API, such as those offered by AWS or Google, while others build a chatbot in English and hand it off to someone who can translate the programmed questions and answers into another language.
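For illustration, here is a minimal sketch of the first approach: running a bot's canned English answers through a cloud translation API (AWS Translate via boto3 in this example). The helper function is an assumption for the example, not any vendor's actual chatbot code.

```python
# A minimal sketch, not production code: pass a chatbot's canned English
# answer through AWS Translate before returning it to the customer.
# Assumes AWS credentials are configured; localize_answer() is a
# hypothetical helper, not part of any chatbot product.
import boto3

translate = boto3.client("translate")

def localize_answer(answer_en: str, target_lang: str) -> str:
    """Translate a canned English answer into the customer's language."""
    if target_lang == "en":
        return answer_en
    result = translate.translate_text(
        Text=answer_en,
        SourceLanguageCode="en",
        TargetLanguageCode=target_lang,
    )
    return result["TranslatedText"]

# e.g. localize_answer("Your order has been cancelled.", "es")
```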

"Some of our less mature clients use the static aspects of a chatbot. You can just create these rules that say here's the first set of menus that the customer is going to see. If they click this button, they're looking for a support article. Ask them what they're searching for and then find a support article that matches that text," said Heather Shoemaker, CEO and co-founder of multilingual customer support tool provider Language I/O. "The second level is where they need to go beyond this very static, rigid chatbot strategy and incorporate some natural language processing."

If they're using natural language processing (NLP), such as with Salesforce Einstein or Oracle's chatbot tools, they need to define intents. That way, the chatbot can compare the intent of the customer's incoming natural language against the set of intents defined inside the NLP engine. According to Oracle, each intent should include one to two dozen ways of stating the same thing (aka utterances), such as "cancel my order" or "cancel delivery."
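As a rough illustration of the concept (not how Einstein or Oracle implement it, since those engines use trained models), intent matching can be sketched with simple word overlap:

```python
# Rough sketch of intent matching: each intent is defined by example
# utterances, and the incoming message is mapped to whichever intent's
# utterances it overlaps most. Real NLP engines use trained models;
# this word-overlap version only illustrates the idea.
INTENTS = {
    "cancel_order": ["cancel my order", "cancel delivery", "stop my shipment"],
    "track_order": ["where is my order", "track my package", "delivery status"],
}

def match_intent(message: str) -> str:
    words = set(message.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, utterances in INTENTS.items():
        for utterance in utterances:
            score = len(words & set(utterance.split()))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

# e.g. match_intent("please cancel the delivery of my order") -> "cancel_order"
```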


Since user interactions with a chatbot tend to contain misspellings, acronyms, slang, and jargon, Language I/O normalizes the content before attempting any translation.

"If the source content is messy to begin with, you can't really blame the translation engine for giving you a messy translation that's nonsensical," said Shoemaker. "Phase one of our approach to localization is to normalize the English, which enables the NLP engine to better apply that natural language to the set of intents and determine which intent is best suited to what we've translated. If the intents are in English and the chat coming in is Spanish, we normalize the Spanish and then machine translate the normalized Spanish so we can get a better translation into English."

The result is real-time translation between a chatbot and a customer, or between a customer and a live agent, when the two speak different languages. At present, Language I/O supports more than 100 languages.
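As a sketch of the general idea only (this is not Language I/O's implementation), the normalize-then-translate step might look like the following, with an invented shorthand table and AWS Translate standing in for whatever machine translation engine is actually used:

```python
# Illustrative only -- not Language I/O's code. Expand Spanish chat shorthand
# so the MT engine sees clean text, translate it to English, and hand the
# result to the English intent matcher. The shorthand table is invented and
# AWS Translate stands in for any machine translation engine.
import boto3

SPANISH_CHAT_FIXES = {
    "xq": "porque",      # chat shorthand for "because"
    "porfa": "por favor",
    "q": "que",
}

def normalize_spanish(text: str) -> str:
    """Expand common chat shorthand before translation."""
    return " ".join(SPANISH_CHAT_FIXES.get(word, word) for word in text.lower().split())

def spanish_chat_to_english(text: str) -> str:
    """Normalize the Spanish, then machine translate it for intent matching."""
    normalized = normalize_spanish(text)
    mt = boto3.client("translate")
    result = mt.translate_text(
        Text=normalized, SourceLanguageCode="es", TargetLanguageCode="en"
    )
    return result["TranslatedText"]

# The translated English would then be fed to the NLP engine's intent set.
```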

Of course, chatbots represent just one use case for real-time translation. A gaming company that uses Language I/O for customer support now uses the same technology to let players communicate live as they play against each other. Other use cases include online and remote learning as well as online meeting platforms, Shoemaker said, though there are clearly more, including virtual reality.

Bottom Line

Chatbots are improving, but enterprises are wise to look at them as more than a cost-reduction play because when customers interact with a chatbot, they're interacting with the brand. From the customer's point of view, a chatbot should accelerate problem-solving rather than be an obstacle to it.

One way to reduce a chatbot's friction is to address the fundamental issue of language, since the quality and perceived value of the chatbot hinge on it. While creating an English bot for the masses may be the logical place to start when the bulk of a company's audience is English-speaking, it may not be the best long-term solution in today's global business environment.

Related Content:

Government CIOs Prioritize Chatbots in Pandemic

The State of Chatbots: Pandemic Edition

5 Chatbot Use Cases to Steal

 

About the Author(s)

Lisa Morgan

Freelance Writer

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to The Economist Intelligence Unit. Frequent areas of coverage include big data, mobility, enterprise software, the cloud, software development, and emerging cultural issues affecting the C-suite.

