A chatbot or voice bot, as the name suggests, is a virtual 'bot': a computer program that conducts a conversation. Powered by artificial intelligence (AI) and natural language processing (NLP), these bots are designed to simulate human conversational patterns, and they are used in a variety of practical scenarios such as customer service.

As mentioned, some chat and voice bots use sophisticated underlying technology such as AI and NLP. Simpler ones scan for keywords in a pre-defined database and extract the reply with the closest matching keywords or wording pattern. Depending on how they are configured, bots can differ widely in functionality and conversational quality. Here's a breakdown of what differentiates one voice/chatbot platform from another. In other words, what voice bot features ensure smooth conversation?

1. The ability to understand meaning: Intent Analysis

This is perhaps the most essential criterion. Meaning and comprehension are everything in a conversation. Sadly, as human beings, we aren't always literal or to the point when we speak. Given this inherent human quirk, a bot should be able to use its 'Intent Analysis' (super)power to extract the meaning behind our words. For example, the intent behind a "sure" or "why not" is most likely a "yes". Conversely, "I'm good", "later", and "not really" likely translate to a "no". These simple examples show that a chatbot's job is rather nuanced.

Today, as companies rely more and more on technology to ease the burden on human labor, chatbots and voice bots are common. As a result, these bots need to ensure that nothing is lost in translation! To enable a successful transition to automation, voice bots need to be smart enough to carry out convincing conversations. They need to understand whether they've reached the right person, gauge that person's sentiment, and act accordingly (e.g.
schedule a later call, transfer the call to another department or agent, etc.). Here are some scenarios to provide more context:

Scenario 1: The customer who receives the call is unable to speak and hangs up after a few words.

Bot: "Hello, this call is regarding your credit card payment for January 2019. Am I speaking with Mr. X?"

The answer to this seemingly simple question isn't as straightforward as we may think. Responses can range widely:

- Curse word(s)
- "I'm busy."
- "I'm driving. Call me later."
- "That's my dad. He's not home."
- "Call later."
- Or, hopefully, "Yes, speaking."

In all the above responses (except the last one), the bot's task is tricky. It needs to discern that the caller is unavailable for some reason and that the call needs to be rescheduled. A smart bot would politely thank the person on the other end and end the call. The bot is mindful of the customer's time, and in the back end it will schedule a follow-up call a few hours later.

Scenario 2: The customer's answer has multiple steps

Bot: "Hello, this call is regarding [...]
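To make the idea of intent analysis concrete, here is a minimal sketch of keyword-based intent matching, the simpler approach described earlier (the phrase lists, intent names, and scoring are invented for illustration; real voice bots use trained NLP models rather than hard-coded lists):

```python
# Minimal keyword-based intent matcher (illustrative sketch only).
# Phrase lists and intent labels are hypothetical examples, not a real bot's config.

INTENT_PHRASES = {
    "yes": ["yes", "sure", "why not", "speaking", "of course"],
    "no": ["no", "i'm good", "not really", "nope"],
    "reschedule": ["call me later", "i'm busy", "i'm driving", "call later", "not home"],
}

def classify_intent(utterance: str) -> str:
    """Return the intent whose phrase list best matches the utterance."""
    text = utterance.lower()
    best_intent, best_score = "unknown", 0
    for intent, phrases in INTENT_PHRASES.items():
        # Count how many known phrases appear in the caller's words.
        score = sum(1 for p in phrases if p in text)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent
```

A matcher like this would map "I'm driving. Call me later." to a reschedule action, while "Yes, speaking." lets the call proceed — the nuance the scenarios above describe.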
About Sudhamay Maity

Sudhamay is a project manager at Ozonetel Communications. He is our go-to person for all things AI.
At the beginning and end of every call, your Speech Analytics system displays customer mood on your agent dashboard. How does a machine understand whether your customer is happy or not? It uses two things:

1. Natural Language Processing
2. Sentiment Analysis

Natural Language Processing. Conventionally, people used programming languages to "speak" to computers. But now we see Alexa, Siri, and Cortana following instructions we give in our "natural" language. This is thanks to NLP. NLP, or Natural Language Processing, is the ability of a computer program to understand natural human language. It is an aspect of Artificial Intelligence.

Sentiment Analysis. People don't just communicate information using language; we communicate emotions too. Sentiment Analysis is a layer placed over Natural Language Processing, where a program is "trained" to understand the sentiment in a text passage.

How does this work?

The short explanation

Very simply speaking, your Speech Analytics system first transcribes your call recordings to text. It then scans for positive and negative words, relating each word to the surrounding five words and using them as context to understand word meanings. Like humans, machines get better and better at understanding language through exposure and experience. So the very short explanation is: your Speech Analytics system can tell if your customer is happy or not because of experience.

The slightly longer explanation

What we mean by experience is that your Speech Analytics system has been exposed to billions and billions of words, until it has learnt to understand meanings. This is similar to how a well-trained, educated, and possibly middle-aged person gets good at understanding what others mean to say. It learns to understand slang, humor, and sarcasm. For example, it learns to differentiate this sentence:

That is so sick!

From this sentence:

I'm sick of you.

It also assigns intensities. For example, in this sentence:

I am very unhappy with your service.
SA will consider this sentence negative because of the presence of "unhappy", assign it an intensity of +2 due to the presence of "very", and then give you an aggregate. For example, in this sentence:

I am happy with your service but your product is too costly.

SA will be trained to treat "costly" as negative and add extra intensity for "too". But it will also account for the positive emotion in "happy". Then it will use appropriate aggregators, like sums or averages, to assign a final value to the sentiment.

How accurate is SA at analyzing sentiment?

SA is more accurate than humans. BUT, this statement comes with two disclaimers.

Disclaimer 1: SA is only more accurate than humans over large quantities of data that have to be analyzed within tight deadlines. This is because humans get bored with repetitive tasks. Pressure to meet deadlines, tiredness, and boredom can all affect human accuracy.

Disclaimer 2: SA is not 100% accurate. On average, SA is 60-85% accurate. But know this: within the professional setup of gauging sentiment from text, humans are not 100% accurate either. [...]
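The scoring steps described above (signed word scores, intensifiers, then an aggregate) can be sketched as a toy lexicon-based scorer. The word lists and weights below are invented for illustration; a real Speech Analytics system learns these from training data rather than using hand-written tables:

```python
# Toy lexicon-based sentiment scorer (illustrative only; lexicons and
# intensifier weights are made up, not taken from any real system).

POSITIVE = {"happy": 1, "great": 2, "good": 1}
NEGATIVE = {"unhappy": -1, "costly": -1, "bad": -1, "terrible": -2}
INTENSIFIERS = {"very": 2, "too": 2, "so": 1.5}

def sentence_score(sentence: str) -> float:
    """Sum signed word scores, scaling a word's score when an
    intensifier like 'very' or 'too' immediately precedes it."""
    words = sentence.lower().replace(".", "").replace(",", "").split()
    total = 0.0
    for i, w in enumerate(words):
        base = POSITIVE.get(w, 0) + NEGATIVE.get(w, 0)
        if base and i > 0 and words[i - 1] in INTENSIFIERS:
            base *= INTENSIFIERS[words[i - 1]]
        total += base  # the aggregate here is a simple sum
    return total
```

Run on the two example sentences, "I am very unhappy with your service" comes out more negative than a plain "unhappy" alone, and the mixed sentence nets out slightly negative because the intensified "too costly" outweighs "happy".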
Is the future becoming the present faster than ever before? It may just be. AI is revving up changes in the ever-evolving contact center, and by year-end you can expect a complete upheaval in the way your contact center works and delivers customer experience. With such rapid changes on the horizon, it's important to stay informed. This article looks at the 4 latest disruptions in the contact center, and the quantum leaps they will trigger.

Goodbye IVR, Hello Voice Bots

A voice bot now greets, helps, and connects your customers faster and better than ever before.

Features to adopt right away: Smart IVR

It is your IVR that will transform into your voice bot. Smart IVRs are the first step in this evolution. As of today, your Smart IVR can:

- Recognize and respond to simple "yes"/"no" inputs.
- Ask for and understand verification details.
- Offer basic self-service: customers calling to check the ticket status, balance, or delivery status of their order just need to ask, and your Smart IVR will fill them in on all the details they need.

A few months down the line: It will be easier than talking to Google Voice, Alexa, or Siri. Your voice bot will answer calls and give your customers:

- Advanced self-help options.
- Quick answers to common technical problems and FAQs.
- An invisible interface. The menu will disappear. Customers will just state their problem, and the bot will route them to the best agent for the job.

Call Routers with HR Skills

Good call routing systems will consider more than just queues, wait times, and customer priority. Powered by AI, bots will also match on customer sentiment and agent capability.

Features to adopt right away: Speech Analytics & Sentiment Analysis

Start developing your AI's human-resource capabilities now.
Use these tools to develop new metrics for your call center, ones that focus on delivering finer customer experiences:

- Speech analytics is already capable of analyzing your agents' abilities (best technical skills, best at calming angry customers, best sales capabilities).
- Sentiment analysis gives an instant analysis of customer sentiment during the call. Keeping tabs on sentiment before and after calls will identify agents' adroitness at handling irate customers.

A few months down the line: We've already mentioned how customers will speak naturally to your voice bots about their requirements. Now they will also reach the right person faster with AI-powered call routing:

- Call routing will continue to consider customer priority, call waits, and agent queues.
- It will consider customer sentiment (angry, neutral, happy).
- It will consider customer history (priority customer, recurring complaint, impulsive shopper).
- And it will match to the best agent by department (sales, service, technical, support, escalation) and capability (can calm angry customers, is good at upselling, empowered for escalations, etc.).

Digital Assistants That Superpower Agents

Set a person to shovel snow with a spoon in their hand, and you won't even get a clear driveway. But seat them in a snow blower, and watch them blaze trails across [...]
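The sentiment- and capability-aware routing described above can be sketched as a simple scoring function. All field names, skills, and weights here are hypothetical, purely to illustrate how department match, agent skills, sentiment, and queue length might combine into one routing decision:

```python
# Hypothetical AI-assisted call-routing sketch (invented fields and weights,
# not a real routing engine's logic).

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    department: str
    skills: set          # e.g. {"calming", "upselling"}
    queue_length: int    # calls currently waiting for this agent

def route_call(agents, department, sentiment, needed_skill):
    """Pick the highest-scoring agent: right department, the skill the
    call needs, a calming skill for angry callers, and a short queue."""
    def score(agent):
        s = 0
        if agent.department == department:
            s += 10
        if needed_skill in agent.skills:
            s += 5
        if sentiment == "angry" and "calming" in agent.skills:
            s += 5
        s -= agent.queue_length   # prefer less-loaded agents
        return s
    return max(agents, key=score)
```

So an angry caller would be steered toward a support agent skilled at calming customers even if that agent has a slightly longer queue, while a routine call goes to whoever is free.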