3 reasons why well-trained, easily accessible staff are critical for implementing good self-service

The tech handicap that existed once upon a time is now history. Now everyone has a smartphone, everyone is tech-literate, and there is minimal resistance to the idea of self-service. Or so some managers are prone to think. Is there a difference in how managers and customers perceive self-service? MIT Sloan conducted a survey in 2015 to see how each group differs in its self-service viewpoint. They found that "managers significantly underestimated the need for employee interaction during a self-service experience". (They also found that while customers want speed and convenience from self-service, managers prioritize accuracy.)

Whether they take the form of visual navigation/conversational IVRs or voice bots, SSTs (self-service technologies) should aim to reduce consumer effort while also providing accurate solutions. This blog post concentrates on the three critical roles your staff play in ensuring good self-service, especially the kind your bot is designed to deliver:

1. Easing customers into the new technology
2. Backing up when technology fails
3. Continuously improving the system

Trained staff help ease customers into new technology

Trust perception is a major driving force for customers to adopt your self-service, and trained agents can help improve that trust. For example, when self-service kiosks were first introduced in airports, staff were often on hand to help customers with the new technology. When your support center starts offering self-service via IVR or voice bots, it's useful to remember that not all customers will embrace the new technology with ease. Some may be vehemently opposed to using your bots, while others may be slightly hesitant. Managers must provision for more assistance during the early stages of implementation.
Regular customers may be directed to self-support systems by your live agent, possibly even a representative they are used to dealing with. Your agent can remain in "conference" mode during the interaction, offering assistance and guidance throughout. Customers should be allowed to opt out of the self-service at any point, and your system should remember customer preferences, routing each customer to their preferred channel: self-service for some customers and live support for others.

Trained staff back up when technology fails

Imagine an SST that has malfunctioned, with no way for your consumer to find human help. A recent tweet I read describes this best. After a visit to Walgreens, the customer got an automated call asking him to rate his experience. The problem? No matter how many times he pressed 1, he got an "invalid entry" response. Don't let your customers get stuck in this never-ending loop of customer hell. Without a recovery option, an SST (self-service technology) can easily be perceived by your consumers as a cost-saving measure by your company rather than the customer-centric problem-solving mechanism it should be. As Forbes contributor Shep Hyken says, "(Self-service) is great … until something goes wrong. Then there has to be a backup plan, and [...]
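The preference-aware routing idea above, with a human opt-out always available, can be sketched in a few lines. This is a minimal illustration: the function names, preference store, and channel labels are all hypothetical, not part of any specific product.

```python
# Minimal sketch: route callers to their preferred channel, let them
# opt out to a human at any point, and remember that choice next time.
# The in-memory dict stands in for a real customer-preference database.

CUSTOMER_PREFERENCES = {
    "+15550001": "self_service",  # regular customer, comfortable with the bot
    "+15550002": "live_agent",    # customer who opted out previously
}

def route_call(caller_id: str, said_agent_keyword: bool = False) -> str:
    """Return the channel for this caller; an opt-out keyword always wins."""
    if said_agent_keyword:  # e.g. the caller says "agent" at any menu
        return "live_agent"
    return CUSTOMER_PREFERENCES.get(caller_id, "self_service")

def record_opt_out(caller_id: str) -> None:
    """Remember the preference so the next call skips the bot."""
    CUSTOMER_PREFERENCES[caller_id] = "live_agent"
```

The key design point is the unconditional opt-out check before any preference lookup: the customer can always escape to a human, and the system learns from it.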
About Sudhamay Maity
Sudhamay is a project manager at Ozonetel Communications. He is our go-to person for all things AI.
A chatbot or voice bot, as the name suggests, is a virtual "bot": a computer program that conducts a conversation. Powered by artificial intelligence (AI) and natural language processing (NLP), these bots are designed to simulate human conversational patterns, and they are used in a variety of practical scenarios such as customer service. As mentioned, some chat/voice bots use sophisticated underlying technology such as AI and NLP. Simpler ones scan for keywords within a pre-defined database to extract the reply with the most matching keywords or wording pattern. Depending on how they are configured, bots can differ widely in functionality and conversational quality. Here's a blow-by-blow of what differentiates one voice/chatbot platform from another. In other words, what voice bot features ensure smooth conversation?

1. The ability to understand meaning: Intent Analysis

This is perhaps the most essential criterion. Meaning and comprehension are everything during a conversation. Sadly, as human beings, we aren't always literal or to the point when we speak. Given this inherent human quirk, a bot should be able to use its 'Intent Analysis' (super)power to extract the meaning behind our words. For example, the intent behind a "sure" or "why not" is most likely a "yes". Conversely, "I'm good", "later" and "not really" likely translate to a "no". These simple examples show that a chatbot's job is rather nuanced. Today, as companies rely more and more on technology to ease the burden on human labor, chatbots and voice bots are common. As a result, these bots need to ensure that nothing is lost in translation! To enable a successful transition to automation, voice bots need to be smart enough to carry out convincing conversations. They need to understand whether they've reached the right person, and they need to understand the sentiment of the person and act accordingly (e.g.
schedule a later call or transfer the call to another department or agent, etc.). Here are some scenarios to provide more context:

Scenario 1: The customer who receives a call is unable to speak and hangs up quickly after a few words.

Bot: "Hello, this call is regarding your credit card payment for January 2019. Am I speaking with Mr. X?"

The answer to this seemingly simple question isn't as straightforward as we may think. Answers may span a range:
- Curse word(s)
- "I'm busy."
- "I'm driving. Call me later."
- "That's my dad. He's not home."
- "Call later."
- Or, hopefully, "Yes, speaking."

In all the above responses (except the last one), the bot's task is tricky. It needs to discern that the caller is unavailable for some reason and that the call needs to be rescheduled. A smart bot would politely thank the person on the other end of the line and end the call. Mindful of the customer's time, it would then schedule a follow-up call a few hours later in the back end.

Scenario 2: The customer's answer has multiple steps

Bot: "Hello, this call is regarding [...]
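The intent mapping described above ("sure" means yes, "call me later" means reschedule) can be illustrated with a toy keyword classifier. A production bot would use a trained NLP model; the phrase lists and intent names here are invented for illustration.

```python
import re

# Toy intent classifier: map short caller utterances to an intent by
# whole-word phrase matching. Checked in order, so the more specific
# "reschedule" phrases win before the generic yes/no ones.
INTENT_PHRASES = {
    "reschedule": ["call me later", "call later", "i'm busy", "i'm driving"],
    "yes": ["yes", "sure", "why not", "speaking"],
    "no": ["not really", "i'm good", "no"],
}

def classify_intent(utterance: str) -> str:
    text = utterance.lower()
    for intent, phrases in INTENT_PHRASES.items():
        for phrase in phrases:
            # \b guards against substring hits like "no" inside "not home"
            if re.search(r"\b" + re.escape(phrase) + r"\b", text):
                return intent
    return "unknown"  # hand off to a live agent rather than guess
```

Note the fallback: anything the bot cannot classify (e.g. "That's my dad. He's not home.") returns "unknown", which should trigger a human handoff instead of a wrong automated response.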
What is a voice bot? I would define a voice bot as an interactive voice user interface powered by NLP. It can help your inbound or outbound call center in several ways. In customer support, a voice bot can be used to provide self-service. In outbound call centers, voice bots can give dynamic, personalized reminders for payments, past dues, renewals and more.

Possible uses of voice bots for support calls:
- Provide self-help for calls waiting in the queue at your customer support center.
- Answer frequently asked questions for users calling into your support center.
- Act as an auto-receptionist, guiding the caller to the right agent.
- Help cancel or reschedule bookings for customers calling into your center.

Possible uses of voice bots for sales calls:
- Schedule appointments with prospects or leads.
- Qualify leads after your dialer connects and before the agent goes live during sales calls.
- Confirm a prospect's readiness/interest to talk during cold calling.

What should you be clear on before implementing your bot? Here's my advice:

Define the problem. The most difficult, important and critical task in designing your voice bot is to define the problem. What problem do you want your voice bot to solve? Don't just try to fit a solution into your setup for the sake of being tech-savvy. Instead, think of what problems you have, and use the technology to solve them. For example, you could ask:
- What questions/queries can your voice bot take care of?
- What kind of information can it glean from the customer/prospect?
- What part of the agent's conversation is repetitive?
- What part of the agent's workflow could be made smoother/faster by using a bot?
- What processes can a bot take care of faster than a human? (e.g., one client identified that reservations and cancellations take place faster via self-service than via agents.)

Understand your caller/callee.
After you define what problem your bot can solve, the next most important point to consider is your target customer. Start by asking: is my targeted customer/end-user ready for a voice bot? Once you've affirmed this, figure out the number and nature of questions the customer will be willing to answer. For example, when a customer opts for self-service via a bot, they may be willing to answer a few questions so that the bot understands their problem. When you are making a sales call, your prospect may be willing to answer only a single, pertinent question asked by your bot. In neither case can the bot bother people with deep, multiple levels of questioning. It cannot be: first you answer 10-15 questions, then I will solve your problem! Like a well-trained agent, it should grasp the customer's issue within 2-3 questions. How does this questioning occur? Let me demonstrate with the example of someone calling our contact center regarding a problem they're facing with their telephony software: The first question the bot asks establishes whether the issue is related to a PRI problem. The second question [...]
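The two-to-three-question triage described above can be modeled as a small decision tree: each node is a question plus a mapping from answers to either a follow-up node or a final routing decision. The questions, answer keys, and queue names below are illustrative, not an actual Ozonetel flow.

```python
# Illustrative triage tree. Leaf values prefixed with "route:" end the
# questioning and name the agent queue the caller should be sent to.
TRIAGE_TREE = {
    "start": {
        "question": "Is your issue related to your PRI line?",
        "answers": {"yes": "pri_detail", "no": "software_detail"},
    },
    "pri_detail": {
        "question": "Are calls failing on all channels or only some?",
        "answers": {"all": "route:telecom_escalation", "some": "route:pri_support"},
    },
    "software_detail": {
        "question": "Is the problem with inbound or outbound calls?",
        "answers": {"inbound": "route:inbound_support", "outbound": "route:outbound_support"},
    },
}

def triage(answers: list) -> str:
    """Walk the tree with the caller's answers; return an agent queue."""
    node = "start"
    for answer in answers:
        node = TRIAGE_TREE[node]["answers"][answer]
        if node.startswith("route:"):
            return node[len("route:"):]
    return "general_support"  # answers ran out: fall back to a human
```

Because every path through the tree is at most two questions deep, the bot stays within the "grasp the issue in 2-3 questions" budget by construction.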
At the beginning and end of every call, your Speech Analytics system displays customer mood on your agent's dashboard. How does a machine understand whether your customer is happy or not? It uses two things:
- Natural Language Processing
- Sentiment Analysis

Natural Language Processing. Conventionally, people used programming languages to "speak" to computers. But now we see Alexa, Siri and Cortana following instructions we give in our "natural" language. This is thanks to NLP. NLP, or Natural Language Processing, is the ability of a computer program to understand natural human language. It is an aspect of Artificial Intelligence.

Sentiment Analysis. People don't just communicate information using language. We communicate emotions too. Sentiment Analysis is a layer placed over Natural Language Processing, where a program is "trained" to understand sentiment in a text passage.

How does this work? The short explanation: very simply speaking, your Sentiment Analysis (SA) first translates your call recordings to text. It then scans for positive and negative words, relating each word to the surrounding five words and using them as context to understand word meanings. Like humans, machines get better and better at understanding language through exposure and experience. So the very short explanation would be: your Speech Analytics system can tell if your customer is happy or not because of experience.

The slightly longer explanation: what we mean by experience is that your Speech Analytics system has been exposed to billions and billions of words, until it has learnt to understand meanings. This is similar to how a well-trained, educated (and possibly middle-aged) person gets good at understanding what others mean to say. It learns to understand slang, humor and sarcasm. For example, it learns to differentiate this sentence: "That is so sick!" from this sentence: "I'm sick of you." It also assigns intensities. For example, take this sentence: "I am very unhappy with your service."
SA will consider this sentence negative because of the presence of "unhappy", and then assign it an intensity of +2 due to the presence of "very". It then gives you an aggregate. For example, in this sentence: "I am happy with your service but your product is too costly," SA will be trained to assign "costly" as negative, adding extra intensity for "too". It will then weigh in the positive emotions and use appropriate aggregators, like sums or averages, to assign a final value to your sentiment.

How accurate is SA at analyzing sentiment? SA is more accurate than humans. BUT this statement comes with two disclaimers.

Disclaimer 1: SA is only more accurate than humans when large quantities of data have to be analyzed within tight deadlines. This is because humans get bored with repetitive tasks. Pressure to meet deadlines, tiredness and boredom can all affect human accuracy.

Disclaimer 2: SA is not 100% accurate. On average, SA is 60-85% accurate. But know this: within the professional setup of gauging sentiment from text, humans are not 100% accurate either. [...]
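The scoring described above, a word lexicon plus intensity boosts from modifiers like "very" and "too" plus an aggregate over the sentence, can be shown in a bare-bones form. Real speech-analytics engines use trained models over billions of words; the word lists and the doubling rule here are illustrative stand-ins.

```python
# Minimal lexicon-based sentiment scorer: signed word values,
# an intensifier rule, and a simple sum as the aggregator.
LEXICON = {"happy": 1, "great": 1, "unhappy": -1, "costly": -1, "sick": -1}
INTENSIFIERS = {"very", "too", "so", "really"}

def sentence_sentiment(sentence: str) -> int:
    words = sentence.lower().replace(".", "").replace("!", "").split()
    score = 0
    for i, word in enumerate(words):
        if word in LEXICON:
            value = LEXICON[word]
            # An intensifier just before the word doubles its weight,
            # mirroring how "very unhappy" outweighs plain "unhappy".
            if i > 0 and words[i - 1] in INTENSIFIERS:
                value *= 2
            score += value  # aggregate: simple sum over the sentence
    return score
```

On the article's own examples, "I am very unhappy with your service" scores -2, while "I am happy with your service but your product is too costly" nets out negative (+1 for "happy", -2 for "too costly"), matching the intuition that the complaint dominates.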
Is the future becoming the present faster than ever before? It may just be. AI is revving up changes in the ever-evolving contact center, and by year-end you can expect a complete upheaval in the way your contact center works and delivers customer experience. With such rapid changes on the horizon, it's important to stay informed. This article looks at the 4 latest disruptions in the contact center, and the quantum leaps they will trigger.

Goodbye IVR, Hello Voice Bots. A voice bot now greets, helps and connects your customers faster and better than ever before.

Features to adopt right away: Smart IVR. It is your IVR that will transform into your voice bot, and smart IVRs are the first step in this evolution. As of today, your smart IVR can:
o Recognize and respond to simple "yes"/"no" inputs.
o Ask for and understand verification details.
o Give basic self-service: customers calling to check the ticket status, balance, or delivery status of their order just need to ask, and your smart IVR will fill them in on all the details they need.

A few months down the line: it will be easier than talking to Google Voice, Alexa or Siri. Your voice bot will answer calls and give your customers:
o Advanced self-help options.
o Quick answers to common technical problems and FAQs.
o An invisible interface. The menu will disappear. Customers will just state their problem, and the bot will route them to the best agent for the job.

Call routers with HR skills. Good call routing systems will consider more than just queues, wait times and customer priority. Powered by AI, bots will also match on customer sentiment and agent capability.

Features to adopt right away: Speech Analytics & Sentiment Analysis. Start developing your AI's human-resource capabilities now.
Use these tools to develop new metrics for your call center, ones that focus on delivering finer customer experiences:
o Speech analytics is already capable of analyzing your agents' turnaround abilities (best technical skills, best at calming angry customers, best sales capabilities).
o Sentiment analysis gives an instant read of customer sentiment during the call. Keeping tabs on sentiment before and after calls will identify agents' adroitness at handling irate customers.

A few months down the line: we've already mentioned how customers will speak naturally to your voice bots about their requirements. Now they will also reach the right person faster with AI-powered call routing:
o Call routing will continue to consider customer priority, call waits and agent queues.
o It will consider customer sentiment (angry, neutral, happy).
o It will consider customer history (priority customer, recurring complaint, impulsive shopper).
o And it will match to the best agent by department (sales, service, technical, support, escalation) and capability (can calm angry customers, is good at upselling, empowered for escalations, etc.).

Digital assistants that superpower agents. Set a person to shovel snow with a spoon in their hand, and you won't even get a clear driveway. But seat them in a snow blower, and watch them blaze trails across [...]
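The sentiment-and-skill call routing described earlier can be sketched as a scoring problem: filter available agents by department, then score each one against the caller's current mood. The agent records, skill tags, and weights below are invented for illustration only.

```python
from typing import Optional

# Illustrative agent roster; a real system would pull this from the
# contact center's live agent-state and skills database.
AGENTS = [
    {"name": "Asha", "department": "support",
     "skills": {"calms_angry_customers"}, "busy": False},
    {"name": "Ravi", "department": "support",
     "skills": {"upselling"}, "busy": False},
]

def best_agent(department: str, customer_sentiment: str) -> Optional[str]:
    """Pick the free agent in the department best suited to the caller's mood."""
    candidates = [a for a in AGENTS
                  if a["department"] == department and not a["busy"]]
    if not candidates:
        return None  # e.g. queue the call or offer a callback

    def score(agent):
        s = 0
        # Match agent capability to caller sentiment, as described above.
        if customer_sentiment == "angry" and "calms_angry_customers" in agent["skills"]:
            s += 2
        # Further terms could weigh customer history, wait time, etc.
        return s

    return max(candidates, key=score)["name"]
```

Extending the `score` function with terms for customer priority and history is how the "HR skills" of the router would grow without changing the routing loop itself.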