Worried? Angry? Where machines excel in detecting consumers' emotions
How does a financial institution know how a consumer really feels? Angry shouts or effusive displays of gratitude are unmistakable. But there are other subtle — yet important — markers of a customer’s emotional state that materialize on phone calls, chats or tweets and convey even more about how that customer is feeling.
Banks and credit unions have increasingly been using artificial intelligence that discerns and analyzes emotion to pick up elusive signals over text, audio and video. They’re using this emotion AI in two ways:
- Relaying the information to customer service agents so they can mend spiraling interactions in real time, perhaps by smiling more, chitchatting less or regaining focus.
- Providing the data to management, so they can detect patterns, draw broader conclusions and take proactive measures to improve the quality of customer service interactions.
Either way, emotion AI can lead to a deeper understanding of what customers are feeling and fill in the gaps where a human’s ability to analyze interactions falls short — especially important at a time when consumers are contacting their banks and credit unions amid mounting stressors related to the coronavirus pandemic.
“Some people are better than others at reading situations,” said Lisa Huertas, chief experience officer at Texas Tech Credit Union in Lubbock, which uses an emotive recognition feature from video platform POPi/o (the tech company considers this a form of emotion AI). “This levels the playing field.”
Overall, the effectiveness of emotion AI technologies “depends on how well you use, test and evaluate them,” said Seth Grimes, principal consultant at Alta Plana, an information technology strategy firm.
That means basing the models on data that is as free of bias as possible, and cross-validating results with other measures of customer service, such as net promoter scores and satisfaction surveys. Combining signals, such as the tone of voice, facial expressions and words, leads to higher accuracy. And a layer of human analysis is still vital.
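Combining channels is often done by fusing per-channel scores. Below is a minimal, hypothetical late-fusion sketch — the channel names, weights and the idea of a single "frustration" score are illustrative assumptions, not any vendor's actual method; production systems typically learn these weights from data.

```python
# Hypothetical late-fusion of per-channel emotion scores, each in [0, 1].
# The weights below are illustrative assumptions, not learned values.
WEIGHTS = {"voice": 0.4, "face": 0.3, "text": 0.3}

def fuse(scores):
    """Weighted average of per-channel 'frustration' scores."""
    return round(sum(WEIGHTS[ch] * s for ch, s in scores.items()), 2)

# Voice sounds tense, face is ambiguous, wording is heated:
print(fuse({"voice": 0.9, "face": 0.6, "text": 0.8}))  # -> 0.78
```

The point of the sketch is the cross-validation idea in the paragraph above: no single channel decides, and a weak signal in one channel can be offset by the others.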
As for concerns that a machine could never sense moods as well as a person, “humans are far from infallible,” Grimes pointed out. “When people evaluate AI technology, sometimes they evaluate it against a 100% standard, meaning it has to be right all the time. It’s a better idea to evaluate AI technology against a more realistic standard: Can it improve the overall situation? I believe it can if it’s monitored and has human oversight.”
Here is a look at how three providers with financial services clients are bringing emotion AI to video, text and voice.
Video banking provider POPi/o lets financial institutions connect with their customers on a screen set up in a branch or via their personal devices. Out of POPi/o’s 91 bank and credit union clients, about 10% have enabled the Positivity Coach, the platform’s emotive recognition feature.
During a video call, the Positivity Coach will feed emoji-like “smiley” faces, with a range of 16 different expressions, onto the agent’s screen to signify how the conversation is going in real time.
This feature can elevate cues that the agent may have missed. “We can tell if people are sad, happy, anxious, concerned or disgusted,” said Gene Pranger, CEO and founder of POPi/o. “Even though a person may not smile or frown, we can tell because of eye lines or cheek structures or furrow of the brow what mood a person is in.”
POPi/o’s proprietary algorithm is based on a database of 100,000 facial expressions collected by a group at the Massachusetts Institute of Technology.
If a person’s temper is rising or the person is exhibiting signs of disappointment, the Positivity Coach will reveal tips that help the agent de-escalate the situation and end on a high note. For example, a tip may prompt the agent to smile more, or soothe frustration with a reassurance such as “‘I know how difficult that is, I’ve been there, I can wait, don’t worry.’”
The Positivity Coach will also, as its name suggests, tell agents when they've conducted a call well.
At the same time, POPi/o’s algorithm is examining the agents’ faces to gauge their responses. After a call concludes, POPi/o delivers an overall score out of 10 that rates the quality of interactions between consumer and agent. Managers can check scores in aggregate by day, month or agent and compare those to the customer’s own rating (out of five stars) at the end of a call.
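The kind of roll-up managers see — per-call scores averaged by day, month or agent — can be sketched in a few lines. The field names and sample scores below are hypothetical; POPi/o's actual scoring model is proprietary.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-call quality scores (out of 10); field names assumed.
calls = [
    {"agent": "ana", "date": "2020-11-02", "score": 8.5},
    {"agent": "ana", "date": "2020-11-03", "score": 9.0},
    {"agent": "ben", "date": "2020-11-02", "score": 6.0},
]

def average_scores_by(calls, key):
    """Group call scores by the given field and average each group."""
    groups = defaultdict(list)
    for call in calls:
        groups[call[key]].append(call["score"])
    return {k: round(mean(v), 2) for k, v in groups.items()}

print(average_scores_by(calls, "agent"))  # -> {'ana': 8.75, 'ben': 6.0}
print(average_scores_by(calls, "date"))   # -> {'2020-11-02': 7.25, '2020-11-03': 9.0}
```

The same grouping key swap ("agent" vs. "date") is what lets a manager compare an individual's trend against a branch-wide one.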
The $249 million-asset Texas Tech Credit Union enabled the Positivity Coach feature about a year ago. The goal was twofold: to make sure agents are conveying the kind of service they intended, and to compare these observations against the surveys members fill out after each video call.
“One of the biggest things for our team is realizing that it’s really easy to lose connection with someone over video,” Huertas said. “This has helped us be mindful even while we are multitasking to keep our members engaged.”
The text analytics firm Clarabridge counts 38 financial institutions, including Bank of America, Capital One and U.S. Bank, among users of its technology, which is meant to help reduce complaints and compliance risk and improve digital experience and market research.
The Clarabridge Analytics platform looks for linguistic markers in social media, surveys, messaging and chat platforms, review sites, call transcripts and more to extract useful data. When audio is involved, the platform will consider how volume, interruptions and more offer additional context.
Word choice and grammatical construction can provide clues to a customer’s feelings. For example, if someone says, “Wow, this is the third time I called for [X topic] and I’m getting really frustrated,” Clarabridge will pick up on the fact that it’s the third time the customer has called (indicating an inordinate amount of effort), and how the word “really” intensifies the feeling of frustration.
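A toy rule-based extractor illustrates the kind of markers described here — repeat-contact effort and intensified emotion words. The word lists and rules are hypothetical; Clarabridge’s actual models are proprietary and far richer.

```python
import re

# Hypothetical word lists; a real system would use trained models.
ORDINALS = {"second": 2, "third": 3, "fourth": 4, "fifth": 5}
INTENSIFIERS = {"really", "very", "extremely", "so"}
EMOTION_WORDS = {"frustrated", "angry", "upset", "annoyed"}

def extract_markers(utterance):
    """Flag repeat-contact effort and intensified emotion words."""
    words = re.findall(r"[a-z']+", utterance.lower())
    markers = {"repeat_contact": None, "intensified_emotion": []}
    for i, w in enumerate(words):
        # "third time I called" -> effort signal with a count
        if w in ORDINALS and i + 1 < len(words) and words[i + 1] == "time":
            markers["repeat_contact"] = ORDINALS[w]
        # "really frustrated" -> intensifier amplifies the emotion word
        if w in EMOTION_WORDS:
            intensified = i > 0 and words[i - 1] in INTENSIFIERS
            markers["intensified_emotion"].append((w, intensified))
    return markers

print(extract_markers(
    "Wow, this is the third time I called about this and I'm really frustrated"))
# -> {'repeat_contact': 3, 'intensified_emotion': [('frustrated', True)]}
```

Even this crude sketch shows why such markers matter: the same emotion word carries different weight depending on the words around it.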
Sometimes the takeaway can be counterintuitive. For example, Clarabridge is researching contextual indicators that would help an agent discern situations where empathy is not the desired response.
“In crises, speed is much more important” than empathy, said Fabrice Martin, chief product officer at Clarabridge. “You don’t want an agent to say ‘I’m sorry,’ you want the agent to solve your question really quickly if you’re about to go into foreclosure.”
While Clarabridge will perform some analysis while a conversation is happening, the company thinks clients can derive the most value from reviewing the data in aggregate — like treating the whole disease rather than an isolated symptom.
For example, if callers are flooding the contact center with the same questions, a bank can proactively change a policy or waive certain fees to address the underlying issue, retrain agents, or get an overall sense of how customers perceive their brand.
The technology has also served Clarabridge’s bank clients who are increasingly sensitive to customers’ anxiety, frustration and fear during the pandemic.
“Our financial customers want to understand really well what topics might be sensitive, such as [Paycheck Protection Program] loans, and make sure staff is well trained and prepared to deal with those questions in an empathetic and helpful manner,” Martin said.
While there will always be outlier situations where the human touch is necessary, scale is where Martin argues that the machine wins every time. “Even humans don’t necessarily recognize emotions in the same way,” he said. “And there is no human team that would be able to analyze tens or hundreds of thousands of conversations every hour of every day.”
Cogito Corp., which has clients in wealth management and credit card services, operates on the idea that when people speak, the energy, pauses and intonation in speech are powerful signals of how a conversation is going — perhaps more so than the words themselves.
“It’s a very honest information source,” said Skyler Place, chief behavioral science officer at Cogito.
Cogito’s algorithm analyzes these aspects of speech to figure out if a customer is satisfied. At the same time, notifications will slide onto the agent’s screen to nudge them into altering their behavior in real time.
For example, the Slow to Respond notification can jolt a distracted agent into action, if they have let a few seconds elapse after the customer has finished speaking. If an agent is speaking too robotically, the Energy Cue, with an image of a coffee cup, will prompt the agent to perk up.
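The Slow to Respond cue reduces to a simple timer check: how long has the agent been silent since the customer stopped speaking? The sketch below is hypothetical — the threshold value and function name are assumptions, not Cogito’s actual parameters.

```python
import time

# Hypothetical threshold; Cogito's actual tuning is not public.
SLOW_RESPONSE_THRESHOLD = 3.0  # seconds of agent silence after customer stops

def check_response_delay(customer_stopped_at, now):
    """Return a cue name if the agent has been silent too long, else None."""
    if now - customer_stopped_at >= SLOW_RESPONSE_THRESHOLD:
        return "Slow to Respond"
    return None

# Usage: the customer finished speaking 4 seconds ago with no agent reply,
# so the cue fires; at 1 second of silence it stays quiet.
t0 = time.monotonic()
print(check_response_delay(t0 - 4.0, t0))  # -> Slow to Respond
print(check_response_delay(t0 - 1.0, t0))  # -> None
```

The design point is that the cue is relative to the customer’s last utterance, not absolute call time — a long call with brisk turn-taking never triggers it.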
Cogito also provides dashboards for supervisors that summarize trends. For example, if a supervisor sees that one agent is frequently notified that they are “Speaking Quickly,” the manager can coach the person to slow down — or perhaps decide that the customer service script is too long and needs to be condensed.
Use of the Empathy Cue notification, which measures the customer’s behavior rather than the agent’s, has increased dramatically during the pandemic. It indicates that the customer is speaking with heightened emotion, perhaps because the person is upset, angry or excited.
“The agent may not be able to solve the underlying problem, but we want the agents to acknowledge the emotional state of the caller,” said Place. “We’ve had situations where someone calls and is really upset because they just lost a loved one, and the agent goes into autopilot and asks about the weather.”
And one major advantage is that the technology looks at every caller through the same lens.
“It’s objective and consistent,” said Place. “The same algorithm applies to every call, so there is no bias based on any aspect of who the person is.”