Delve: Why Friendly AI Chatbots Don’t Always Deliver Five-Star Customer Service, with Elizabeth Han
Logically, everyone knows that software doesn’t have feelings, but AI chatbots that express emotion—along with other advanced artificial intelligence tools—have a sentient quality that places them somewhere between machine and human. Conventional customer service wisdom holds that when human employees express positive emotion, customers give higher evaluations of the service. But when emotionally expressive chatbots enter the equation, people’s reactions change depending on their expectations.
Research by Desautels Faculty of Management professor Elizabeth Han investigates the effects of AI-powered chatbots that express positive emotion in customer service interactions. In theory, making software appear more human and emotionally upbeat sounds like a great idea, but in practice, as Han’s research shows, most people aren’t quite ready to make a cognitive leap across the uncanny valley.
“We found out that if a chatbot expresses positive emotion, it disconfirms people’s expectation. And what kind of expectation? The expectation that machines cannot feel emotion. How can they express emotion when they cannot feel the emotion? There’s this cognitive dissonance coming from that violation of expectation—and that’s actually causing a negative impact on the customers’ evaluation of the service. It’s like those two competing mechanisms cancel each other out.” – Elizabeth Han
Bringing Insight to the Surface
Founded in 2019, Delve is the official thought leadership publication of McGill University’s Desautels Faculty of Management. Under the direction of Professor Saku Mantere, inaugural Editor-in-Chief, Delve features the latest in management thinking that stretches perspectives, sparks new ideas, and brings clarity to decision-makers at all levels and across sectors.
Feedback
For more information, or if you would like to report an error, please contact us at web.desautels [at] mcgill.ca (subject: Website News Comments).