05 Nov 2024
Today’s AI voice assistants, such as ChatGPT and Gemini, are transforming how consumers search for products, place orders, and track shipments. Built on large language models (LLMs), these systems engage users in natural conversation while learning from behavior to anticipate future needs. Powered by generative AI, they assist with everything from grocery shopping to trip planning, creating seamless consumer experiences.
Marketing scholars are now testing whether adding empathy to AI-led interactions enhances customer responses, a capability that could revolutionize consumer experiences and boost brand effectiveness.
“As voice assistants like Amazon Alexa become integrated into our daily life, their unemotional and mechanical nature often makes them feel like poor substitutes for genuine human interaction,” said Dr. Alex Mari, Senior Research Associate and Lecturer at the University of Zurich. Drawing on his extensive experience in digital marketing and AI, Dr. Mari gave a talk on “Pioneering the AI Empathy Frontier: How to Design Performative Voice Assistants for Marketing” at a Sasin Research Seminar.
Dr. Mari conducted a study on consumer responses to empathy exhibited by AI agents, distinguishing three dimensions of empathy. Cognitive empathy involves recognizing another’s emotional state (i.e., perspective-taking) and is foundational to the other two. Affective empathy refers to actually sharing another’s emotional state, allowing individuals to mirror the emotional experiences of others. Compassionate empathy extends this resonance into active concern, motivating support or assistance.
Dr. Mari explained that a salesperson’s individualized attention can prompt positive responses because the customer perceives the counterpart as warm and personable, which builds trust and confidence in the service provider. AI empathy, analogously, refers to the capacity of AI agents to recognize and adjust to the cognitive needs and emotional states of a human interlocutor. However, Dr. Mari noted a limit to this definition: AI agents have no subjective experience and no capacity for experience-sharing.
While pre-LLM AI agents were incapable of sensing, understanding, and mimicking human emotions or of adapting their behavior to their counterparts, the recent launch of the Empathic Voice Interface (EVI) by Hume AI substantially advances the field of AI empathy. EVI is designed to interpret and respond to user emotions in real time by analyzing speech prosody (tone, pitch, and rhythm) along with vocal bursts and emotional language. By integrating foundation LLMs such as ChatGPT or Claude with a proprietary empathic AI module, EVI combines advanced language modeling with affect recognition to enable adaptive, affectively attuned exchanges. Such advances underscore the growing potential of generative AI to emulate human-like empathy in voice assistants.
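EVI’s internals are proprietary, but the general pattern it represents (extract prosody features from speech, map them to a coarse affect label, then condition the language model’s response style on that label) can be sketched in a few lines. The following Python sketch is purely hypothetical: the feature choices, thresholds, and function names are invented for illustration and do not reflect Hume AI’s actual implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class ProsodyFeatures:
    energy: float           # mean frame RMS energy (loudness proxy)
    pitch_variation: float  # variance of zero-crossing rate (crude pitch-movement proxy)


def extract_prosody(samples: list[float], frame_size: int = 160) -> ProsodyFeatures:
    """Compute crude prosody features over fixed-size frames of a mono waveform."""
    energies, zcrs = [], []
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[i:i + frame_size]
        # Root-mean-square energy: how loud this frame is.
        energies.append(math.sqrt(sum(s * s for s in frame) / frame_size))
        # Zero-crossing rate: sign changes per sample, a rough pitch proxy.
        zcrs.append(sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / frame_size)
    mean_zcr = sum(zcrs) / len(zcrs)
    return ProsodyFeatures(
        energy=sum(energies) / len(energies),
        pitch_variation=sum((z - mean_zcr) ** 2 for z in zcrs) / len(zcrs),
    )


def infer_affect(f: ProsodyFeatures) -> str:
    """Map features to a coarse affect label (thresholds are illustrative only)."""
    if f.energy > 0.5 and f.pitch_variation > 0.001:
        return "agitated"   # loud with moving pitch
    if f.energy < 0.1:
        return "subdued"    # quiet speech
    return "neutral"


def empathic_prompt(user_text: str, affect: str) -> str:
    """Prepend a style cue so the LLM's reply is attuned to the detected affect."""
    cue = {
        "agitated": "The user sounds stressed; acknowledge their frustration before answering.",
        "subdued": "The user sounds tired; keep the reply brief and warm.",
        "neutral": "Respond in a friendly, natural tone.",
    }[affect]
    return f"[style: {cue}]\nUser: {user_text}"
```

A production system would use learned prosody models rather than hand-set thresholds, but the pipeline shape (features, affect label, style-conditioned prompt) is the same idea the article describes.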
To illustrate AI empathy in action, Dr. Mari shared a video clip showing an AI assistant responding to a user stuck in traffic. When the driver jokingly said they loved being stuck in Bangkok, the AI replied, “The heat can be tough to handle.” This exchange highlighted how even simple, human-like responses can create a sense of social connection with AI, enhancing user engagement.
“It’s a sort of persona effect, you think it’s a social entity, someone else, and this creates a bond, so people see more favorably voice interaction than text-only interaction,” he said. However, he warned that adding empathic features to an AI agent is not always desirable and may backfire: authenticity concerns can arise, and people may doubt whether an AI can truly be empathetic.
According to Dr. Mari’s research, “Empathic Voice Assistants: Enhancing Consumer Responses in Voice Commerce,” AI empathy is crucial for managers wishing to shape consumers’ responses. The research uses the Service Robot Acceptance Model (sRAM) as a theoretical basis to model users’ adoption and acceptance: subjects rated voice assistants on functional, relational, and social-emotional dimensions. Dr. Mari’s team developed a custom Alexa app, “Voice Shopping,” mimicking the native voice shopping process, including utterances, interaction flow, and voice characteristics. Two versions were tested, standard and empathic, with 412 families, each tasked with purchasing simple items such as batteries and paracetamol tablets.
The results show that families interacting with the empathic version of Alexa responded more positively across all AI agent attributes. However, when evaluating the functional characteristics of the voice assistant, such as usefulness and ease of use, individual shoppers preferred efficiency over empathy, indicating that empathic features may be less effective when consumers shop alone.
Dr. Mari’s research shows that while empathetic AI can enhance the customer experience, it is crucial for marketers to understand their audience’s context and needs. He warned that applying AI empathy too broadly can backfire, as some users may find these interactions inauthentic. Marketers should therefore tailor AI interactions to specific consumer motivations and environments. By deploying empathic voice interfaces such as EVI, companies can simulate the essential dimensions of empathy and be increasingly perceived as authentic and genuine, especially once users have formed emotional bonds with the assistant.