  • April 17, 2026
  • By Ian Jacobs, vice president and lead analyst, Opus Research

Cold Circuits, Warm Heart: AI and Empathy in CX


“We are programmed just to do/ Anything you want us to” —Kraftwerk (“The Robots”)

There’s a fascinating debate going on in the worlds of artificial intelligence and customer experience concerning whether AI can be empathetic, and if so, how. Maybe it’s not a culture-stirring debate on the order of “Should certain types of speech or expression be censored or restricted?” or “Is affirmative action necessary to address systemic inequalities in society?” But it’s definitely compelling to those of us in the CX space.

My own view is straightforward. AI is not itself empathetic, but its output can be. Or, more precisely, it can be perceived to be empathetic, and in customer experience that perception is the whole ballgame. Customers aren’t grading a machine on its inner emotional life. No one is asking the bot to keep a journal, stare out the window, and wonder whether it has become emotionally unavailable. They’re judging the interaction in front of them. Did it recognize their situation? Did it respond in a way that felt appropriate? Did it reduce stress, confusion, or effort? If the answer is yes, many customers will walk away feeling they were met with empathy.

That is not a trivial distinction. Human empathy involves lived experience, emotional understanding, and genuine feeling. AI obviously doesn’t possess those. But in service interactions, what often matters most is not whether the responder feels something. It is whether the response demonstrates understanding in a way the customer can recognize. When a conversational AI says, “I can see why that would be frustrating, and here are the next two steps,” it might feel more empathetic than a human agent who is technically sentient but distracted, impatient, or simply having a bad day.

This is one reason AI can sometimes outperform average support experiences. It can be trained to detect signals of distress, choose language that reflects context, remember prior details, and respond with consistency. It does not get tired at the end of a shift. It does not snap. It does not sound bored. Maybe most importantly, unlike human agents, it is not really time-pressured as we understand that concept. Writing something that seems considerate and reassuring can be time-consuming for a human agent; for AI, not so much.

Of course, the opposite is also true. Scripted sympathy with no real understanding behind it is not empathy. It is theater. Customers can tell when an AI is serving up a soft phrase while failing to solve the problem, repeating itself, or trapping them in a loop. AI that says “I completely understand your frustration” before sending you back to the same dead link is not empathetic; it’s a hostage negotiator with no authority to release the hostage. Empathy is not just about saying the right words. It is about recognizing context and helping in a way that fits the moment.

I’d go even further and say that the use of AI by organizations can be empathetic. This is where the conversation gets more interesting. Sometimes empathy is not about mimicking human warmth. Sometimes it is about designing an interaction that better respects the customer’s emotional reality.

Think about situations where embarrassment, stigma, or anxiety create social friction. A teenager asking questions about sexual health or pregnancy may prefer a private, well-designed AI interaction over speaking to another person. Someone facing debt collection may be more willing to engage with an AI system that is calm, nonjudgmental, and available on their terms. In cases like these, the AI is not empathetic, but the company deploying it might be. The empathy lives in the design choice.

In that sense, AI is a tool, and like pretty much all tools, it can be used in empathetic ways. Or not. A company can use AI to rush people, deflect problems, and cut costs with no regard for emotional impact. Or it can use AI to make difficult moments easier, more private, and less stressful. That’s the real test. So, to me, the important question isn’t whether AI feels empathy. It is whether organizations use AI to create experiences that feel more humane.


