February 25, 2026

Customers aren't anti-AI. They're anti-wasted effort.

3 min read

Fifty-seven percent of customers expect a clear path to a human agent within five AI exchanges. Not eventually. Within five. And 54% will walk away entirely if they've spent 10 minutes getting nowhere.

That's not a rejection of AI, but a rejection of wasted effort.

Customers now start with AI willingly. Fifty-nine percent prefer it as their first stop for support. The problem is what happens when the technology stops working and there's no clean way out.

Five exchanges isn't a countdown

The five-exchange threshold reads like a patience score. It's actually a design signal. When customers hit that point and still haven't found a path forward, the system has stopped feeling helpful. It feels obstructive.

Most CX teams hear that stat and reach for escalation rules: at what point do we route to a human? That's the wrong question. The threshold tells you that customers want to know the exit is there before they need it. Visible access to a human changes how every exchange before that point feels.

A customer who can see the door stops counting exchanges. A customer who can't starts. That's the difference between an AI experience that feels like support and one that feels like a trap. It's a design decision, not a technology limitation.

Get the full 2026 Customer Expectations Report

New Wakefield Research data on how customers experience AI support and where loyalty breaks.

Designing for the threshold, not past it

Five exchanges is a signal that a handoff should already be happening, not a benchmark to optimize toward. The brands getting this right are building AI that treats the transition as part of the experience, not a fallback.

In practice, that starts with making the exit visible before the customer needs it. The option to reach a human shouldn't appear only after the AI fails. It should be present from the first exchange, not as a surrender button but as a signal the system is on their side. When customers know help is available, patience tends to extend. The threshold shifts.

The handoff itself matters just as much. When a customer reaches a human, the agent should already know what the AI covered, what was tried, and how long the customer has been in the flow. A transfer that drops context erases everything the customer already said. That's where effort accumulates.

Measurement is the third lever. Re-contact rate within 48 hours, time-to-resolution, and post-interaction NPS tend to be stronger proxies for whether the experience actually worked than whether the case closed. If customers are resolving but not returning, the effort signal is there. The data just isn't being read that way.

AI handles routine work well: order status, simple returns, known answers. The goal is matching the right tool to the right moment. When the AI-to-human handoff is smooth, 57% of customers report consistent satisfaction, and 33% increase their purchases.

Customers want AI that knows when it's the right tool — and what to do when it isn't.

See how Gladly handles the handoff

Built around the customer, not the ticket. See Gladly in action.

Angie Tran

Staff Content & Communications Lead

Angie Tran is the Staff Content & Communications Lead at Gladly, where she oversees brand storytelling, media relations, and analyst engagement. She helps shape how Gladly shows up across content, PR, and thought leadership.
