Managing the Right Customer Experience Measure

Jeff Sauro, PhD

If you can’t measure the customer experience, you can’t manage it.

Improving the customer experience starts with measuring.

But you’ve got to be sure you’re getting the right measure (or, more often, measures) to manage.

The right measure will:

  • identify problem areas
  • track improvements over time
  • be meaningful to the customer

The wrong measure can:

  • identify wrong areas of focus
  • miss problems altogether
  • lead to unintended consequences
  • alienate customers

Here are some different ways of thinking about measuring experiences.

Customer request closed vs. customer request resolved

The front lines of the customer experience are often the customer support channels–phone and email. For example, I love Mint.com and have probably recommended it to at least 5 people in the last year. For some reason my PayPal account hasn’t updated in my Mint account for six months. After trying several tweaks over a few months I finally got around to submitting a problem ticket. I was told I’d receive a response within 24 hours.

And I did!

Eight hours later I was told that this sort of thing happens from time to time and is most likely a temporary issue; trying again in a day or so should work. Ticket closed. Yikes! It’s been a six-month temporary issue. I suspect that request was considered successful because it was both closed and closed quickly.

Conversion rate vs. number of conversions

Conversion rates are the central metric for testing better designs, ads, and campaign effectiveness. The ratio of users who purchase, register, or click (convert) to all users who viewed the page is a useful metric because it lets you compare low-traffic and high-traffic pages.

While this is a convenient metric, the total number of conversions likely has a bigger impact on our bottom line than the rate. Wouldn’t you rather have 100 conversions from 100,000 pageviews (a 0.1% rate) than 10 conversions from 100 pageviews (a 10% rate)?

While chatting with Jared Spool at IAS 11, we called this a denominator problem. He says he tells his clients he can increase conversion rates very easily…by reducing traffic to the site.
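To make the denominator problem concrete, here’s a minimal sketch in Python using the hypothetical numbers from the example above (the figures are illustrative, not real analytics data):

```python
# A minimal sketch of the denominator problem, using the hypothetical
# traffic numbers from the example above (not real analytics data).

def conversion_rate(conversions: int, pageviews: int) -> float:
    """Return conversions as a percentage of pageviews."""
    return 100.0 * conversions / pageviews

pages = {
    "high-traffic page": {"conversions": 100, "pageviews": 100_000},  # 0.1% rate
    "low-traffic page":  {"conversions": 10,  "pageviews": 100},      # 10% rate
}

for name, p in pages.items():
    rate = conversion_rate(p["conversions"], p["pageviews"])
    print(f"{name}: {p['conversions']} conversions at a {rate:.1f}% conversion rate")

# The low-traffic page "wins" on rate, but the high-traffic page delivers
# ten times the conversions -- and shrinking the denominator (traffic)
# raises the rate without adding a single sale.
```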

Number of clicks vs. time to destination

When you’re trying to make a more efficient experience, reducing the number of clicks to accomplish a goal seems like a good way to measure it. Putting all functionality and content on one page would certainly reduce the number of clicks, but that’s probably not what the user had in mind when he said it should take less time to complete tasks.

Call time or call satisfactorily resolved

Ever wonder why the customer service agents you call to complain speak so quickly? They’re often scored on call time. If you want to reduce call time in a customer support center, you can instruct agents to get off the phone faster, but have you really improved service or quality if customers have to call back? Often a simple follow-up question sent via email can solve this problem. See also Customer request closed vs. customer request resolved above.

Net Promoter Score as a bonus motivator

I use the NPS all the time as a gauge of customer loyalty and a lot of my clients do as well.  Many companies now pay bonuses based on achieving and exceeding Net Promoter benchmarks or other customer satisfaction goals.
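As a quick refresher, NPS comes from the standard 0–10 likelihood-to-recommend question: the percentage of promoters (9–10) minus the percentage of detractors (0–6). Here’s a minimal sketch with hypothetical survey responses:

```python
# A minimal sketch of the standard NPS calculation: percent promoters (9-10)
# minus percent detractors (0-6). The responses below are hypothetical.

def net_promoter_score(responses):
    """Return NPS (-100 to 100) from 0-10 likelihood-to-recommend ratings."""
    promoters  = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

sample = [10, 9, 9, 8, 7, 7, 6, 5, 10, 3]  # hypothetical ratings
print(f"NPS: {net_promoter_score(sample):.0f}")  # 4 promoters, 3 detractors -> NPS of 10
```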

When I took my car in for repair a few months ago, the client service representative helping me told me to expect a follow up survey and that the “right answers” to the questions were “strongly agree.”

The problem isn’t always this obvious. Even well-executed surveys draw responses only from customers who are willing to both provide contact information AND spend the time filling them out. From time to time it helps to look at external benchmarks to see if you’re only sampling the happy promoters.

Likelihood to Recommend or Likelihood to Repurchase

With the popularity of the Net Promoter Score, it may seem like word of mouth is the only measure you should care about. But if everyone already knows about and owns your product or visits your website, likelihood to purchase again might be a better measure of growth. I recommend using both.

On time arrival versus on time departure

Have you ever pulled away from the jetway only to sit on the runway waiting out mechanical issues or other delays, then arrived at your destination late? It’s likely that flight segment counted as an on-time departure. You can’t argue with the measure–the plane did pull away on time and that does mean something–it just doesn’t mean that much to the customer.

Number of Golf Balls Flushed

If you’ve ever bought a toilet from a hardware store, then you might have seen those signs showing you how many golf balls the toilet can flush. Not sure about you, but I don’t flush too many golf balls or any of the other things they show in this video. (My children might—so at least I’m covered there).


Temperature vs. Comfort

Temperature is sort of the go-to example when we think of measurement. But temperature is an indication of when water will boil or freeze. Comfort is how we feel given the temperature, humidity, altitude and wind. I’ll take 45 degrees in Denver over 45 degrees in New York and Chicago any day (sun and dryness make all the difference).

I’ll take 90 degrees in Phoenix, Arizona over 80 degrees in Washington, DC–the difference is 10% humidity versus 90%. As Richard Saul Wurman said in Information Anxiety, “Comfort is human; temperature is for thermometers.”

Credentials vs. Learning

I teach statistics to both professionals and students. It’s hard to tell the difference between students and professionals early on. After a few days they’re easy to differentiate: students become more focused on getting the right answers and professionals become more interested in understanding and learning how to apply the methods.

The difference is in the metric: grades on a test versus better decisions and a skill you can apply. This is a larger issue with academia in general; it’s often less about learning than about getting very expensive credentials—but that’s the measure both students and schools are managing.

Measuring is good. Knowing what to measure is better. Finding the right measure means taking multiple measures and seeing which one best tracks other customer sentiments and revenue.
