Photo: Customers in line at the service desk at Target. (Credit: Random Retail)

Customer experience professionals have a variety of metrics they can deploy to measure the experience of customers and prospects. Two of the most familiar are the Customer Satisfaction Score (CSAT) and Net Promoter Score (NPS). 

However, it may be time to invest in the Customer Effort Score (CES), as well, according to customer experience pundits. A TSIA Benchmark survey found companies with the highest CES scores for support interactions also have the highest NPS scores. The same research found high CSAT scores and high NPS scores do not necessarily correlate. 

“Customer experience data should tell you a story about customers’ experiences with your organization,” said Stephanie Thum, CCXP, chief advisor of federal customer experience for Qualtrics. “Think about CES as the opening lines of that story. You’ll need other data including anecdotes, focus group findings, or maybe even contact center recordings to help you and your colleagues understand the entire story, and then figure out what’s next.” 

What Is CES?

In short, CES asks customers about an experience with a product or a service: “It was easy to accomplish this task. Please rate that statement on this scale.” According to a blog post by HubSpot’s Alex Birkett, customers rank their experience on a seven-point scale ranging from “very difficult” to “very easy,” which determines how much effort was required to use the product or service. 
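To make the scale concrete, here is a minimal sketch of how a team might score CES responses. Scoring conventions vary by vendor; the two approaches below, averaging the 1–7 responses and reporting the share of "easy" (5–7) answers, are common illustrations rather than an official formula.

```python
# Hypothetical CES scoring: conventions vary across vendors, so these two
# approaches (mean score and percent "easy") are illustrative assumptions.

def ces_average(responses):
    """Mean score on the 1 (very difficult) to 7 (very easy) scale."""
    return sum(responses) / len(responses)

def ces_percent_easy(responses):
    """Share of respondents who agreed the task was easy (rated 5, 6, or 7)."""
    easy = sum(1 for r in responses if r >= 5)
    return 100 * easy / len(responses)

responses = [7, 6, 2, 5, 7, 3, 6, 4]
print(round(ces_average(responses), 2))  # 5.0
print(ces_percent_easy(responses))       # 62.5
```

Either number can be trended over time; what matters is picking one convention and applying it consistently across surveys.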

The big picture here? Trying to keep customers loyal. “When it comes to service, it's recognizing that customer effort leads to disloyalty,” said Pete Slease, vice president at Gartner, whose company acquired CEB, which claims to have created the CES. “This was all born out of a question from senior executives: what impact does service have on customer loyalty? CEOs know their clients really don't differentiate based on the sales experience or the marketing experience. Service and support are in charge of differentiating the experience.”

Related Article: Customer Experience Complexity Calls For a New Breed of Measurement

More Valuable than NPS?

NPS does have its share of fans. But, according to John Ragsdale, distinguished vice president of service technology research for TSIA, the CES can be even more valuable than the NPS, which relies on asking customers how likely they are to recommend a product or service. “Likelihood to recommend is a very emotional thing, and can be completely meaningless,” Ragsdale said. “Sure, I’ll rate you a ‘10’ that I will recommend you, but in my head I’m thinking, only for people I don’t like, or companies with very simplistic requirements.”

CES Delivers ‘Friction Points’

CES identifies “friction points,” or the elements that negatively impact the customer experience, according to Ragsdale. It’s better than the NPS in this way, Ragsdale added. “If you use NPS, and they say they are somewhat likely to recommend, the follow-up question is, ‘Why?’ which could be about anything, and leads to vague and uncomfortable discussions,” Ragsdale said. 

Ragsdale suggests asking specific follow-up questions after your initial effort question to gain valuable insights, e.g., what in particular generated effort on your part? “This allows homing in on specific issues, i.e., friction points impacting the experience,” he said.

In a TSIA presentation on CES, Ragsdale cited these common friction points:

  • Finding self-service options
  • Single sign-on
  • Unified search
  • Screen design/layout
  • Tools/routes to find content
  • Content filtered for my account
  • Content formats (text vs. video)

Getting Started with the CES

So where’s your starting point with CES? Ragsdale said getting started with CES is as easy as just asking questions, especially in post-support interaction and periodic relationship surveys. “Of course you don’t want to ask too many questions, which impacts response rates, so if they are nervous about moving entirely to CES at first, consider only asking the CES question of a certain percent of customers, and continue using the existing questions (CSAT, NPS) for the rest,” Ragsdale said. “This can be helpful in looking at how CES is trending compared to other measurements. The big change comes in how you react to the data.”
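Ragsdale's suggestion to ask the CES question of only a certain percent of customers can be sketched as a simple deterministic split. The 20% fraction, the seeding scheme, and the question labels below are all assumptions for illustration, not part of any published methodology.

```python
import random

# A minimal sketch of a partial CES rollout: route a fixed fraction of
# customers to the CES question while the rest keep the existing CSAT/NPS
# questions. The 20% split and per-customer seeding are assumptions.

def assign_survey_question(customer_id, ces_fraction=0.2, seed=42):
    """Deterministically route a customer to the CES or legacy question set."""
    rng = random.Random(seed * 1_000_003 + customer_id)
    return "CES" if rng.random() < ces_fraction else "CSAT/NPS"

assignments = [assign_survey_question(c) for c in range(1000)]
ces_share = assignments.count("CES") / len(assignments)
print(f"{ces_share:.0%} of customers get the CES question")
```

Seeding per customer keeps the assignment stable across survey waves, which makes it easier to compare how CES trends against the existing CSAT and NPS questions over time.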

CES in Steps

This process may be a lot of material to digest, so Ragsdale recommends breaking it down into five steps to analyze customer effort:

  • Step 1: Collect internal insights.
  • Step 2: Develop a hypothesis.
  • Step 3: Research the current processes used to support customers.
  • Step 4: Collect customer feedback on customer effort.
  • Step 5: Map the customer journey, identify friction points, develop action plans to eliminate high customer effort.

Related Article: What Is Customer Satisfaction Score (CSAT)?

Focus on Specific Respondent Types

A good next step toward gleaning meaningful insights and action steps is to follow up with those who answered “disagree” or “somewhat disagree,” Slease said. Why go after those "2s and 3s" on the seven-point scale? The “strongly disagree” respondents usually are upset with the company over an issue larger than a support person’s control, and are not likely to be helpful or even responsive, Slease said.

However, Slease said he finds the “disagree” and “somewhat disagree” respondents to be thoughtful in their evaluation of the experience. “They're saying, ‘It wasn't good, I can tell you that much,’” Slease said. “‘It was OK, not the worst thing ever. But it was bad. It was not good.’ Those are the folks to go after.” And you can slice that data even further by product type, tenure with the company, issue type, channel type, etc.
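The filtering and slicing Slease describes can be sketched in a few lines. The response records, field names, and sample scores below are hypothetical; the point is simply isolating the 2s and 3s and then cutting that group by an attribute such as channel.

```python
# Hypothetical survey records: the fields and values here are invented
# examples, not a real schema.
responses = [
    {"customer": "A", "score": 2, "product": "billing", "channel": "phone"},
    {"customer": "B", "score": 7, "product": "billing", "channel": "chat"},
    {"customer": "C", "score": 3, "product": "mobile app", "channel": "chat"},
    {"customer": "D", "score": 1, "product": "billing", "channel": "phone"},
]

# Target the 2s and 3s; the strongly-disagree 1s are excluded as unlikely
# to be helpful or responsive.
follow_up = [r for r in responses if r["score"] in (2, 3)]

# Slice the follow-up group further, e.g. by channel.
by_channel = {}
for r in follow_up:
    by_channel.setdefault(r["channel"], []).append(r["customer"])

print([r["customer"] for r in follow_up])  # ['A', 'C']
print(by_channel)                          # {'phone': ['A'], 'chat': ['C']}
```

The same `setdefault` grouping works for any of the other cuts mentioned, such as product type, tenure, or issue type.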

Related Article: 6 Strategies for Conducting Customer Experience Analysis

Focus on Good Things Too

You're probably asking plenty about poor experiences. But for customers who rated you highly, ask what elements of the experience were lower effort than the experiences they have had with other companies, Ragsdale said. 

“Most companies focus only on low scores, but follow up on high scores, too,” Ragsdale said. “It is great to get details on what you are doing right so you can replicate it and instantiate it. This also allows great reporting back to customers, such as reports for advisory boards or at user conferences."

Approach Respondents From a Learning Perspective

Customers who take the time to answer your CES questions and share a negative response don’t want to feel like you’re pressuring them for more details, according to Slease. “[So] take on a learning posture when you do [ask for more information],” Slease said. “‘Hey, you evaluated us this way. We appreciate the feedback, and we'd like to learn more about what your experience was like, so that we can make improvements for next time. So with that in mind, would you mind sharing with us a little bit more about why you evaluated it the way you did?’” Using the word “learn” puts many respondents at ease and makes them feel like they're in a position of teaching or one of authority, Slease added.

Related Article: How to Measure Customer Experience Beyond Net Promoter Score

Build Usage Mindfully

Customer experience pros just starting out with CES should build its usage mindfully, in conjunction with other metrics that tell the story of customers’ experiences with the organization, Thum said. Communicate to senior leaders, managers and employees what those metrics mean and why they matter. Always connect them back to business goals. Making it easy to work with you should matter to your business, said Thum. “CES,” she said, “can give you a pulse on how you’re doing. But it should be a complementary part of a more holistic story that includes a mix of operational and experience data.”

At one of her prior organizations, customer experience had a strategic goal to improve the ease of doing business for customers, Thum said. “We linked to that goal," she said. "Human stories will surface while you’re collecting that data. Share those stories. Then communicate the results over time. Communication surrounding metrics, including the human stories that go with the metrics, is just as important as choosing the right metrics.”