How to ask the best customer survey questions
How to peel back the layers on your customer survey questions and ask the best question for your organization
Customer surveys are important. We need to ask questions to understand how our customers feel about our products, our services, our business. For example, "What do you think about this offering?"
But sending that survey without regularly questioning the questions themselves is detrimental to understanding your customers’ changing needs. Sometimes you just need to change a question. And sometimes that change only needs to be one word.
How do you know what to do, if anything? You might need to do one of my favorite things - a survey about a survey. Drill into a question and ask the following:
- What do customers think the question means?
- What do staff think it means? Is that the same as what your customers think?
- How open to interpretation is it? Are all customers reading it the exact same way?
- Is the question leading them to an answer? Or is it neutral?
- Is this something the company cares about? Or is it a holdover that is no longer relevant?
- What do you/your organization WANT to measure?
Ask these questions to those who use this data the most. Ask these questions of the leaders of your organization. And most importantly, ask your customers. If you don’t already do listening sessions with some loyal customers, here’s your first opportunity to start!
Once you’ve done your research, you should have an idea of what you really should be asking. Then you can tap into a tried-and-true marketing tactic.
A/B testing customer survey questions
Marketers love to use A/B testing for website and ad clicks, so why can’t customer insights professionals? A/B testing, also known as split testing, is a method where two versions of a webpage or design element are compared to see which one performs better based on specific metrics. Apply this logic to your customer surveys and you can harness the power of A/B testing to find out: What’s the right question to ask?
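If your surveys run through code, the split itself is simple. Here’s a minimal sketch in Python (the variant wording and the `assign_variant` helper are illustrative, not from any particular survey platform) that deterministically assigns each respondent to one of two question wordings:

```python
import hashlib

# Hypothetical question variants under test (wording is illustrative).
VARIANTS = {
    "A": "Did you receive a greeting and farewell from staff?",
    "B": "Did you have a friendly interaction with staff?",
}

def assign_variant(respondent_id: str) -> str:
    """Deterministically assign a respondent to variant A or B.

    Hashing the ID (rather than using random.choice) means a returning
    respondent always sees the same wording, keeping the split stable.
    """
    digest = hashlib.sha256(respondent_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Each respondent ID maps to exactly one variant, every time.
question = VARIANTS[assign_variant("customer-0042")]
```

Hashing the respondent ID instead of picking at random means a returning customer always sees the same wording, which keeps the two groups clean for later comparison.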
Here’s a real-life example of a simple wording change on a grocery shopper survey that not only led to a 3 percent increase in satisfaction, but also produced a host of unexpected outcomes: a greater understanding of what good customer service means, a shared language between customers and staff, and a ripple effect that encouraged more question interrogation in the future.
The case of the “friendliness” question
For almost 15 years, one question in the checkout section of the shopper survey measured "friendliness" by asking:
“Did you receive a greeting and farewell from staff?”
In that time, a lot has changed at checkouts, including the age demographic of primary shoppers and what both staff and customers interpret as "friendly."
Using the reflection questions above, the "friendliness" question was put under the microscope:
- Customers, reasonably, read it as asking only whether a specific interaction (a greeting and farewell) occurred, regardless of how friendly it was.
- Staff believed the question was asking about how friendly they were with customers. This was different from what customers thought.
- It was not open to interpretation and most customers read it literally. While clarity is usually a good thing, in this case, it limited the types of friendly interactions being acknowledged.
- Positively, the question was neutral.
- After careful probing and discussion with leadership and store teams, it became clear that the question no longer measured what the organization actually cared about. Staff were no longer trained to use the traditional retail “hello and goodbye” script and so the question was measuring the wrong thing.
- What the company actually wanted to know was how friendly its staff were: did customers have a positive experience with a staff member?
With those insights and a listening session or two, a new question was developed: “Did you have a friendly interaction with staff?” This opened the question up to all kinds of engagement and, more specifically, called out the word “friendly” as the focal point.
Enter A/B testing for customer survey questions
The original question was sent to half the survey respondents while the new question was sent to the other half over a period of eight weeks. Results were astoundingly different.
Across millions of responses:
- Customer satisfaction (CSAT) for “friendliness” was almost 3 percent higher for the new question.
- Dissatisfaction slightly decreased.
- With no other changes to the store experience or the survey, it appeared that customer service had magically and dramatically improved.
But all that had changed was the courage to ask a better question.
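For anyone who wants to sanity-check whether a gap like that 3 percent is more than noise, a standard two-proportion z-test is the usual tool. This is a hedged sketch with invented numbers, not the retailer’s actual data:

```python
import math

def two_proportion_z_test(sat_a: int, n_a: int,
                          sat_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference in satisfaction rates.

    Returns (z, p_value). A small p-value suggests the gap between
    the two question variants is unlikely to be chance.
    """
    p_a, p_b = sat_a / n_a, sat_b / n_b
    pooled = (sat_a + sat_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Illustrative numbers only (not the article's data):
# 85% vs 88% satisfied across 50,000 responses per variant.
z, p = two_proportion_z_test(42_500, 50_000, 44_000, 50_000)
```

At sample sizes like these, even a 3-point gap yields a z-score far beyond conventional significance thresholds, which is part of why results built on millions of responses are hard to argue with.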
Still skeptical? Ask it twice
“What if somehow, that group of customers just happened to have a better experience?”
If your stakeholders don’t believe what’s right in front of them, what’s your next move? Ask nearly the same question on a survey, twice. And that’s exactly what happened, for more than 12 weeks.
The results were the same. Even on the same survey, asking the same customer two very similar questions about the same shopping experience, the data showed that "friendly interaction" scored higher than "greeting and farewell."
Customers were interpreting them differently, because they are different questions.
And now, on the shopper survey, only one of those questions remains - the new one, which asks customers about friendly human connections, not the robotic tick-box scripts.
What next?
As with any survey change, it’s important to account for the "data blip," or the moment it changed. Accurate year-on-year comparisons are crucial to measurement, and in a large business, communicating the change is key to ensuring correct interpretation.
Without proper communication and documentation, it might just look like, “Staff are friendlier than they were last year, we must be doing something right!” When you change the question, it’s important to bring everyone along for the journey, and the changed results.
Following the success of the new question, an announcement was made to frontline staff so they understood exactly what was being measured - transparency is more important than ever! In addition, numerous follow-ups were held with key stakeholders to explain the change, the why behind it and, of course, the incredible, inarguable results.
Pro tip for a resistant stakeholder
Communicate the change as far and wide as possible, with contagious enthusiasm. If you find you're struggling with a particular group, my favorite hack for changing the hearts and minds of even the most resistant stakeholders is to let them guess the current question.
Give them a multiple-choice option with the current question, the new question and some plausible distractors, and let them pick which one they think is the current wording. The results will surprise you (and them).
Go forth and start asking better questions!