In this series for CX professionals, I’ve outlined practical advice for starting up a best-in-class CX program. The focus thus far has been on creating a big-picture program calendar, distinguishing between a program and a project, treating sample selection as a precursor to holistic survey design, and doing all of this in the context of a detailed analysis plan. We have covered a lot of ground.
You may have noticed that we have yet to send a survey to your customers.
Great. Being the savvy pro that you are, you understand the virtues of preparation. The vast majority of organizations rush to push an inquiry to unexamined lists of people, eschewing quality in favor of immediate results. Having worked through the homework presented in this series, you feel confident your CX program will yield material benefit due to your careful planning.
Thinking holistically about CX requires this pacing. The more you apply these principles to new CX projects, the faster you will move through the process, and you will be rewarded with a verifiably best-in-class CX program.
The prep is done, and you are ready for the moment of truth: fielding your survey! While it is tempting to go straight to the “send” button, run one more round of checkpoints first to ensure the study is as free from error as possible. Your customers will benefit, your metrics will be clear and actionable, and you will sleep easier the night before launch day.
Before You Send Your Customer Survey: Pretest, Pilot, Plan and Pivot
Conduct one last read-through, preferably in a group setting. Reading your survey questions out loud allows you to catch mistakes and unclear concepts. It will also give you a rough idea of the survey’s timing and put you in the respondent’s shoes from the get-go. Pay attention to connotations and ask yourself: “Does this word mean what we think it means? Is it easy to understand? Is this the language my proposed sample uses?”
Send a test to a small internal sample. This helps ensure everything is working properly and that the survey appears in the format you expect. It’s also a great way to socialize the launch internally. If there are any issues, you will have exposed only a few respondents to the mistake, and you can remedy the problem before the broader launch.
In the best-case scenario, you have done the homework outlined earlier in this series before deployment. Now it is time to observe the early results, keeping your eyes and ears open for the unexpected, as an anthropologist in the field might. This means looking at completion percentages and creating a topline report. Check it against your analysis plan and look closely for any early indicators that may disprove your hypotheses. Preparing a real-time dashboard can greatly aid this important step.
An authentic conversation goes both ways. After you’ve sent out a link, schedule a time to send a reminder (only to those who have not yet answered!) that includes the current completion percentage. People are more likely to respond when they see how engaged the whole group is. Remind them that the survey is short (make sure that it is!), and that you will show them how everyone voted once the survey concludes. Think about other incentives that may appeal to your target audience: a plan of action for remedy, a thank-you for long tenure, recognition for loyalty or positive feedback, or an appealing promotion to drive customer return.
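To make the reminder step concrete, here is a minimal sketch of picking out non-responders and computing the completion percentage to quote in the reminder. The function name, email addresses, and data shapes are illustrative assumptions for this sketch, not a Qualtrics feature or API:

```python
# Hypothetical sketch: given everyone invited and everyone who has responded,
# return the completion percentage and the list of people to remind.
# Field names and inputs are illustrative only.
def reminder_batch(invited_emails, responded_emails):
    """Return (completion_pct, emails_to_remind)."""
    responded = set(responded_emails)
    # Remind only those who have not yet answered.
    remaining = [e for e in invited_emails if e not in responded]
    completion_pct = round(100 * len(responded) / len(invited_emails), 1)
    return completion_pct, remaining

pct, to_remind = reminder_batch(
    ["a@x.com", "b@x.com", "c@x.com", "d@x.com"],
    ["b@x.com", "d@x.com"],
)
# pct is 50.0; to_remind contains only the two non-responders
```

In practice your survey platform will track this for you; the point is simply that reminders should be filtered to non-responders and can carry a live completion figure.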
If you see a need to modify your survey, don’t be afraid to do it. While not ideal, you can absolutely change a study while it is in the field. A correction in the field can solve glaring problems that would compound otherwise. Focus less on the consistency of a metric and more on having the most authentic conversation possible with your customers.
If responding customers are using your open-text field to tell you your survey is too long or intrusive, shorten it or change it. A note to the sample group letting them know that you heard them and have taken action is a positive CX move and will drive engagement across the board.
Watch the data roll in, track your response rates, and walk a topline report into the corner office. Be ready and able to explain why you used every word you did and share why you deployed the project at the time you selected. Come prepared to justify any changes you made with data-driven evidence and begin to think about changes you’d make in the next iteration. Enjoy your time in the field, because this is your moment to catch your breath.
When you reach your predetermined number of responses and find answers to the questions that you carefully constructed, it’s time to share the link to the real-time dashboard that stands as your final report with your key stakeholders. Now the work begins in earnest, so stay tuned for the last installment!
Eddie Accomando, XM Scientist at Qualtrics, is an applied anthropologist with 25 years of experience in the design, deployment, and maintenance of enterprise-wide CX programs. He brings a strong methodological focus to real-world programs, applying qualitative and quantitative research techniques to reveal insights that drive action within organizations.