I've put together some survey design tips that I hope you'll find helpful. I realize that there are other variables to consider depending on the type of survey or the data collection methodology, but these general guidelines should apply regardless.
General Survey Guidelines
1. First and foremost, define and know your objective! As the saying goes, "garbage in, garbage out." If you don't have an objective in mind, your survey initiative will fail. Think about how you will analyze the responses and ask the questions in an appropriate manner.
2. Open your survey with a brief introduction, and I would state your objective (in customer-friendly terms) here, as well. Respondents want to know why you're conducting this survey and what you're going to be doing with their responses. Don't set expectations about actions and follow-up here that won't be executed. Also give an honest indication of how long the survey is or how long it will take.
3. Think about survey/question flow. Start with questions that warm up the respondent to the topic. As you dive into the survey, put questions in a natural, logical flow and in sections rather than jumping around in some illogical sequence. For example, in a post-transactional survey, ask questions in the order of the experience; note that brand awareness surveys come with their own set of requirements for how questions should be asked.
4. Know the reason for, and impact of, question placement. If you ask overall satisfaction at the beginning of the survey, you are getting a top-of-mind rating. If you place the question at the end of the survey, you have taken the respondent through the experience again via the flow of the survey and the questions asked, so the overall satisfaction rating will reflect that experience. You will get two different scores, depending on placement. Several years ago, I tested this theory on seven different surveys for seven different clients, and when the overall satisfaction (OSAT) question was asked first, the score was always lower. I had a client who insisted on moving the question from the end of the survey to the beginning after years of having it at the end. I warned that the score would drop if we did that; the client still chose to move the question, and in the end, their OSAT score dropped one full point (on a 10-point scale) from the previous year! (Know that any discussion around placement of the OSAT question can be a "religious" one, and there could be a variety of differing views and opinions on this topic.)
5. Be mindful of survey length. Transactional surveys can be brief, e.g., 10-15 questions max, whereas relationship surveys can be a bit longer, e.g., 50 questions (where respondents only see those questions relevant to them, in essence making the survey shorter). Other methodologies may call for longer surveys. Use attribute grids to group questions that logically belong together and share the same rating scale. And don't forget progress meters to let respondents know where they are.
6. Ask a mix of closed-ended and open-ended questions. It is not necessary to ask an open-ended question after every closed-ended question, e.g., every rating question. As a matter of fact, I strongly suggest you limit the number of open-ended questions in your survey. You need to have at least one, but don't have 20!
7. Question relevance also impacts survey length; each of the following will help keep your questions relevant.
- Don't ask things you already know about the customer, e.g., last purchase date, product purchased, date of support call, etc.
- Only ask questions that are relevant to that customer and his/her experience. For example, if you know the customer owns Product X and Product Y and recently called about support for Product X, don't ask questions about Product Y, too. Or don't ask questions about marketing materials in a support post-transactional survey.
- Don't allow other groups or departments to commandeer the survey by adding questions that are not relevant to the survey objective.
- Use smart survey techniques to skip questions not relevant based on responses to previous questions.
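The skip-logic idea in the last point can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the question IDs, texts, and the `show_if` structure are all made up, not from any particular survey platform): each question may declare a condition on an earlier answer, and the survey only shows questions whose conditions are met.

```python
# Hypothetical sketch of skip logic: each question may name an earlier
# question and the answer that must have been given for it to be shown.
questions = [
    {"id": "owns_product_x", "text": "Do you own Product X?"},
    {"id": "x_satisfaction", "text": "How satisfied are you with Product X?",
     "show_if": ("owns_product_x", "yes")},  # skipped unless they own it
    {"id": "recommend", "text": "Would you recommend us?"},
]

def visible_questions(answers):
    """Return the IDs of questions relevant to this respondent so far."""
    shown = []
    for q in questions:
        cond = q.get("show_if")
        if cond is None or answers.get(cond[0]) == cond[1]:
            shown.append(q["id"])
    return shown

# A respondent who answered "no" never sees the Product X follow-up:
print(visible_questions({"owns_product_x": "no"}))   # ['owns_product_x', 'recommend']
print(visible_questions({"owns_product_x": "yes"}))  # all three questions
```

Most survey tools provide this branching natively; the point of the sketch is simply that irrelevant questions should never reach the respondent at all.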
8. If your survey is going out to a global audience, be sure to offer respondents the option to take the survey in their preferred language.
9. Remember that you cannot collect personal information from anyone under 13 without parental consent.
When in doubt about general survey and sampling guidelines, follow the CASRO Code of Standards.
Question Writing Guidelines
1. Don't ask double-barreled or compound questions; keep each question to a single thought, not two. For example, if you ask about "quality and timeliness of issue resolution," I'm not really sure how to answer that. You have just asked me about two concepts: quality and timeliness. What if the quality was great, but it took you forever to resolve the issue?
2. Make sure your questions are not ambiguous. Write questions clearly. If a respondent pauses and asks, "What do they mean by that?" then the question is poorly constructed.
3. Ensure that the questions are actionable. Ask yourself, "If someone rated that question poorly, what would I fix as a result of that?" If you can't answer that question, then throw out the question.
4. Similarly, every question should have an owner. If you can't attribute the question to a department or individual who owns its response or rating, pitch it. You're just asking for the sake of asking. (Granted, there will be some questions, e.g., demographics, that don't fit that requirement and will be needed to make the survey analysis more robust and the data actionable.)
5. Your question response choices and rating scales should be mutually exclusive. And do your homework; make sure you provide a complete list of response choices. I hate when the one answer that should be there is missing. Be sure to provide an "Other (please specify)" when appropriate.
6. Don't ask leading or biased questions, e.g., "We know you loved our new soft drink. How much did you love it?"
7. Use randomization of response choices to avoid position bias, but use it judiciously; it doesn't make sense for every response-choice list.
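A common wrinkle when randomizing response choices is that anchor options such as "Other (please specify)" or "None of the above" should stay pinned at the end of the list. Here is a minimal sketch of that idea (the function name and choice list are illustrative, not from any real survey tool):

```python
import random

def randomize_choices(choices,
                      anchored=("Other (please specify)", "None of the above")):
    """Shuffle response choices to reduce position bias, keeping anchor
    options (e.g., "Other") fixed at the end of the list."""
    movable = [c for c in choices if c not in anchored]
    fixed = [c for c in choices if c in anchored]
    random.shuffle(movable)  # shuffle only the substantive choices
    return movable + fixed

choices = ["Price", "Quality", "Support", "Other (please specify)"]
print(randomize_choices(choices))  # order varies; "Other" is always last
```

Shuffling per respondent, rather than once per survey, is what actually spreads any position effect evenly across the sample.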
8. Use proper grammar and make sure you spell check!
9. Offer an "out" for questions, where appropriate. For example, not everyone wants to tell you their household income or about their children, and you may ask some questions for which they genuinely don't have an answer. Similarly, do not make every question in the survey required. This really makes for an awful respondent experience.
10. For open-ended questions, be specific. Ask exactly what you want to know, e.g., "What can we do to ensure you rate us a 10 on overall satisfaction next time?" Or, "Tell us the most important reason you recommended us to your friends."
11. And, last but certainly not least, I'll briefly address question scales. Like placement of the OSAT question, question scales are a religious discussion. Get 10 researchers in a room and get 10 different views of which scale is best and when. My point on scales will be this: be consistent in your use of scales within a survey. Clients have handed me surveys to review that have five different scales within a single survey. That's a disaster for a variety of reasons, not the least of which is the respondent experience.
12. Don't forget to thank your respondents for their time at the end of the survey!
I hope these tips are helpful. The main thing to keep in mind... as CX professionals, we know we need to think about the experience with a company from the customer perspective. The survey design process is no different: think about the customer experience as you design the surveys. After all, surveys in their simplest form are just another touchpoint that you'll want to execute flawlessly.
Come back for my next post, when I outline how to maximize response rates.