10 Ways to Get a Horrible Survey Response Rate

Jeff Sauro, PhD

You’ve worked hard designing your survey, and you need data to make better decisions for your product. That means you need people to actually answer your survey!

Unfortunately, in our quest to squeeze the most out of our precious participants, it’s difficult not to commit a few survey sins.

Inevitably, one or more of these response-rate killers will creep into your next survey project. Knowing more about them should help improve your next survey’s response rate.

  1. Long, Long, Long Surveys: There are usually many stakeholders with their own initiative or pet question that needs to get into the survey. Not surprisingly, too many questions lead to survey bloat and long surveys. The longer the survey, the higher the drop-out rate; this is especially the case when respondents aren’t engaged or get the same type of question over and over. If you must go long, mix things up and provide some sense of progress. If you must include every question, consider randomly assigning each respondent a subset (a minimal sketch of this appears after the list).
  2. Many Open-Ended Questions: Using a mix of open-ended comment boxes helps provide some of the “why” between pre-determined rating-scale questions. However, open-ended questions take more time and effort to answer. Surveys shouldn’t bring back memories of college exams. If you must have many open-ended questions, consider making them optional, or at least focus each question so respondents aren’t being asked to write an essay on customer satisfaction.
  3. Giant matrix questions: When multiple items share the same response options (e.g., strongly disagree to strongly agree), you can save screen space and participant reading time by putting the items in a matrix instead of repeating them. However, if the space saved by a matrix becomes an excuse to include many more items, you’ve more than offset any time savings.
  4. Impossible screening criteria: Looking for women aged 25–34 who bought a washing machine on their smartphone in the last month but have never heard of Maytag? Getting the right people to take the survey is important, but the more specific you get, the harder it is to find qualified respondents. We often see very similar responses from participants who meet some, but not all, of the criteria, so while targeting is important, getting too targeted can mean never filling your survey.
  5. No progress bar or sense of completion: When will it end? It’s a basic law of human behavior: we need to know where we are and where we’re going next. If your survey doesn’t have a progress bar, give the poor participant some sense of progress, even if it’s just words of encouragement sprinkled throughout the survey letting them know it will eventually end.
  6. Really personal questions right up front: Hi, how old are you? How much do you make a year? What do you spend on dining each month? It’s not a good way to start a conversation in person, and it’s not a good idea in surveys either. Gathering demographics is an important way to further analyze your respondents, but asking too many of these personal questions too early is a recipe for abandonment.
  7. Redundant and irrelevant questions: If a question won’t apply to a segment of your participants, try not to ask it. Too many “N/A” responses are usually a good clue. Take the burden off the participant by using logic and branching, or, even better, consider cutting the items.
  8. Confusing and challenging questions: Forced-rank questions help identify what really matters to respondents. However, ranking more than a few items gets challenging fast, especially when respondents don’t have strong opinions about more than a few of them. A little pre-testing with qualified participants will help identify the challenging questions. And for ranking more than a few items in particular, we recommend a top-task approach instead.
  9. Questions respondents have little or no opinion on: Please rate your level of agreement with the following statements about the Federal Reserve’s recent announcement on monetary policy. If you ask people for their opinion, they’ll give you one, even if it’s uninformed or they don’t really have a strong feeling either way. Too many of these questions are burdensome to answer. Bonus: remove the neutral or N/A response option from these questions and you’ll get both irrelevant data and higher dropout rates.
  10. No or low incentive: You don’t have to pay people to take a survey. In fact, sometimes the best response rates come from leveraging the interest participants have in having their opinion heard. But for many surveys, you’ll simply need to pay the right people the right amount to take it. The longer, duller, and more complex the survey, the more you should budget for incentives.
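
To make the random-assignment idea in #1 concrete, here is a minimal Python sketch. It assumes a hypothetical bank of 30 questions and a cap of 10 per respondent; the names (ALL_QUESTIONS, assign_questions) are illustrative rather than taken from any survey platform. Seeding the generator with the respondent ID keeps each person’s subset stable if they reload the survey, while different respondents see different subsets, so every question still collects responses across the sample.

```python
import random

# Hypothetical question bank: IDs of all candidate survey questions.
ALL_QUESTIONS = [f"Q{i}" for i in range(1, 31)]

# Assumed cap to keep any one respondent's survey short.
QUESTIONS_PER_RESPONDENT = 10

def assign_questions(respondent_id: str) -> list[str]:
    """Return a random subset of questions for one respondent.

    Seeding with the respondent ID makes the assignment reproducible,
    so a respondent who reloads the survey sees the same questions.
    """
    rng = random.Random(f"subset-{respondent_id}")
    return rng.sample(ALL_QUESTIONS, QUESTIONS_PER_RESPONDENT)

# Example: each respondent answers 10 of the 30 questions,
# and across many respondents every question gets coverage.
print(assign_questions("respondent-001"))
```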