Customer Survey FAQs

  • Q: What is the latest thinking about offering incentives to participants in customer satisfaction surveys? Certainly, incentives will improve response rate, but do they introduce an unacceptable level of bias?

    A: We recommend that all customer survey clients offer some sort of incentive to respondents to encourage participation. We’ve found that even a modest incentive can increase response rates dramatically. To reduce possible bias, in most cases we recommend a cash prize drawing. Cash is an incentive with universal appeal and avoids the potential bias that offering a particular piece of merchandise can introduce. A drawing is preferred over offering each individual a small incentive because, for the same cost, people are more motivated by the opportunity to win a large cash prize than by a smaller guaranteed reward. For example, if you were to sample 500 customers, you would likely generate a stronger response rate by offering a drawing for two $500 cash prizes than by offering a $2 gift card to each respondent.
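
    The cost equivalence in the example above can be checked with quick arithmetic. A minimal sketch, using only the figures from the answer:

```python
# Cost comparison from the example above: a drawing for two $500 cash
# prizes vs. a $2 gift card for each of 500 sampled customers.
sample_size = 500
drawing_cost = 2 * 500            # two $500 cash prizes
gift_card_cost = sample_size * 2  # a $2 gift card per customer

# Both options cost $1,000, so the larger headline prize of the
# drawing comes at no extra cost.
print(drawing_cost, gift_card_cost)  # 1000 1000
```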

  • Q: If I want to do a customer survey, what are the kinds of things I should ask about besides just overall customer satisfaction?

    A: A good customer satisfaction survey should include questions about key product or service attributes such as pricing, service, courtesy, professionalism, quality, delivery, and reliability. There should also be general questions about likelihood to recommend the product or service to others and likelihood to repurchase or continue to use. These attributes can be used in a key driver analysis to determine which have the greatest impact on overall satisfaction.

  • Q: We’ve been sending our clients an online satisfaction survey of about 20 questions every 6 months, but find that response rates are below 15%. We’re a little disappointed. What is a realistic response rate for B2B satisfaction surveys these days? In your experience, what incentives work well in B2B environments?

    A: We see response rates for B2B customer satisfaction surveys usually ranging from 10-25%. The most significant factors affecting response rates are (a) the nature of the relationship that customers have with your company (transactional vs. ongoing relationships), (b) the length of the survey itself, (c) the communication to your customers, and (d) the response incentive that is offered.

    To improve response rate, first examine your survey frequency. If your survey is based on transactional customer relationships, you may want to establish a continuous survey, where the survey is always available and triggered when a customer completes a particular transaction with your company. In this case, you may want to limit an individual respondent to receiving a survey only once every 6 months. If you are measuring ongoing customer relationships, you should survey each customer no more than once every 6 months.

    Length of survey also affects response rate. Limit the survey to only those questions that provide “need-to-know” information; completion should take no more than 10 minutes.

    When communicating with your customers in your invitation to respond to the survey, be concise, stress your appreciation, assure them that their responses will be kept confidential and anonymous, and indicate how you will use the results of the survey to improve your product or service.

    When incentives are offered for B2B customer surveys, a small gift for every survey respondent is recommended over a cash prize drawing (which is more effective for B2C surveys). This can be a promotional item with your corporate logo, which not only encourages survey participation but also has a marketing and brand-awareness advantage. It could also be a small discount on the next purchase, or access to a white paper published by your firm. Some B2B respondents may not be able to accept a cash or promotional incentive; in these cases, it is often effective to offer a charitable donation in the respondent’s name instead.

  • Q: How do you get customers to fill out surveys? Most of the time they are too busy or don’t give detailed answers.

    A: Motivating customers to complete an online customer survey is always a challenge, but certain “best practices” can improve survey response rates dramatically. Communicating the importance of the survey along with how the results will benefit customers directly is an important first step. We recommend an introductory letter from the company CEO, which demonstrates to customers how seriously you take the customer survey initiative. A proper incentive should also be offered to encourage participation (see the earlier question on incentives). To encourage more detailed survey responses, open-ended survey questions must be worded properly. For example, it’s better to ask, “What are three ways that Company XYZ can serve you better?” than “How can Company XYZ serve you better?”

  • Q: I’m trying to determine the optimal frequency of conducting a client satisfaction survey. These would be B2B – we’re a consulting company and would like to do a client satisfaction survey with our clients. Our current model is to conduct these twice a year; however, some people believe they only need to be done once a year. Any advice?

    A: Customer surveys should be conducted frequently enough to keep a “pulse” on customer sentiments, thus allowing you to make improvements to customer processes on a continual basis. For ongoing customer relationship surveys, we recommend that our clients conduct a customer satisfaction survey on either a semi-annual (twice per year) or an annual basis. The former is ideal if there is time between survey administrations to make changes based on the survey results and have these changes noticed by customers within that period. Otherwise, annual surveys are preferred. For transactional surveys, continuous survey administration as close as possible to the time of the customer transaction is recommended.

  • Q: What is the best timing of a customer survey? In your experience, do you get a better response rate depending on the month the survey is issued? For example, should we avoid July, August, and December?

    A: The best timing for an annual or semi-annual survey depends on your business cycle. If your business is seasonal, we recommend not conducting the survey during peak sales times. We also advise against administering surveys during peak holiday seasons such as late December, since response rates decrease when customers are out of the office.

    Another timing consideration arises when survey results feed into employee bonus incentives tied to individual or group scores. To prevent employees from trying to influence the scores (for example, by providing customers with extra services just before the survey administration), do not tell employees when the survey will be launched, and vary the date of administration from period to period.

  • Q: When using a 5 point Likert scale to measure quality of service, what are the best answer choices to use?

    A: For customer surveys incorporating a 5-point Likert scale (or “category identifier”) we usually use the following scale: (5) Very satisfied, (4) Somewhat satisfied, (3) Neither satisfied nor dissatisfied, (2) Somewhat dissatisfied, (1) Very dissatisfied. The key features of this scale are that it’s symmetrical and avoids descriptors with strong emotional connotations. We use a similar scale to measure agreement with certain statements or likelihood to engage in certain future behaviors.
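
    For analysis, the symmetrical scale above maps cleanly onto integer scores. A minimal sketch (the example responses are invented):

```python
# The 5-point satisfaction scale from the answer above, coded as
# integers so responses can be averaged and compared over time.
LIKERT_5 = {
    5: "Very satisfied",
    4: "Somewhat satisfied",
    3: "Neither satisfied nor dissatisfied",
    2: "Somewhat dissatisfied",
    1: "Very dissatisfied",
}

def mean_score(responses):
    """Average numeric score for a list of 1-5 responses."""
    return sum(responses) / len(responses)

# Hypothetical batch of responses.
scores = [5, 4, 4, 3, 5, 2]
print(f"Mean satisfaction: {mean_score(scores):.2f}")
```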

  • Q: How often should customers be surveyed online? Does it depend on the type of survey? How often should results be calculated?

    A: We recommend that online customer surveys be conducted on an annual or semi-annual basis. If your product or service is more transactional in nature, you may consider a survey that is continuously in the field, which respondents are asked to complete immediately after a transaction. Regardless of the survey administration frequency, no single respondent should be asked to participate more than once per quarter, to avoid respondent fatigue. In the case of ongoing customer surveys, we usually report results monthly. For clients who receive basic monthly reports, we also offer an annual “comprehensive” report including trend analysis, regression analysis, and results broken down by customer segment subgroups.

  • Q: How effective is it to let customers know the results of a survey?

    A: Customer survey results are usually not distributed to customers. Most companies consider their survey results confidential information and therefore feel uncomfortable releasing this data publicly. Also, any negative survey results may damage customers’ perceptions of the firm.

  • Q: How would you suggest improving the response rate to an online client survey? We already offer an incentive for each response, as well as entering them in a regular prize drawing.

    A: There are three main factors that affect your survey response rate: 1) survey length, 2) incentives, and 3) communications. Since you are already offering a dual incentive to respondents, focus on #1 and #3. There is a strong inverse relationship between survey length and response rates, so shortening your survey will likely help. You should also send succinct, professional communications to your customers outlining the purpose of the survey and how you value their response. These communications should remind customers that you will pay sharp attention to the survey results and act upon the findings. Finally, it is critical that you do act upon the findings of your survey and tell customers how you are implementing the results.

  • Q: What is the optimal number of questions for a customer survey?

    A: We usually recommend that customer surveys be no longer than 25 questions, which translates into about 5-7 minutes of completion time. However, that assumes a variety of question formats dealing with separate aspects of the customer experience. What really matters is the survey completion time.

  • Q: For B2C surveys, how much of an increase in response rate do you believe would arise for an online survey as opposed to a postal one?

    A: These days we typically see better response rates for online surveys than for paper surveys, thanks to strong Internet penetration and consumers who are more technically savvy than ever. Respondents usually prefer the online format due to convenience, ease of survey completion, and lower time requirements. As for the actual boost you would see in your survey response rate going from paper to online, it really depends upon your target audience.

  • Q: Are very satisfied or very dissatisfied customers any more likely to respond to a customer satisfaction survey? In other words, are customer satisfaction survey responses biased based upon who decides to complete the survey?

    A: There is an observed response bias in customer satisfaction surveys towards the customers on either extreme of the satisfaction scale, particularly those who are very dissatisfied. In other words, very satisfied or very dissatisfied customers are more likely to respond to a survey invitation than those towards the middle. The best way to combat the resulting “polarization” of customer survey data is to ensure the maximum survey response rate possible, thus capturing more of the ambivalent customers and making the customer survey responses more representative of the overall customer base.
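
    The effect of this polarization can be illustrated with a small sketch. The shares and response propensities below are entirely hypothetical; the point is only that when dissatisfied customers respond at higher rates, the observed average drops below the true average:

```python
# Hypothetical true distribution of a 1-5 satisfaction score across
# the whole customer base, and response propensities that are higher
# at the extremes (highest among the very dissatisfied).
true_share   = {1: 0.10, 2: 0.15, 3: 0.30, 4: 0.25, 5: 0.20}
respond_prob = {1: 0.50, 2: 0.20, 3: 0.08, 4: 0.12, 5: 0.30}

true_mean = sum(score * share for score, share in true_share.items())

# Mean among respondents: weight each score by its share times its
# response propensity, then normalize.
weights = {s: true_share[s] * respond_prob[s] for s in true_share}
total = sum(weights.values())
observed_mean = sum(s * w for s, w in weights.items()) / total

print(f"true mean: {true_mean:.2f}, observed mean: {observed_mean:.2f}")
```

    Raising the overall response rate pulls the propensities closer together, which moves the observed mean back toward the true mean.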
