Frequently Asked Questions

Infosurv posts these Frequently Asked Questions (FAQs) as a resource for our clients, prospective clients, and website visitors.

Each of these questions was submitted by a website visitor like you. Please click on the categories below to see the related knowledge base. If you have a question that is not answered by our FAQ, please submit it to us via the form below.

Q: If I’m doing a consumer-targeted online survey, how large a sample do I need for statistical reliability?

A: To determine the optimal sample size for a consumer survey, you must strike a balance between cost and reliability, that is, the level of sampling error you can tolerate. A larger sample provides greater reliability, but at a higher cost. A smaller sample costs less, but yields a higher margin of sampling error. The table below provides the error margins for a range of sample sizes at the 95% and 90% confidence levels. For example, at the 95% confidence level with a sample size of 100, the answers to your survey questions will be within +/- 9.8% of the actual, unknown “real” answer 95% of the time if you repeated the survey an infinite number of times.

Sample Size    95% Confidence Level    90% Confidence Level
100            +/- 9.8%                +/- 8.2%
200            +/- 6.9%                +/- 5.8%
300            +/- 5.6%                +/- 4.7%
400*           +/- 4.9%                +/- 4.1%
500            +/- 4.4%                +/- 3.6%
600            +/- 4.0%                +/- 3.3%

*As you can see, once you reach a sample size of about 400, the incremental reduction in the sampling error margin begins to diminish and may not justify the added cost of the larger sample.
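
For readers who want to reproduce these figures, they follow the standard large-population formula for the sampling error of a proportion at maximum variance (p = 0.5): margin = z * sqrt(p(1 - p) / n), where z is approximately 1.96 for 95% confidence and 1.645 for 90% confidence. The short Python sketch below is illustrative only (the function name is ours, not part of any survey tool) and matches the table above to within about a tenth of a percentage point of rounding.

```python
import math

def margin_of_error(n, confidence=0.95, p=0.5):
    """Approximate sampling error margin for a proportion, assuming a
    large population and the worst-case variance at p = 0.5."""
    z = {0.90: 1.645, 0.95: 1.96}[confidence]  # common z-scores
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 300, 400, 500, 600):
    print(f"n = {n}: +/- {margin_of_error(n, 0.95):.1%} at 95%, "
          f"+/- {margin_of_error(n, 0.90):.1%} at 90%")
```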

Another consideration in determining sample size is your need to compare subgroups through cross-tabulations. If so, design your total sample size so that each subgroup contains enough responses to provide the desired level of reliability.
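
For example, continuing the illustrative formula above, you can work it in reverse: first compute the sample size a subgroup needs to hit its own target error margin, then scale up by the share of total responses that subgroup is expected to represent. The figures below (a subgroup making up 20% of responses, with a +/- 7% target at 95% confidence) are hypothetical.

```python
import math

def required_sample_size(margin, confidence=0.95, p=0.5):
    """Smallest n whose sampling error stays within the target margin."""
    z = {0.90: 1.645, 0.95: 1.96}[confidence]
    return math.ceil((z / margin) ** 2 * p * (1 - p))

subgroup_share = 0.20                              # subgroup expected to be 20% of responses
n_subgroup = required_sample_size(0.07)            # ~196 completes needed in the subgroup
n_total = math.ceil(n_subgroup / subgroup_share)   # ~980 completes needed overall
print(n_subgroup, n_total)
```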

Infosurv can provide the consultation and expertise to design a sample size and structure that efficiently and effectively meets your objectives.

Q: Where do you find a sample that is truly representative of the consumer audience I am trying to reach?

A: Through our relationships with various online sampling partners, Infosurv can provide targeted respondent sampling services for clients. Our sampling partners all maintain panels of volunteer survey respondents for both Consumer and Business-to-Business surveys that closely mirror the United States population in terms of gender, region, race and household income per the latest US Census data. We also have access to global panel data for International studies.

Q: How long do surveys typically take to administer?

A: Administration or “fielding” of an online market research survey typically requires 2-10 business days. The variables that affect fielding time include:

  1. The sample size
  2. The incidence of the target sample (i.e., the percentage of people in the population who meet your requirements)
  3. The length of the survey in minutes

Once these variables are determined, Infosurv can provide a projected timeline for your project.

Q: Do you conduct global online research?

A: Yes, Infosurv does conduct global market research. Utilizing our network of sampling partners, we can gather sample from almost any country in the world. In addition, we support the administration of surveys in multiple languages, including double-byte languages like Chinese and Japanese.

Q: I am starting a new business and need to determine whether my concept is viable, learn more about my target audience, and show potential investors that consumers are excited about my idea. What types of questions should I ask?

A: The survey you need to conduct is called a new product concept test. New product concept tests are designed to gauge the market potential of a new product or service that has been conceived but not yet marketed to consumers. These studies typically include questions to measure consumers’ likelihood to buy, expected price points, expected points of distribution, and perceptions versus existing competing products. You might also consider conducting a market assessment, which is designed to form a “snapshot” of your target market in terms of demographics (age, gender, income, etc.), psychographics (hobbies, interests, wants, needs, fears, aspirations), media consumption, and behaviors (where they shop, how often they go out, and how much money they spend).

Q: We work in a highly competitive industry and would like to know what marketing strategies to use to compete successfully. Can a survey be used to help us?

A: Market research surveys can be used effectively to measure the relative awareness and perceptions of your brand among target consumers. This allows you to see your competitive strengths and weaknesses and pinpoint where to invest your company’s resources. Companies that apply this technique present a believable and consistent brand message to their customers, resulting in greater sales, growth in market share, and higher profits.

Q: What is the latest thinking about offering incentives to participants in customer satisfaction surveys? Certainly, incentives will improve the response rate, but do they introduce an unacceptable level of bias?

A: We recommend that all customer survey clients offer some sort of incentive to respondents to encourage participation. We’ve found that even a modest incentive can increase response rates dramatically. To reduce possible bias, in most cases we recommend a cash prize drawing. Cash has universal appeal and will not create the potential bias that offering merchandise could introduce. A drawing is preferred over offering each individual a small incentive because, for the same cost, people will be more motivated by the opportunity to win a large cash prize than by a smaller per-person reward. For example, if you were to sample 500 customers, you would likely generate a stronger response rate by offering a drawing for two $500 cash prizes than by offering a $2 gift card to each respondent, even though both options cost $1,000.

Q: If I want to do a customer survey, what are the kinds of things I should ask about besides just overall customer satisfaction?

A: A good customer satisfaction survey should include questions about key product or service attributes or characteristics such as pricing, service, courtesy, professionalism, quality, delivery, reliability, etc. There should also be general questions about likelihood to recommend the product or service to others and likelihood to repurchase or continue using it. These attributes can be used in a key driver analysis to determine which have the greatest impact on overall satisfaction.
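
Key driver analysis can be done in several ways; one simple and common approach is to correlate each attribute rating with the overall satisfaction rating and rank the attributes by correlation strength (more rigorous studies use multiple regression or relative-weights methods). The Python sketch below is illustrative only; the attribute names and ratings are hypothetical.

```python
import numpy as np

# Hypothetical 1-5 ratings from five respondents; in practice these
# would come from your survey data file.
ratings = {
    "pricing":  [4, 3, 5, 2, 4],
    "service":  [5, 4, 5, 3, 4],
    "delivery": [3, 3, 4, 2, 5],
}
overall_satisfaction = [5, 4, 5, 2, 4]

# Simple key driver analysis: rank attributes by their correlation
# with overall satisfaction. Stronger correlation = bigger "driver".
drivers = {
    attr: np.corrcoef(scores, overall_satisfaction)[0, 1]
    for attr, scores in ratings.items()
}
for attr, r in sorted(drivers.items(), key=lambda kv: -kv[1]):
    print(f"{attr}: r = {r:.2f}")
```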

Q: We’ve been sending our clients an online satisfaction survey of about 20 questions every 6 months, but find that response rates are below 15%. We’re a little disappointed. What is a realistic response rate for B2B satisfaction surveys these days? In your experience, what incentives work well in B2B environments?

A: We see response rates for B2B customer satisfaction surveys usually ranging from 10-25%. The most significant factors affecting response rates are (a) the nature of the relationship that customers have with your company (transactional vs. ongoing relationships), (b) the length of the survey itself, (c) the communication to your customers, and (d) the response incentive that is offered.

To improve your response rate, first examine your survey frequency. If your survey is based on transactional customer relationships, you may want to establish a continuous survey that is always available and is triggered when a customer completes a particular transaction with your company. In this case, you may want to limit an individual respondent to receiving a survey only once every 6 months. If you are measuring ongoing customer relationships, survey each customer no more than once every 6 months.

Survey length also has an impact on response rate. Limit the survey to questions that provide “need-to-know” information; completion time should be no more than 10 minutes.

When communicating with your customers in your survey invitation, be concise, stress your appreciation, assure them that their responses will be kept confidential and anonymous, and explain how you will use the results to improve your product or service.

When incentives are offered for B2B customer surveys, a small gift for every survey respondent is recommended over a cash prize drawing (which is more effective for B2C surveys). This can be a promotional item with your corporate logo, which not only encourages survey participation but also has a marketing and brand awareness benefit. It could also be a small discount on their next purchase, or access to a white paper published by your firm. Some B2B respondents may not be able to accept a cash or promotional incentive; in these cases, it is often effective to offer a charitable donation made in the respondent’s name instead.

Q: How do you get customers to fill out surveys? Most of the time they are too busy or don’t give detailed answers.

A: Motivating customers to complete an online customer survey is always a challenge, but certain best practices can improve survey response rates dramatically. Communicating the importance of the survey, along with how the results will benefit customers directly, is an important first step. We recommend an introductory letter from the company CEO, which demonstrates to customers how seriously you take the customer survey initiative. A proper incentive should also be offered to encourage participation (see the previous answer on incentives). To encourage more detailed survey responses, open-ended survey questions must be worded properly. For example, it’s better to ask, “What are three ways that Company XYZ can serve you better?” than “How can Company XYZ serve you better?”

Q: I’m trying to determine the optimal frequency for conducting a client satisfaction survey. These would be B2B surveys; we’re a consulting company and would like to survey our clients. Our current model is to conduct these surveys twice a year; however, some people believe they only need to be done once a year. Any advice?

A: Customer surveys should be conducted frequently enough to keep a “pulse” on customer sentiment, allowing you to improve customer processes on a continual basis. For ongoing customer relationship surveys, we recommend that our clients conduct a customer satisfaction survey on either a semi-annual (twice per year) or an annual basis. The former is ideal if there is enough time between survey administrations to make changes based on the results and have those changes noticed by customers within that period; otherwise, annual surveys are preferred. For transactional surveys, continuous administration, as close to the time of the customer transaction as possible, is recommended.

Q: What is the best timing of a customer survey? In your experience, do you get a better response rate depending on the month the survey is issued? For example, should we avoid July, August, and December?

A: The best timing of an annual or semi-annual survey depends on your business cycle. If your business is seasonal, we recommend that you not conduct the survey during peak sales periods. We also advise against administering surveys during peak holiday seasons, such as late December, since response rates decrease when customers are out of the office.

Another timing consideration arises if survey results are used as a bonus incentive for employees whose individual or group scores are positive. To prevent employees from trying to influence the scores (for example, by providing customers with extra services just before the survey administration), do not tell employees when the survey will be launched, and vary the administration date from period to period.

Q: When using a 5-point Likert scale to measure quality of service, what are the best answer choices to use?

A: For customer surveys incorporating a 5-point Likert scale (or “category identifier”), we usually use the following scale: (5) Very satisfied, (4) Somewhat satisfied, (3) Neither satisfied nor dissatisfied, (2) Somewhat dissatisfied, (1) Very dissatisfied. The key features of this scale are that it is symmetrical and avoids descriptors with strong emotional connotations. We use a similar scale to measure agreement with statements or likelihood to engage in certain future behaviors.

Q: How often should customers be surveyed online? Does it depend on the type of survey? How often should results be calculated?

A: We recommend that online customer surveys be conducted on an annual or semi-annual basis. If your product or service is more transactional in nature, you may consider a survey that is continuously in the field and that respondents are asked to complete immediately after a transaction. Regardless of the administration frequency, no single respondent should be asked to participate more than once per quarter, to avoid respondent fatigue. For ongoing customer surveys, we usually report results monthly. For clients who receive basic monthly reports, we also offer an annual “comprehensive” report including trend analysis, regression analysis, and results broken down by customer segment subgroups.

Q: How effective is it to let customers know the results of a survey?

A: Customer survey results are usually not distributed to customers. Most companies consider their survey results confidential information and therefore feel uncomfortable releasing this data publicly. Also, any negative survey results may damage customers’ perceptions of the firm.

Q: How would you suggest improving the response rate to an online client survey? We already offer an incentive for each response, as well as entering them in a regular prize drawing.

A: There are three main factors that impact your survey response rate: 1) survey length, 2) incentives, and 3) communications. Since you are already offering a dual incentive to respondents, you should probably focus on #1 and #3. There is a strong inverse relationship between survey length and response rate, so shortening your survey will likely help. You should also send succinct, professional communications to your customers outlining the purpose of the survey and how much you value their responses. These communications should remind customers that you will pay close attention to the survey results and act on the findings. Finally, it is critical that you do act on the findings and tell customers how you are implementing the results.

Q: What is the optimal number of questions for a customer survey?

A: We usually recommend that customer surveys be no longer than 25 questions, which translates into about 5-7 minutes of completion time. However, that assumes a variety of question formats dealing with separate aspects of the customer experience. What really matters is the survey completion time.

Q: For B2C surveys, how much of an increase in response rate do you believe would arise for an online survey as opposed to a postal one?

A: These days we typically see better response rates for online surveys than for paper surveys, thanks to strong Internet penetration and consumers who are more technically savvy than ever before. Respondents usually prefer the online format because of its convenience, ease of completion, and lower time requirement. As for the actual boost in response rate you would see by moving from paper to online, it really depends on your target audience.

Q: Are very satisfied or very dissatisfied customers any more likely to respond to a customer satisfaction survey? In other words, are customer satisfaction survey responses biased based upon who decides to complete the survey?

A: There is an observed response bias in customer satisfaction surveys towards customers at either extreme of the satisfaction scale, particularly those who are very dissatisfied. In other words, very satisfied or very dissatisfied customers are more likely to respond to a survey invitation than those towards the middle. The best way to combat the resulting “polarization” of customer survey data is to ensure the maximum possible response rate, thus capturing more of the ambivalent customers and making the responses more representative of the overall customer base.

Q: Should I survey every employee in my company or just a random sample?

A: Infosurv strongly recommends surveying each employee. Employee surveys do more than measure employee satisfaction and opinions — they also send a message to employees that management cares. When companies decide to survey only a sample of employees, they may save a few dollars and still collect statistically valid data, but those employees left out of the study won’t feel as good about the company as those included.

Q: If we do an employee survey, how often should it be re-administered?

A: We recommend that our clients conduct an employee survey on at least an annual basis, and perhaps more often if the company is going through a period of rapid change (such as acquisitions, layoffs, or high growth).

Q: What kinds of questions should an employee survey cover?

A: Common topic areas of interest for an employee survey include:

  • Overall satisfaction
  • Corporate culture
  • Supervisor relations
  • Training
  • Pay and benefits
  • Work environment
  • Communications

Within each of these topic areas, a set of questions can be designed relevant to your company’s needs and objectives.

Q: To what extent can you leverage the “anonymity” of the web to effectively incorporate a peer/management review component into employee surveys?

A: One of the chief advantages of online employee survey administration is the anonymity that the Web lends to employee responses. We often conduct 360-degree surveys with the specific goal of collecting peer or management feedback in an anonymous and unbiased fashion. Employees are usually more comfortable completing a survey online than on paper because 1) their handwriting is not on the survey, and 2) their responses are transmitted directly to an independent third party, Infosurv, for analysis without ever passing through the hands of a manager. Peer or management review can also be incorporated into a standard employee satisfaction survey with results reported by manager. Since each manager’s results are reported only in aggregate, individual respondent anonymity is guaranteed.

Q: My company wants to do an employee survey but not anonymously. How can they expect to get true results?

A: We strongly advise against non-anonymous employee surveys. Many studies have shown that employees respond differently to satisfaction and opinion surveys when their anonymity is not protected. Responses to non-anonymous surveys are often “sugar coated” by employees for fear of retribution from management. If a company hopes to gain an accurate picture of employee satisfaction, the survey must be administered via a 100% anonymous method.

Couldn’t find what you were looking for? Please complete the form below and we’ll get back to you ASAP with our response, and post your question and our answer to our FAQ for future website visitors.
