We’ve seen it all: the good, the bad, and the ugly. As data becomes more ubiquitous, it’s becoming harder, and at times downright confusing, to interpret, which can lead to mistakes and miscommunication.
So here are 3 common mistakes you can easily avoid when interpreting and presenting data:
- Overstating your data: We’ve all heard statements like “customers were twice as satisfied.” While a value of 4 is twice that of 2, satisfaction ratings are ordinal, not ratio-scaled, so the statement is misleading. We also see this in political polling, including the current presidential polls, where one nominee is “leading by X percent.” That “X percent” is rarely a dependable number, due to typically small sample sizes and uncertainty about whether poll participants are even registered voters.
- Not reporting on dissatisfaction (and vice versa): It’s not uncommon to see great customer or employee satisfaction survey results, sometimes an overall top-two-box satisfaction score of over 85%. However, even a 99% satisfaction score does not mean all customers or employees are satisfied. At Infosurv we believe in showing the full picture: do those who aren’t satisfied fall in the neutral box or the dissatisfied box? While stating “almost all customers are satisfied” is not false, there should always be room for further interpretation (even if 85% are pretty happy, what if the other 15% are deeply dissatisfied?). Further analysis might reveal an isolated problem area that needs to be addressed.
- Using the word “significant” to bring attention to differences in data: While there’s nothing wrong with drawing attention to data using the word “significant,” a statistically significant result may not tell the whole story. For example, we’ve seen data where 65% were satisfied in Year 1 and 95% were satisfied in Year 2. A test can show the difference is significant, but that can be misleading, especially when the sample size was 300 in Year 1 and only 50 in Year 2. There’s more to the story when you factor in sample size, demographics, and so on. Interpreting data doesn’t mean regurgitating the statistics; the evidence needs to be weighed holistically.
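To see why a sample of only 50 deserves extra caution, a quick margin-of-error check helps. Below is a minimal Python sketch (the `margin_of_error` helper and the worst-case p = 0.5 assumption are ours; the sample sizes are the hypothetical ones from the example above), showing that the smaller sample’s margin is more than twice as wide:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p observed in a
    sample of n (normal approximation; a back-of-envelope check,
    not a full significance test)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst-case (p = 0.5) margins for the two hypothetical sample sizes
print(f"n=300: ±{margin_of_error(0.5, 300):.1%}")  # about ±5.7 points
print(f"n=50:  ±{margin_of_error(0.5, 50):.1%}")   # about ±13.9 points
```

Note too that with only 50 respondents and 95% satisfied, the normal approximation itself is shaky; just two or three dissatisfied respondents account for the entire other 5%.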
For more information on reporting, read our blog article on how to write reports that get read.
As marketing researchers, we must dig beneath the surface of our data to tell a story that isn’t leading and doesn’t cherry-pick only what we want to tell. Clients will use this information to make critical decisions at their companies, and correct interpretation of their data makes all the difference in the world to them!
While these are just three mistakes to avoid when interpreting and presenting your data, we’d love to hear what you’ve seen or read. Leave your comments below!