
6 things to look out for when using SurveyMonkey (and other DIY tools).



Do-it-yourself survey tools such as SurveyMonkey and Qualtrics have made it easier for companies to conduct research quickly and affordably - and without having to hire an agency. But as Peter Parker’s Uncle Ben says, “With great power comes great responsibility.” Tools are empowering, but you still need to learn how to use them well.


Here, in no particular order, are 6 common methodology mistakes to watch out for when using DIY tools: 


  1. Not optimizing surveys for mobile. A large proportion of consumers take surveys on their phones. Grid questions that require both vertical and horizontal scrolling aren’t user friendly on a small screen, and survey takers will find it hard to see all the response options. This can lead to frustration, drop-off, or random selection of answers just to get past these questions.

  2. Ignoring audience composition. You should always be aware of who is responding to your surveys. Customer surveys are often over-represented by the loud minority, and general population surveys sourced from online panels often skew female and lower-income. Without accounting for these skews, either by weighting the data or by being transparent about them when reporting results, companies can end up making decisions that don’t reflect the needs and wants of their broader customer base or target audience. (A simple weighting sketch follows this list.)

  3. Not doing qualitative research or stakeholder interviews before drafting your survey. Without these up-front exploratory conversations, surveys tend to leave out important questions and/or responses. “Select all that apply” questions that don’t include the most likely responses can result in up to half of survey takers selecting “Other” rather than an answer provided. Sometimes the “Other” item includes a “write in” or “please specify” component, but coding these written-in answers is time-consuming. By doing the groundwork up front, you can ensure that the most important response options (i.e. those people are most likely to choose) are included in your original list.

  4. Insufficient data. A survey that gets you data doesn’t necessarily give you all the answers you need. In brand health surveys, for instance, a significant proportion of respondents overstate intent or confuse brands, which inflates awareness and usage figures: just because 70% of respondents say they are likely to buy a product does not mean that anything close to 70% will actually buy it. You can compare this data point against competitors or track it over time, but it’s risky to treat the absolute number as fact.

  5. Misinterpretation of statistical concepts. Interpreting survey data isn’t easy or straightforward. We’re often asked, “What sample size do we need to have statistical significance?” Sample sizes can’t be statistically significant (or insignificant); they can only have margins of error. Statistical significance becomes relevant when you’re comparing differences between subgroups (men versus women, Gen Z versus Gen X) or between waves of research (this year’s brand awareness compared to last year’s). Understanding these concepts gives you a solid foundation for interpreting data in a way that drives informed decision making; a lack of understanding can lead you to draw conclusions about differences between groups that aren’t actually meaningful. (A worked example follows this list.)

  6. Inattention to data quality. Unless you’re comfortable with between thirty and fifty percent of your online panel survey responses coming from bots or low-quality respondents (like professional survey takers), you need to be both proactive (building quality checks into the survey) and reactive (cleaning suspect responses after fielding) to get truly meaningful insights. Poor data quality is a serious issue, and one that many agencies don’t want to address because of the costs involved in securing quality survey takers. (Two basic cleaning checks are sketched after this list.)
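
To make the weighting idea in #2 concrete, here is a minimal sketch that down-weights over-represented groups so the sample matches a population target. It assumes a pandas DataFrame called responses with a gender column, and the 50/50 target and satisfaction scores are purely illustrative - in practice your targets would come from census or customer data.

```python
# A minimal sketch of cell weighting on one variable (gender).
# All figures below are illustrative, not real census or survey data.
import pandas as pd

responses = pd.DataFrame({
    "gender": ["F", "F", "F", "M", "F", "M", "F", "M", "F", "F"],
    "satisfaction": [4, 5, 3, 4, 2, 5, 4, 3, 5, 4],
})

# Illustrative population targets: 50% female, 50% male.
population_share = {"F": 0.50, "M": 0.50}

# Observed share of each group in the sample.
sample_share = responses["gender"].value_counts(normalize=True)

# Weight = target share / observed share, so over-represented groups count less.
responses["weight"] = responses["gender"].map(
    lambda g: population_share[g] / sample_share[g]
)

# Compare the unweighted and weighted mean satisfaction scores.
unweighted = responses["satisfaction"].mean()
weighted = (responses["satisfaction"] * responses["weight"]).sum() / responses["weight"].sum()
print(f"Unweighted: {unweighted:.2f}  Weighted: {weighted:.2f}")
```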
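
To illustrate the concepts in #5, the sketch below calculates the margin of error for a single proportion and runs a two-proportion z-test between subgroups, assuming simple random sampling. The numbers (42% awareness among 400 men versus 48% among 400 women) are invented for illustration; note that even a six-point gap is not statistically significant with samples of this size.

```python
# A minimal sketch, assuming simple random sampling, of (a) the margin of
# error for a single proportion and (b) a two-proportion z-test between
# subgroups. All figures are illustrative.
from math import sqrt
from statistics import NormalDist

z = NormalDist().inv_cdf(0.975)  # ~1.96 for a 95% confidence level

# (a) Margin of error for a proportion p observed in a sample of size n.
p, n = 0.42, 400
moe = z * sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {moe:.1%}")  # roughly +/- 4.8 points

# (b) Is the gap between two subgroups statistically significant?
p1, n1 = 0.42, 400   # e.g. men
p2, n2 = 0.48, 400   # e.g. women
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z_stat = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z_stat)))
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")  # p > 0.05: not significant here
```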
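
And for #6, two of the simplest reactive checks are flagging “speeders” (implausibly fast completions) and “straight-liners” (identical answers across an entire grid). The sketch below shows one way to apply them; the column names and thresholds are assumptions for illustration, not industry standards.

```python
# A minimal sketch of two common quality flags: speeders and straight-liners.
# Column names, thresholds, and data are assumptions for illustration.
import pandas as pd

responses = pd.DataFrame({
    "duration_seconds": [510, 95, 640, 70, 430],
    "q1": [4, 5, 3, 5, 2],
    "q2": [5, 5, 2, 5, 3],
    "q3": [3, 5, 4, 5, 2],
})

grid_cols = ["q1", "q2", "q3"]
median_time = responses["duration_seconds"].median()

# Flag anyone finishing in under a third of the median completion time.
responses["speeder"] = responses["duration_seconds"] < median_time / 3
# Flag anyone giving the same answer to every question in the grid.
responses["straight_liner"] = responses[grid_cols].nunique(axis=1) == 1

print(responses[["speeder", "straight_liner"]])
```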


Survey tools give us the impression that research is easy, but there is much more to it than writing a few questions and programming them into a tool. Let Spark Insights worry about the methodology so you don’t have to. Contact us to see how we can give you the methodologically sound data you need, at a price that’s within your budget. 

