Is Your Online Survey Like a Dinner Conversation With Your 12-Year-Old?

November 17, 2016

In the understatement of the century, technology has changed the landscape of marketing research. The days of mailing hard-copy surveys or going door-to-door to collect responses are close to extinct (here's looking at you, U.S. Census Bureau). The popularity of cell phones, mobile devices, and computers has made it incredibly easy for survey companies to reach respondents for feedback. Thanks to these cost and time savings, online surveys are almost the default "go-to" methodology when drafting a quantitative proposal in the industry.


Unfortunately, the passive nature of online surveys has hurt the industry. How? Every survey analyst has experienced the feeling of clicking send on a mass batch of email invitations, leaning back in their chair, and saying, "They're out; nothing I can do now until fieldwork is closed." Right? This passive approach is commonplace. You click send and feel like your work is done because it is now in the respondents' hands to reply. In reality, though, passive methodologies like online surveys, where there is no person-to-person (P2P) interaction, have a real impact on data quality.


As stated above, with all online surveys we are at the mercy of the respondent. Other than an email reminder or a potential follow-up call, the sender has virtually no control. Respondents choose whether to do the survey, when to do it, which distractions to engage in while doing it (NFL Sunday football and discrete choice-based conjoint analysis do not mesh), and how much or how little time to spend on it.


The deck seems pretty stacked in their favor if you ask me. As a result, we have all seen the dreaded open-ended comments that come back from online surveys. For instance, your survey asks a 1-to-5 satisfaction scale followed by a simple "Why?" Chances are your online responses will include some gems like "they're good," "bad service," "dislike the company," "unhappy," and other comments at the most generic level possible. As an analyst, there is nothing you can do with this data.


Oftentimes, online surveys work like dinner table conversations with your 12-year-old son:


  • Dad: "How was school today?"

  • Son: "Good."

  • Dad: "Anything exciting happen?"

  • Son: "Nope."

  • Dad: "What was your favorite class?"

  • Son: "Lunch."

  • Dad: "Rate your Mom's meatloaf on a 0 to 10 likelihood to recommend scale so I can calculate her meatloaf's NPS."

  • Son: "You're weird."


Other than the last part, does this sound familiar? Not a real in-depth look at your child's day, is it? Not all survey respondents act like this, but many do.
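As an aside, the meatloaf NPS quip does rest on a real formula: Net Promoter Score is the percentage of promoters (ratings of 9-10) minus the percentage of detractors (0-6). Here is a minimal sketch in Python; the ratings are made up for illustration:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count in the base but not the numerator.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical meatloaf ratings from ten respondents.
ratings = [10, 9, 9, 8, 7, 7, 6, 5, 9, 10]
print(nps(ratings))  # 5 promoters, 2 detractors out of 10 -> 30.0
```

Note that NPS can range from -100 (all detractors) to +100 (all promoters), which is why a single generic "why?" follow-up rarely explains the number.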


The faces of your online surveys


Playing this out, if all of your survey responses came back with generic feedback like this, it would make for an awkward board presentation. If your report were the headline of a newspaper article, it would read something like, "Company ABC earns a 1.9 out of 5 for satisfaction; respondents say the score is driven by their dislike for the company and their unhappiness." The feedback is so generic it becomes unusable; it adds nothing beyond what the low score already tells you. What recommendations do you have as an analyst to improve Company ABC's satisfaction? Can you tell your board we need to "make people like our company" and "make customers happy"? Good luck with that strategic plan.


Passive methodologies leave you wanting more, and this is why it is critical to use supportive methodologies like phone surveys, in-depth interviews (IDIs), and focus groups to anchor your research. These more active methodologies provide the depth and quality of feedback needed to explain your satisfaction score of 1.9. Online surveys will offer some of the reasoning simply because of the large number of returned surveys; there are bound to be a few in-depth responses. You'll get a small number of respondents who are actively engaged and offer up lots of feedback (thank you, by the way, from all of us).


Of late, I've seen several survey platforms offering increased functionality and quality controls to improve online data quality. One feature is a minimum character limit on open-ends. Other, more advanced platforms include options for text-based probing. In these platforms, logic can be programmed to trigger pop-up messages if a response is shorter than a certain number of words. You can also set up message prompts based on specific keywords. So if someone mentions "meatloaf," you can have the program follow up with, "Can you give me a little more detail on the meatloaf?"
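For the curious, the logic behind these quality controls is simple enough to sketch. This is an illustrative Python mock-up of the word-count check and keyword-triggered probing described above, not any specific survey platform's API; the threshold and prompt wording are assumptions:

```python
# Minimum word count before an open-end is accepted (illustrative value).
MIN_WORDS = 5

# Keyword -> follow-up prompt, mirroring the "meatloaf" example above.
KEYWORD_PROBES = {
    "meatloaf": "Can you give me a little more detail on the meatloaf?",
}

def probe(response):
    """Return a follow-up prompt for a weak answer, or None to accept it."""
    words = response.lower().split()
    if len(words) < MIN_WORDS:
        return "Could you expand on that? A sentence or two helps a lot."
    for keyword, prompt in KEYWORD_PROBES.items():
        if keyword in words:
            return prompt
    return None

print(probe("Good."))  # too short -> generic probe
print(probe("The meatloaf was dry but the sides were great tonight."))
```

A real platform would surface the returned prompt as a pop-up and re-validate the revised answer, but the decision logic amounts to checks like these.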


These types of online survey functions help with data quality but do not substitute for the one-on-one interaction you gain through other methodologies. A conversational approach allows you, as the researcher, to actively follow up on items that are unclear or to dig deeper into why decisions are made. This personal interaction is invaluable and puts more control in the researcher's hands. It also leaves the respondent with a better takeaway from the experience and your company (spending 10 minutes talking with someone about their issues versus typing them up on a computer).


Online surveys are not going away anytime soon, and new features such as text-based probing and character limits are always being explored to improve data quality. It may not be a question of one approach or the other, but rather a combination of passive and active research for your next project. Using both qualitative and quantitative research to both explore and measure is your best option. Want to learn more about the Golden Rule of Qualitative Before Quantitative? Click here. Want to learn more about Drive Research and our market research services? Email us at info@driveresearch.com or call us at 315-303-2040.
