Working as a market research supplier, our team walks a delicate line each day keeping both our clients and our survey respondents happy. It is not always easy to do in our day-to-day roles.
However, understanding the perspectives of both audiences is critical in market research. If your service model and market research approaches focus too much on pleasing one of these audiences without considering the perspective of the other, you will set your market research project up for failure.
You will either a) make the survey too long and difficult for the respondent or b) make the survey so short that the client does not get the answers their organization needs.
In fact, when you think about it, if we did everything the sponsoring client wanted for each survey, we would not have any respondents willing to offer feedback. If we did everything the respondent wanted for each survey, we would not have any clients willing to pay for market research.
This is the constant battle in market research: client versus respondent.
It comes down to client needs and expectations versus respondent experience in surveys. Over the years we've seen far too many brands, market research companies, and agencies beat up on the survey respondent. It makes sense: the client is our customer. We should do everything (or nearly everything) they ask for.
Add a question? Sure. Add 5 questions? Sure. Add 3 categories to this grid? Sure. Change this multiple choice to a MaxDiff? You got it.
The market research client has very specific objectives and scientific needs. The market research respondent wants to offer feedback but wants an experience worth their time. This is a constant battle in market research. Who wins?
In case you're wondering, the survey respondent is the one in the fedora. Which, self-admittedly, based on the apparel choice alone, may deserve a glove to the face. But who is judging in market research? Our job is to be objective anyway.
How do the expectations between client and respondent differ? Sometimes the gap between client objectives and respondent expectations is small and other times the gap can be quite large.
What are some of the areas in market research which can create gaps in expectations? They include the following:
(1) Length of the survey
(2) Types of questions
(3) Incentive or rewards
(4) Qualifying criteria and quotas
Battle 1: Length of the Survey
What the client sponsor wants: A long survey to answer any and all objectives for the study.
What the respondent wants: A quick survey which is fun and engaging.
As a market research company, we admittedly cringe when we see a survey come across our radar which includes 50+ questions and takes more than 20 minutes to complete.
Heck, if even those of us who work in the industry and love market research cringe, can you imagine how the respondent feels when they receive this invite and read the expectations and estimated length of interview (LOI)?
We've spoken at length about this threshold on our market research blog, but our team aims to keep our surveys in the 3 to 5 minute range. After 5 minutes on a survey, we see less respondent engagement, shorter attention spans, and lower quality open-ends. In many cases the respondent just exits the survey completely after 5 minutes.
From the client perspective, you are likely thinking, "5 minutes? I cannot cover my objectives in a 5-minute survey!"
Not always true. A well-scripted survey focusing on core needs can easily be completed in 5 minutes. Our market research company does it every day. This buys you about 15 questions or so to address client needs.
That can be plenty for market research.
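The rough arithmetic behind that "about 15 questions" estimate is simple. As a sketch, assume a hypothetical average pace of around 20 seconds per question:

```python
# Rough arithmetic behind the ~15-question estimate for a 5-minute survey.
# The 20-seconds-per-question pace is an assumption, not a fixed rule.
survey_seconds = 5 * 60          # a 5-minute survey
seconds_per_question = 20        # hypothetical average pace
questions = survey_seconds // seconds_per_question
print(questions)  # 15
```

Complex grids or open-ends take longer than 20 seconds, so treat this as a ceiling, not a target.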
Be smart about prioritizing questions. Identify and keep the must-have questions while saving the nice-to-have questions for a future survey.
Eliminate the questions you will never use or are unnecessary. For example, is it crucial to cut the data by gender? By children in household? If not, eliminate those questions.
Combine questions where appropriate.
For example, asking Q1: Are you 18 years or older? and Q2: Which of the following best describes your age group? (18 to 24, 25 to 34, 35 to 44, etc.) with the goal of terminating those under the age of 18.
Instead ask, Q1: Which of the following best describes your age group? (Under 18, 18 to 24, 25 to 34, 35 to 44, etc.) In one question you accomplish the same goal and terminate those under the age of 18.
This style has many applications in market research by simply offering a "None of the above" category.
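The combined-question screener above can be sketched in a few lines of logic. This is a minimal illustration with hypothetical answer labels and a hypothetical `screen_age` helper, not any particular survey platform's API:

```python
# Sketch of a single-question age screener with a built-in terminate
# option, replacing a separate yes/no gate question. Labels are hypothetical.
AGE_GROUPS = ["Under 18", "18 to 24", "25 to 34", "35 to 44", "45 to 54", "55+"]

def screen_age(answer: str) -> bool:
    """Return True if the respondent qualifies (18 or older)."""
    if answer not in AGE_GROUPS:
        raise ValueError(f"Unexpected answer: {answer}")
    return answer != "Under 18"

# One question both captures the age group and terminates minors:
print(screen_age("25 to 34"))  # True  -> continue the survey
print(screen_age("Under 18"))  # False -> terminate politely
```

The same pattern works anywhere a "None of the above" category can double as the disqualifier.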
Battle 2: Types of Questions
What the client sponsor wants: Advanced question types (Conjoint, MaxDiff, etc.)
What the respondent wants: Non-repetitive questions that are easy and self-explanatory.
The foundation of market research is built on statistics. In order to obtain highly reliable and in-depth data, survey questions often follow specific models and lines of questioning to arrive at an insight.
Client needs include question types like conjoint, MaxDiff, semantic differential, and even pricing models like the Van Westendorp pricing sensitivity meter.
The Van Westen... what?
From a respondent standpoint, many of those scientific questions involve repetitive question sets that flat out make a survey boring. It's easy to snooze off during a forced-choice conjoint where the list goes on and on:
Do you prefer A or F?
C or E?
W or J?
Z or H?
M or O?
Another culprit is the long list of grid questions in a survey where respondents are asked to rate 20 rows of factors on a Likert scale of 1 to 5. These often cause straight-lining and speeding, and understandably so.
At a glance, it's overwhelming to the survey respondent.
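Straight-lining is also easy to flag after the fact. A minimal sketch of the kind of data-quality check a supplier might run on grid responses (the function name and the all-identical rule are our simplification; real checks often use stricter thresholds):

```python
# Minimal sketch: flag "straight-liners" who give the identical 1-5
# rating across every row of a grid question. The all-identical rule
# is a simplified, hypothetical threshold.
def is_straightliner(grid_ratings: list[int]) -> bool:
    """True when every row of the grid received the same rating."""
    return len(set(grid_ratings)) == 1

print(is_straightliner([3] * 20))         # True  -> review or remove
print(is_straightliner([3, 4, 2, 5, 3]))  # False -> looks engaged
```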
Know when to push, and when too much is too much.
We encourage our survey designers at Drive Research to change scales and ask questions in a different fashion throughout the survey.
Rather than including a long grid of row-by-row ratings, think about using other answer types like stars, slider scales, or click-and-drag functionality. Change is good.
Depending on the study, sometimes you have no choice other than to include a long series of Conjoint or a MaxDiff sequence.
It is important as a market research company to recognize exhaustion or fatigue points for the survey respondent and to let them know they are making progress (via notes or a progress bar).
Rewarding them for this additional time is also helpful, which leads us to...
Battle 3: Incentive or Rewards
What the client sponsor wants: Pay them as little as possible.
What the respondent wants: Earn as much as possible.
Among all of the mini-battles between the client sponsor and respondent, this one may not have an easy solution. There are many levels, opinions, and variables that go into determining rewards for a project.
A simple example of a variable that impacts payouts is geography and household income (HHI) in specific areas of the U.S. For example, $50 for participating in a survey interview goes a lot further in Syracuse, NY than 5 hours east in New York City.
From a client perspective, the request is a simple one:
How can we offer as little as possible to save on budget while still getting the number of respondents we need who offer high-quality data?
From a respondent perspective, the request is a simple one too:
If you want me to offer truly thoughtful and articulate responses to your survey, I need to make sure it is worth my time. Time is money!
Take all factors into account. Understand the geography where the survey is being conducted and rely on local market research companies to offer the going rates for rewards.
Try different options for rewards.
Instead of offering each survey respondent $1 for taking the survey on the panel, think about offering ten $100 gift cards to 10 lucky winners.
Which offer would motivate the respondent more?
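The budget math behind that trade-off is worth spelling out: on a hypothetical 1,000-person panel, both structures cost the same, and even the expected value per respondent is identical. The panel size here is our assumption for illustration:

```python
# Compare two reward structures on a hypothetical 1,000-person panel.
panel_size = 1_000

flat_cost = panel_size * 1   # $1 to every respondent
lottery_cost = 10 * 100      # ten $100 gift cards among 10 winners

print(flat_cost)     # 1000 -> $1,000 total, $1 guaranteed per respondent
print(lottery_cost)  # 1000 -> $1,000 total, $1 expected per respondent
```

Identical spend, identical expected value, but a $100 headline prize can feel far more motivating than a guaranteed dollar.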
Also, keep Battles 1 and 2 above in mind for the reward.
If your survey is 40 minutes long and includes several series of advanced analytics and pricing models about Styrofoam cups, reward your participants more than what they would expect after taking a quick 10 question survey about beer preferences.
How much is a response worth? It truly depends.
Battle 4: Quotas
What the client sponsor wants: A specific number of respondents, no more.
What the respondent wants: Just being given the opportunity to offer feedback.
How many of you have received a survey invite from a market research company saying something like, "Participate in this 10-minute survey about your recent experience with Brand ABC and we will give you a $50 gift card."
It almost seems too good to be true right?
All of those surveys where you get paid a few cents here or a dollar there are now worth it because of this golden invite you just received.
You receive this golden invite and your next steps might look something like this.
You get up from your desk at work to get a refill of coffee thinking about all of the things you want to share about Brand ABC.
Your mind is racing with the juiciest and in-depth open-ends imaginable. You are envisioning your survey response being framed and hung above the Customer Experience (CX) Director's desk at the Brand ABC offices.
You think that each time the CX Director walks in, she smirks at your survey, quietly whispering to herself, "Wow, that was one hell of a survey response."
You are snapped out of the dream by the Keurig loudly exhaling the last ounce of water into your mug. We all know that sound.
You retreat back to your desk, do a quick stretch of your shoulders, and crack your knuckles.
You are ready to go.
You slide your mouse and click on the bright blue "Get Started" link and...
A message pops up saying, "Thank you for your interest. We have reached the quota of responses required for this survey. Fieldwork is now closed."
Remember that ounce of water the Keurig exhaled into your mug just minutes ago?
Well, this is where the survey respondent exhales that ounce of coffee water onto their laptop screen in anger.
From a client perspective, we understand the reason behind quotas such as making samples representative or capping a budget. But why not consider a secondary option?
Although they may no longer qualify for the $50, how about offering a short open-ended comment box that gives respondents an opportunity to discuss the brand and their experience? Perhaps it yields some bonus data for the report.
Giving a vehicle for customers to offer feedback and then stripping it away is not good business. This is the anti-VoC (Voice of Customer) approach.
Manage sample invitations carefully so you understand likely response before the full send.
The worst situation is asking for feedback and then not giving the respondent an opportunity to voice it even though he or she was willing.
If you are handing out $50 gift cards like free candy to the first 100 respondents, do not send the invite to your entire database of 250,000 customers.
Try starting small to estimate a response rate and work from there. Pull a random 500 or so customers to start. This avoids a bad respondent experience.
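Pulling that pilot sample is a one-liner in most tools. A sketch of drawing a random 500-customer pilot from a 250,000-record database (the customer IDs and fixed seed are ours, for illustration):

```python
import random

# Sketch: draw a random pilot of 500 customers from the full database
# to estimate response rate before a wider send. IDs are hypothetical.
database = [f"customer_{i}" for i in range(250_000)]

random.seed(42)                   # reproducible draw for this example
pilot = random.sample(database, 500)  # sampling without replacement

print(len(pilot))  # 500
# If, say, 100 of the 500 respond, your estimated response rate is 20%,
# which tells you how many invites you need to fill 100 quota slots.
```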
Drive Research is a market research company located in Syracuse, NY. Our team works with small and large organizations all across the United States and the world.
Our consultative approach to market research examines our projects from all perspectives: our needs, our clients' needs, and respondents' needs, offering a blend that keeps everyone happy.
Interested in learning more about our market research services? Contact us.