Your online survey is written, programmed, and ready to go - but before sending it into fieldwork, have you properly tested it?
Just like there is a science behind survey writing, there is a method to the madness of online survey programming.
Simple mistakes can impact data quality and respondent experience, which means the decisions you make based on the results will suffer as well.
Take the time to thoroughly pre-test your survey - we promise you won't regret it!
The steps to successfully testing an online survey include:
- Check that all questions have been included
- Confirm the proper flow of the survey
- Check that all question formats are correct
- Ensure disqualification questions are routed correctly
- Ensure all routing is working as intended
- Re-check for spelling and grammatical errors
- Check formatting
- Review the disqualification message
- Add the survey thank you page
- Survey programmer testing
- Ask others to test
Why Online Survey Testing is Important
At Drive Research, we use online surveys as a way to collect data for clients that have all kinds of research objectives, broad or specific. Online surveys allow us to shed light on how consumers think and feel about current products and services in the market.
Survey research also allows us to see how consumers react to new or upgraded client offerings.
For example, we recently conducted a survey on a new banking product concept and sought to understand how appealing it was to consumers.
The way we design and present our surveys is very important to the outcome of a research study. If not done correctly, the data we receive from the survey could be low quality or even completely useless for clients.
To mitigate the risk of unreliable data, we never field a survey without first testing it from the perspective of a respondent.
This is perhaps the best way to catch any mistakes that might confuse respondents or negatively impact your data.
Check That All Questions Have Been Included
This first step might sound too easy and obvious, but even experienced market research professionals can make a simple mistake (especially when the survey is lengthy).
To complete this step of the survey testing process, scroll through your online survey software with the survey document open and check that all questions have been included and are in the correct order.
Now is also a good time to look at word choice.
Word choice is something we aim to optimize during the survey design phase of a research study, but we’d argue it should still be top of mind while testing a survey in the programming phase.
You want to make sure the questions are easy to understand but also specific to what type of information you are seeking.
The questions don’t really serve a purpose if they aren’t worded well enough to help meet the research objectives. Use language and terminology that directly relates to the goals at hand.
In other instances, you may be testing a survey and see a question in a brand-new way.
Don’t be afraid to substitute a word or phrase that makes the question more concise or clear for the respondent (unless you are working with a final, client-approved survey).
Having good word choice will help respondents interpret each question as intended and ensure the right information is collected.
Confirm the Proper Flow of the Survey
Testing the flow of a survey is another key component of the programming phase. Sometimes it is not until we run through the survey like a respondent that we discover an issue.
For one, it is imperative that survey testers check the order of questions. Survey questions should flow in a natural way where follow-up questions are built on previous questions, or similar questions are grouped together in the survey.
These checks also go hand-in-hand with testing out the programming logic throughout the survey.
Proper survey documents will spell out where logic exists in the survey. The survey document can be used as a reference point to confirm termination points, skip logic, display logic, or piping logic.
If something in the programming doesn’t match the survey document, be sure to investigate why there is a discrepancy.
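One way to make this comparison systematic is to sketch both sources of truth as data and diff them. The structure below is purely illustrative (real survey platforms store logic differently): each source is a dict mapping a question ID to its answer-to-destination routes.

```python
# Hypothetical sketch: compare the routing spelled out in the survey
# document against what was actually programmed. Question IDs, answers,
# and destinations below are invented for illustration.

def find_routing_discrepancies(document_logic, programmed_logic):
    """Return (question, answer) pairs where the programmed survey
    disagrees with the survey document."""
    mismatches = []
    for question, routes in document_logic.items():
        programmed = programmed_logic.get(question, {})
        for answer, target in routes.items():
            if programmed.get(answer) != target:
                mismatches.append((question, answer))
    return mismatches

# Example: the document says "No" on Q2 should disqualify, but the
# programmed survey routes "No" to Q3 instead.
document_logic = {"Q2": {"Yes": "Q3", "No": "DISQUALIFY"}}
programmed_logic = {"Q2": {"Yes": "Q3", "No": "Q3"}}  # programming error

print(find_routing_discrepancies(document_logic, programmed_logic))
# → [('Q2', 'No')]
```

Any pair this flags is exactly the kind of discrepancy worth investigating before fieldwork.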
From a navigation perspective, respondents should be able to easily move from screen to screen until they are finished with the survey.
Testers must verify there are no points at which a respondent might get stuck and exit the survey out of frustration. This could come in the form of a technical error or a poorly designed question.
Aside from a worthwhile incentive, a smooth survey-taking experience is the best way to support a high conversion rate on survey completions.
Check That All Question Formats Are Correct
Since mistakes can happen, it's always best to test your online survey with your survey document open a second time to ensure all question formats are accurate.
This means ensuring single-response questions allow respondents to select only one response, and that multiple-response questions allow respondents to select more than one.
As discussed previously, data accuracy is key. Making a mistake here can dramatically change your results, so while this step might sound easy, it's also critical.
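The single-versus-multiple distinction boils down to a simple validity rule. This sketch is hypothetical (the type names are invented), but it captures what a format check verifies:

```python
# Hypothetical sketch of a question-format check: a single-response
# question must accept exactly one selection, a multiple-response
# question at least one. The type names "single"/"multiple" are
# invented for illustration.

def selections_are_valid(question_type, selections):
    if question_type == "single":
        return len(selections) == 1
    if question_type == "multiple":
        return len(selections) >= 1
    raise ValueError(f"unknown question type: {question_type}")

assert selections_are_valid("single", ["Brand A"])
assert not selections_are_valid("single", ["Brand A", "Brand B"])
assert selections_are_valid("multiple", ["Brand A", "Brand B"])
assert not selections_are_valid("multiple", [])
```

A single-response question programmed as multiple-response would silently pass the second case above, which is why testing each format against both kinds of input matters.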
Ensure Disqualification Questions Are Routed Correctly
Want to disqualify or screen people out of your online survey? Make sure that survey programming is working correctly.
When testing your online survey, these questions should require a response and should route accordingly.
Routing from disqualification questions can mean sending respondents to your sweepstakes entry page, to a disqualification message, or to the online survey end page.
The type of criteria you are screening for will determine which option is chosen.
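To make that concrete, screener routing can be thought of as a small decision function mapping screener answers to one of the destinations above. The criteria and destination names here are invented for illustration; a real survey defines its own screening rules and end pages.

```python
# Hypothetical screener: adults who are current customers qualify.
# Everyone else is routed to one of the destinations described above.

def route_respondent(age, is_current_customer):
    """Return where a respondent is sent after the screener."""
    if age < 18:
        return "disqualification message"  # fails the age screen
    if not is_current_customer:
        return "survey end page"           # not in the target audience
    return "continue to survey"

assert route_respondent(17, True) == "disqualification message"
assert route_respondent(35, False) == "survey end page"
assert route_respondent(35, True) == "continue to survey"
```

Testing means walking through each screener answer combination, as above, and confirming the live survey sends you to the same place the logic says it should.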
Ensure All Routing Is Working as Intended
This is a big piece of online survey testing and feeds into a later step as well. For the purposes of this step, review all programmed survey routing question by question.
To clarify, this refers to the actual online survey program coding. Oftentimes, market research professionals will notice mistakes here before actually testing the survey online.
During this step, also check that question settings like response randomization and inverse ordering are accurate.
Re-Check for Spelling and Grammatical Errors
This is another quick and easy step. After programming the online survey, re-read all questions and answers to ensure there are no spelling or grammatical errors.
It'd be a drag to pause fieldwork to make a simple fix like this!
Check Formatting
Another important survey testing tip is to watch out for inconsistent elements of the survey, like scales or phrasing, that might affect the data. Minimizing these inconsistencies goes a long way.
For example, scales for similar questions should have consistent labels at each end. You wouldn’t want a survey that features a satisfaction scale that goes from “Not at all satisfied” to “Very satisfied” and another that goes from “Not at all satisfied” to “Extremely satisfied.”
You also want to make sure the scale labels match what the question is asking (i.e., not measuring satisfaction with an importance scale). It may seem obvious, but programming mistakes can happen to even the best researchers!
The presentation of answer options should also match throughout the survey when relevant.
Take the standard “Other” option, for example.
Small differences like “Other (Please specify)” and “Other – please specify” have the potential to take the respondent out of the experience. References to company brands and products should also match throughout.
When we take the time to check for consistency, the survey will be clearer for respondents and express a more professional appearance on behalf of the client.
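This kind of consistency check can even be partially automated: pull every answer option out of the programmed survey and flag variants of the same label. The data structure below is an invented stand-in for however your survey platform exports options.

```python
# Hypothetical sketch: flag inconsistent variants of a recurring answer
# option (such as "Other") across questions so they can be standardized.
from collections import defaultdict

def find_inconsistent_labels(questions, keyword="other"):
    """questions maps question ID -> list of answer-option labels.
    Returns the keyword with its variants if more than one exists."""
    variants = defaultdict(set)
    for qid, options in questions.items():
        for option in options:
            if keyword in option.lower():
                variants[keyword].add(option)
    return {k: sorted(v) for k, v in variants.items() if len(v) > 1}

questions = {
    "Q4": ["Red", "Blue", "Other (Please specify)"],
    "Q7": ["Cat", "Dog", "Other – please specify"],
}
print(find_inconsistent_labels(questions))
# flags both "Other" variants so one wording can be chosen
```

Once the variants are listed, pick one wording and apply it everywhere.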
In this step of testing your survey, also check the font, font size, bolding, use of colors, buttons (previous/next/submit), progress bars, logos, and background images.
Review the Disqualification Message
Have disqualification or screener questions? Want to let those respondents know they have been disqualified? Here's your reminder to craft that message.
Usually, these messages are short and do not allow the option for respondents to review their answers to the prior question.
Add the Survey Thank You Page
Quick reminder! Before you start testing the online version of your survey, make sure to add the survey end page (AKA the survey thank you page).
When programming a survey this is typically one of the final steps.
Survey Programmer Testing
It's now time for the moment we have all been waiting for... testing the survey online! This is likely one of the most significant time commitments on this list of survey testing steps.
Here is where you will want to ensure all the previous steps are 100% complete and accurate.
At this stage, there will likely be some sort of edit needed, whether it has to do with routing, spelling/grammar, disqualification questions, or formatting. Survey programming can be complicated, so it's important to take your time and not rush the process.
Approach your survey testing like there is one error in the version and it is your job to find it. With this mindset, you'll pay close attention to all of the details.
Watch our video for more tips on programming your survey.
Ask Others to Test
Think you programmed your online survey perfectly? Great!
Now pass the online survey link along to a colleague with the survey document and see if they have any edits, comments, or suggestions. This final step is for accuracy and peace of mind before unleashing your online survey into the world wide web!
Drive Research is a market research company located in Syracuse, NY. Our team of pro researchers can help you with your next online survey project, including research design and survey programming. We have a designated internal team who tests every online survey for our clients.
Interested in learning more about our market research services? Reach out through any of the four ways below.
- Message us on our website
- Email us at [email protected]
- Call us at 888-725-DATA
- Text us at 315-303-2040
As a Senior Research Analyst, Emily is approaching a decade of experience in the market research industry and loves to challenge the status quo. She is a certified VoC professional with a passion for storytelling.
Learn more about Emily, here.