We Used AI to Write a Survey - Here's What Happened


As AI writing tools continue to evolve, our market research company set out to complete an intriguing experiment: using artificial intelligence to write a survey from scratch.

The goal was to witness firsthand the potential of AI in shaping the very tools we use to gather insights. Without spoiling too much, there were clear drawbacks and limitations, but a few shining moments.

Here are my thoughts.


Example AI-Written Survey

Using AI language models to write surveys for our clients at Drive Research is not something we do, but with all this talk of ChatGPT and other tools, it had me thinking...

What kind of customer survey questions would AI write if I asked it to?

So, I typed the following prompt into ChatGPT-4:

Write a customer survey for a manufacturing company on behalf of Drive Research

And, here’s what happened…

1. How satisfied are you with the overall quality of our product/service?

Right off the bat, there’s nothing inherently wrong with this question (and this goes for a few of the questions below too). The bigger issue with this question is the survey flow and formatting.

While quality might be a major key performance indicator (KPI) for the organization, you can get so much more out of effectively written grid questions that ask about importance and satisfaction levels separately. 

Combining these two factors (i.e., importance and satisfaction) with the different KPIs the organization would like to test will enhance the insights gathered from the survey as well as the survey experience for respondents. 


2. How likely are you to recommend our product/service to a friend or colleague?

Net promoter score (NPS) is a common metric used in customer satisfaction surveys. It’s certainly a good fit here, but it’s missing the scale (which is very important). 

The NPS scale ranges from 0 to 10, where 0 is “not at all likely” and 10 is “extremely likely”. Using the correct scale allows an organization to benchmark its score against others.


The bigger problems I have with this question are that...

  1. Net promoter score should be asked first, before the specific factors discussed in question 1.
  2. It's missing a follow-up question asking respondents to explain why they chose their score.

A follow-up question is a huge opportunity to understand how to move those with lower ratings up the scale to become raving fans of your brand.
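For reference, the NPS metric itself is calculated from that 0–10 scale: respondents scoring 9–10 are promoters, 7–8 are passives, and 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch of that calculation (the ratings here are made up for illustration):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 ratings.

    Promoters score 9-10 and detractors score 0-6; passives (7-8)
    count toward the total but neither add nor subtract.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical batch: 5 promoters, 3 passives, 2 detractors
ratings = [10, 9, 9, 10, 9, 8, 7, 7, 5, 3]
print(nps(ratings))  # (5 - 2) / 10 -> 30
```

This is also why the exact 0–10 scale matters: shifting to a 1–5 or 1–7 scale breaks the promoter/passive/detractor cutoffs and makes the score incomparable to published benchmarks.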


3. How well did our product/service meet your expectations?

Similar to the first question, this question could work well in a grid format. Consider including other customer touch points like the bidding/purchase process, sales representatives, account managers, etc. 


4. How responsive have our customer support representatives been to your inquiries or concerns?

This question works well, but it is also a good example of how slight changes in phrasing can enhance the respondent experience.

Moreover, if question 3 is set up in a grid format with various customer touchpoints, customer support representatives could be part of that grid.


5. How would you rate the ease of use/navigation of our website or app?

This is another example of an opportunity to use a grid question more effectively. For example, ease of use/navigation of the website could easily be added to the question set referred to in question 1.

One note about this question is to be careful to avoid double-barreled questions, which means asking about two different factors within one question. 

This question should ask about either ease of use or navigation, and about either the website or the app. Alternatively, the organization could ask about the website and app separately.


6. How often do you engage with our brand or use our product/service?

This question does not feel particularly helpful. Oftentimes, this is something the organization already knows from tracking past purchase habits. 

If this question asked about competitors and then asked about usage or engagement with the sponsoring organization’s brand as well as competitors, then that would add a lot more value to the analysis/insights.


7. How satisfied are you with the pricing of our product/service?

Again, this question could easily be part of the grid question set discussed in question 1. Think about ways to streamline questions to customers and how the results are going to be used. 

When writing a survey, it is important to keep it short. Our online survey company recommends asking no more than 20 questions. 

Respondent fatigue is real. 


8. How likely are you to continue using our product/service in the future?

This question works well, but I would consider being more specific about the time frame (if it makes sense for the industry/organization).

For example, "How likely are you to use our product/service in the next three months?"


9. How knowledgeable and helpful did you find our sales representatives?

Red flag alert! This question is double-barreled. To ensure high data quality, choose one factor (i.e., knowledgeable or helpful). Asking about both will confuse respondents, resulting in poor, unusable data.

Also, this is another example of how a question could be part of a grid to make the survey feel faster to respondents.


10. How well does our product/service address your specific needs or requirements?

This question is fine, but I would recommend adding a follow-up question to respondents who are not fully satisfied to understand what specifically could be improved.


11. How would you rate the overall reliability and performance of our product/service?

Another red flag alert! This is another double-barreled question. To ensure high data quality, choose one factor (i.e., reliability or performance).


12. How likely are you to consider purchasing additional products/services from us in the future?

This question is fine, but there are a few enhancements to consider here. 

First, consider being more specific about the time frame (if it makes sense for your industry/organization). 

Second, this question feels similar to question 8. Consider choosing the one that best fits your research objectives.


13. How effectively did our product/service solve the problem or meet the desired outcome for you?

This question feels similar to question 10. I would recommend choosing the one that best meets the research goals/objectives.


14. Is there any specific feature or improvement you would like to see in our product/service?

I love ending customer satisfaction surveys with an open-ended question like this. 

To be as pointed as possible I recommend phrasing this question as, “If [insert organization] focused on doing one thing better to improve your satisfaction as a customer, what would it be? Enter your response below. Please note: You can skip this question if you are unsure.”

Remember to make this question optional. Some customers may genuinely be unsure.



Why You Shouldn't Use AI to Write Surveys

While AI can play a role in assisting with survey design, it's much better to rely on an online survey company with ~real~ human experience to ensure that the survey questions are well-crafted, unbiased, and tailored to the unique goals of the survey.

Here are a few reasons to consider avoiding AI survey writing tools:

  • Lack of Contextual Understanding: AI lacks the nuanced understanding of human context and emotions that can be crucial in crafting effective survey questions. Human empathy and intuition are often necessary to frame questions in a way that resonates with respondents.

  • Increased Bias: AI models are trained on existing data, which may inadvertently introduce biases. When creating surveys, it's crucial to ensure fairness and avoid biased questions or options, which requires a human's critical thinking.

  • Complex Subject Matter: Surveys dealing with intricate or domain-specific topics might require expertise that AI lacks. Human researchers can better comprehend and communicate the specialized subject matter.

  • Engagement and Response Rates: Human-created surveys often engage respondents better due to the human touch. Engaged participants are more likely to provide meaningful responses, which ultimately enhances the survey's value. For additional information, I recommend reading our blog post: How to Make Online Surveys More Engaging for Survey Respondents.

  • Little Personalization: When conducting surveys, your team has unique goals and objectives – many of which cannot be addressed by the generic questions found in the secondary resources from which AI tools pull.

  • Little Experience Designing Surveys: There is more to a survey than just asking questions. Factors like survey length, question phrasing, and quality controls can largely impact data quality – and these are factors that AI cannot account for.

Mistakes to Avoid with AI Survey Writing

Here are a few things to keep in mind when creating a customer survey, whether it be with AI tools or not:

  • Do not forget about the question flow. Start with the broader, overarching questions first and then move on to more specific questions.

  • Compare feedback across different customer segments. Consider ways to append firmographic information (i.e., date of first purchase, total spend, annual spend, industry, region) to the data so you can understand similarities/differences among those who are more/less satisfied (i.e., pinpoint success and opportunity areas).
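To illustrate what appending firmographic data might look like in practice, survey responses can be joined to customer records on a shared customer ID before analysis (all field names and values below are hypothetical):

```python
# Hypothetical example: append firmographic fields to survey responses
# by customer ID so satisfaction can be compared across segments.
responses = [
    {"customer_id": 1, "satisfaction": 9},
    {"customer_id": 2, "satisfaction": 4},
]
firmographics = {
    1: {"industry": "Manufacturing", "annual_spend": 120_000},
    2: {"industry": "Retail", "annual_spend": 15_000},
}

# Merge each response with its matching firmographic record;
# customers with no record on file simply keep their survey fields.
appended = [
    {**r, **firmographics.get(r["customer_id"], {})} for r in responses
]
print(appended[0]["industry"])  # Manufacturing
```

With the segments attached, satisfaction scores can be grouped by industry, spend tier, or region to pinpoint success and opportunity areas.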

  • Don't go overboard on open-ended questions. Limit open-ended questions to no more than ~10% of the survey questions. These questions take considerably more effort to answer than closed-ended questions. In other words, if you ask 10 questions, only 1 should be open-ended.

  • Rewards matter. Consider using incentives as a thank you for responding to the survey. Without them, surveys tend to attract only the happiest and unhappiest customers. Offering incentives for participation can alleviate this and increase the response rate. For additional information, I recommend reading our blog post: You Should Offer Rewards for Your Market Research - Here's Why.

  • Think about other survey add-ons. For example, you could route respondents to your review page to boost reviews, ask if customers would like a follow-up from a representative, etc. 

Contact Our Online Survey Company

Drive Research is a full-service market research company, specializing in online survey design, programming, fieldwork, and analysis. Our team has a combined 80+ years of experience in the market research space, learning and adapting to new techniques as we go.

Need help with your online survey project? Contact our team today.

  1. Message us on our website
  2. Email us at [email protected]
  3. Call us at 888-725-DATA
  4. Text us at 315-303-2040


Emily Taylor

As a Research Manager, Emily is approaching a decade of experience in the market research industry and loves to challenge the status quo. She is a certified VoC professional with a passion for storytelling.

Learn more about Emily, here.

