Website intercept surveys do an excellent job of collecting feedback from site users in the moment. This feedback can be invaluable for understanding user journeys and paths, identifying points of satisfaction and dissatisfaction, and providing context for bounces. Website surveys can be set up in a variety of ways.
This case study outlines a recent pilot study that involved both a site survey and an employee survey, each collecting data on key objectives for the client. Read more below.
Measuring website user feedback can be tricky. Using a website satisfaction survey company like Drive Research can help.
Drive Research partnered with a marketing strategy company to conduct market research for a motor vehicles website. The site received over 185 million visitors last year, which equates to over 500,000 visitors per day. The market research study provided OnlineGURU with answers to the client's core objectives.
The core objectives for the client were: (1) customer satisfaction with the website, (2) information accuracy among end-users, (3) feedback on advertisements on the site, and (4) employee engagement. The market research helped the client better understand the user experience, behavior, expectations, and perception of the website in addition to employee satisfaction.
The market research measured the site and provided the client with a scorecard of metrics. This context will help drive improvements to the site experience, marketing, and strategy for the client. This wave served as a pilot for potential ongoing surveys conducted on a quarterly basis.
To address the objectives at hand, Drive Research recommended the following market research approach. Drive Research initially recommended a combination of online surveys: (1) a survey sent to a contact database of users, (2) a pop-up survey on the website to catch users in real time, and (3) an email survey to employees. Used in combination, these would address all primary and secondary objectives with both Voice of Customer (VoC) and Voice of Employee (VoE) data.
Due to the constrained timeline and the coordination with third parties needed to launch the email survey to the contact database of users, it was agreed this component would be delayed until future waves. The timeline for the project was expected to be 6 to 7 weeks from kickoff through report. This pilot study was completed in 3 weeks to meet the deadline.
Drive Research managed a website survey of real-time site users. The surveys were completed in a modal pop-up window on-site. This approach helps avoid pop-up blockers because the modal uses code similar to a login and passcode pop-up rendered by the site itself.
The sampling rate for the pop-up survey was set to 1:10, meaning the survey pop-up was shown to users on every 10th page visited (on average). Once the survey was shown, a cookie was dropped so the same user would not see or take the survey multiple times. The rate was set to 1:10 rather than a more frequent 1:1 or 1:2 to ensure we captured a cross-section of respondents with a variety of depth and paths on the site. A 1:1 or 1:2 rate would have produced quicker returns, but it also would have captured a higher percentage of visitors who bounced after viewing only 1 or 2 pages.
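The 1:10 sampling and cookie suppression described above can be sketched in a few lines. This is a hypothetical illustration, not the survey platform's actual code; the function names, the `survey_shown` cookie name, and the dictionary-based cookie jar are all assumptions made for the example.

```python
import random

SAMPLING_RATE = 10  # show the survey invite on roughly 1 in 10 page views


def should_show_survey(cookies: dict) -> bool:
    """Decide whether to show the intercept survey on this page view."""
    # A cookie dropped after the first impression suppresses repeat invites,
    # so the same user never sees or takes the survey twice.
    if cookies.get("survey_shown") == "1":
        return False
    # On average, 1 in every SAMPLING_RATE page views triggers the pop-up.
    return random.randint(1, SAMPLING_RATE) == 1


def record_impression(cookies: dict) -> None:
    """Drop the suppression cookie once the survey has been shown."""
    cookies["survey_shown"] = "1"
```

A real deployment would set the cookie in an HTTP response (or via client-side script) rather than a plain dictionary, but the selection logic is the same.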
The survey took users an average of 1 minute to complete and included up to 8 questions. Several open-ended (free text) questions were shown only to specific respondents depending on their answers to prior questions. The minimum goal was to collect at least 400 responses. The survey received 1,274 responses over the 24-hour fieldwork period, including 687 full responses.
Fieldwork was conducted from Tuesday, May 1 at 11:00 a.m. PT through Wednesday, May 2 at 6:00 a.m. PT. A sample of this size produced a margin of error of +/- 3.7% which makes the results highly statistically reliable.
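The +/- 3.7% figure is consistent with the standard worst-case margin of error at a 95% confidence level for the 687 full responses, assuming a very large visitor population. A quick check of the arithmetic:

```python
import math


def margin_of_error(n: int, z: float = 1.96, p: float = 0.5) -> float:
    """Worst-case margin of error at 95% confidence for n responses.

    p = 0.5 maximizes p * (1 - p), giving the most conservative estimate;
    z = 1.96 is the critical value for a 95% confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)


# 687 full responses from a very large visitor population
print(round(margin_of_error(687) * 100, 1))  # → 3.7
```

The minimum goal of 400 responses would have produced a margin of error of about +/- 4.9% under the same assumptions.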
Drive Research also managed an employee engagement survey sent to all contacts. The surveys were sent through an email invitation from Drive Research to ensure confidentiality and anonymity. Pre-awareness and credibility for the survey were built through an email sent to staff by the President prior to fieldwork, as well as continual reminders during morning huddles.
The first invitation for the email survey was sent on Tuesday, May 1. A reminder was sent on Thursday, May 3. A final generic link was sent to all non-responding employees on Friday, May 4. The deadline to complete the employee survey was Monday, May 7 at 5:00 p.m. PT.
The survey took users an average of 5 to 8 minutes to complete and included 34 questions. The survey received 27 responses over the week of fieldwork from the total of 36 contacts. This represents an incredibly strong response rate of 75%.
No questions forced responses, meaning employees could skip any question they chose. No demographics or categorization criteria were collected in the process. A sample of this size produced a margin of error of +/- 9.6%, which makes the results reliable.
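Because the 27 respondents represent a large share of a small population (36 contacts), the reported +/- 9.6% is consistent with applying the finite population correction to the standard 95% confidence formula. A sketch of that calculation, assuming p = 0.5 for the most conservative estimate:

```python
import math


def margin_of_error_fpc(n: int, population: int,
                        z: float = 1.96, p: float = 0.5) -> float:
    """Margin of error at 95% confidence with the finite population
    correction, appropriate when the sample is a large fraction of a
    small population (here, 27 of 36 employees)."""
    base = z * math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return base * fpc


# 27 employee responses out of 36 total contacts
print(round(margin_of_error_fpc(27, 36) * 100, 1))  # → 9.6
```

Without the correction, 27 responses would imply roughly +/- 18.9%; the correction tightens the estimate because the sample covers three quarters of the population.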
Unfortunately for you, the results remain confidential with the client. However, the surveys covered a variety of topics to address the client's objectives. Deliverables included a full report with an executive summary, recommendations, an infographic, and a question-by-question breakdown of results from both components.
Site survey and employee survey results were benchmarked against industry peers to provide quartiles and context. This helped Drive Research and the client understand whether otherwise arbitrary scores were good or bad.
Drive Research is a website satisfaction survey company and website feedback survey company located in Upstate New York. We work with a variety of clients across multiple industries to design, implement, and report on feedback collected.
Contact our team below: