
Accelerate Survey Creation within MR

Updated: Oct 6, 2021

Mapping out the survey creation journey

A 3-day design sprint was held in our London office with the goal of improving survey creation efficiency for the market research line of business. Customers, stakeholders from the customer support team, R&D team members, and product managers were invited to participate.


On the first day of the design sprint, we conducted customer and stakeholder interviews to map out the survey creation journey from scratch to a fully grown project.

Based on the input we gained from the interviews, and together with the three user types we have for MR (client, researcher and programmer), I mapped out the survey creation journey and the interaction between all three personas along the way.


Key findings:

1. The questionnaire design phase frequently involves heavy collaboration between the client and the researcher. They communicate over email, circulating a Microsoft Word™ document many times over as each party provides their input.


2. Once both the client and the researcher give final approval, the survey programmer needs to translate the Word questionnaire into an online survey. Another back-and-forth follows, with the researcher and the programmer discussing how the survey should be built.


3. When the survey is built in Survey Designer and ready for review, the programmer sends a link to the researcher asking for feedback. Yet another back-and-forth takes place between all three parties discussing changes to the survey.


The long-term goal is twofold: accelerate online survey creation for the programmer by eliminating the arduous process of manually inputting survey questions, and improve communication at the survey review stage by providing an easier way for each party to give input or feedback on the survey.


 

Accelerate Survey Creation with AI


Since the survey programmer is the user of Survey Designer, we’ve defined a user story targeting the survey programmer.


As a survey programmer working at a market research agency, I build hundreds of surveys every month.

I would like to eliminate the arduous process of manually creating and inputting survey questions.

So that I can spend more time collecting and analyzing data.


The idea we came up with at the design sprint was to use artificial intelligence to translate survey questionnaires directly from Microsoft Word into Survey Designer. Importing a Word questionnaire automatically transforms it into an online survey by converting each piece of content into system-recognisable survey elements.




On the last day of the design sprint we produced a workflow and a working prototype, and we managed to conduct usability tests with customers and internal stakeholders. Based on the feedback we collected, we broke the workflow down into six steps and came up with UX requirements for each step.

The workflow of a survey programmer breaks down into six steps:
  1. Prepare the final version of the Word document and the information needed for building a survey

  2. Import the Word document into Survey Designer

  3. Review the output of the AI

  4. Adjust the output of the AI; every adjustment should be reversible

  5. Approve the questions by adding them into survey editor

  6. Further customise the questions using the survey editor

Understanding the questionnaire format and planning the incremental development of both the AI and the UI

To understand how our customers construct their questionnaires in Word, we collected a set of questionnaires from them to study the format and patterns. We also used these questionnaires as samples to train the AI model.

At this early research phase, I worked closely with back-end developers to find the common pattern in the questionnaires and to understand how the AI model recognises that pattern.

By studying the format of the sample surveys, we identified a pattern commonly used by our customers:

  • Each question starts with question text

  • Question text begins with a question id followed by a dot, for example q1. How old are you?

  • Question instructions are often placed after the question text as a separate paragraph

  • Answer options are normally formatted as a bullet list and placed after the question instruction

  • Some customers format grid questions as a table
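The pattern above can be expressed as a simple rule-based classifier. The sketch below is purely illustrative — the regex, element names, and bullet markers are assumptions for this example, not the actual AI model:

```python
import re

# Matches a question id followed by a dot, e.g. "q1. How old are you?"
QUESTION_ID = re.compile(r"^(q\d+)\.\s*(.+)", re.IGNORECASE)

def classify_paragraph(text, previous):
    """Guess which survey element a Word paragraph represents,
    following the common pattern found in customer questionnaires."""
    stripped = text.strip()
    match = QUESTION_ID.match(stripped)
    if match:
        return ("question", match.group(1), match.group(2))
    if stripped.startswith(("•", "-", "*")):
        # Bullet lists placed after a question are answer options
        return ("answer", None, stripped.lstrip("•-* "))
    if previous and previous[0] == "question":
        # A plain paragraph right after the question text is an instruction
        return ("instruction", None, stripped)
    return ("unknown", None, stripped)

paragraphs = [
    "q1. How old are you?",
    "Please select one option.",
    "• Under 18",
    "• 18-34",
]
elements, prev = [], None
for p in paragraphs:
    prev = classify_paragraph(p, prev)
    elements.append(prev)
```

A real model would of course be trained on the sample questionnaires rather than hard-coded, but the rules mirror the pattern described above.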

Phase 1: an early solution with minimal AI capability

Based on the common pattern we discovered, we developed a working prototype with minimal AI capability. At this stage the AI can only detect the 'hard returns' in the Word document and transform each paragraph into an 'Unknown' element in the UI. It cannot recognise any of the survey elements; users have to manually convert the 'Unknown' items into survey elements.
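In Phase 1 terms, the import step amounts to little more than splitting the document on hard returns. A minimal sketch of that behaviour (the dict shape and 'Unknown' label are assumptions for illustration):

```python
def import_word_text(raw_text):
    """Phase 1 behaviour: split on hard returns and wrap each
    non-empty paragraph as an 'Unknown' element for manual review."""
    paragraphs = [p.strip() for p in raw_text.split("\n") if p.strip()]
    return [{"type": "Unknown", "text": p} for p in paragraphs]

items = import_word_text(
    "q1. How old are you?\nPlease select one option.\n\n• Under 18"
)
# Every item starts life as 'Unknown'; users convert them manually.
```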

Through usability tests on the early prototype, we collected a lot of feedback and came up with a list of interaction requirements for further UX improvement:

  • The access points of the AI Word Importer should be exposed both at survey creation and inside the survey editor, allowing users to pull questions from a Word questionnaire into an existing Survey Designer project. Onboarding is necessary to help users get started with the tool. Considering the processing speed of the AI and the size of the Word document, a waiting/loading indication is necessary.


  • Since questionnaires normally contain many paragraphs, the screen containing the imported questions should be relatively large. While the user is reviewing the output of the AI, it should be visually clear that the questionnaire is broken into elements by paragraph, and the UI should make clear which elements are recognised and which are not, so users know where more attention is required.


  • Users should be allowed to adjust the output of the AI. All available options for each element should be exposed to users, with understandable tooltips. Users should be free to try all the actions to learn the behaviour, and every action performed on an element should be reversible.


  • When approving the adjusted questions by adding them to the survey editor, it should be visually clear which questions have been added, and how to update an added question if the user wants to customise it further.


  • It should be clear to users which settings can be adjusted while reviewing questions in the AI Word Importer, and which customisations have to be done in the survey editor after a question is added.
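The reversibility requirement above maps naturally onto an undo stack. The sketch below is a minimal illustration under assumed names, not the actual Survey Designer implementation:

```python
class ElementAdjuster:
    """Tracks adjustments to an imported element so that every
    action is reversible -- a minimal undo-stack sketch."""

    def __init__(self, element):
        self.element = element
        self._history = []

    def set_type(self, new_type):
        # Snapshot the element before changing it, so undo can restore it
        self._history.append(dict(self.element))
        self.element["type"] = new_type

    def undo(self):
        if self._history:
            self.element = self._history.pop()
        return self.element

adj = ElementAdjuster({"type": "Unknown", "text": "q1. How old are you?"})
adj.set_type("SingleChoice")
adj.undo()  # element is back to 'Unknown'
```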

Phase 2: Considerable AI capability + question tagging

We are now in phase 2 of development, where the AI is capable of detecting several popular question types. We have also added support for tagging questions in the Word questionnaire before importing, for better accuracy.
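Conceptually, a tag gives the importer an explicit type hint so it does not have to guess. The bracketed tag syntax and type names below are purely illustrative assumptions, not the product's actual tagging format:

```python
import re

# Hypothetical tag syntax: a bracketed type hint before the question,
# e.g. "[single] q1. How old are you?" (illustrative only)
TAGGED = re.compile(r"^\[(single|multi|grid|open)\]\s*(.+)", re.IGNORECASE)

def detect_type(paragraph):
    """Return (question_type, remaining_text); fall back to 'unknown'
    when no tag is present and the AI must classify on its own."""
    m = TAGGED.match(paragraph.strip())
    if m:
        return m.group(1).lower(), m.group(2)
    return "unknown", paragraph.strip()

qtype, text = detect_type("[single] q1. How old are you?")
```

Whatever the real syntax, the benefit is the same: a tagged question bypasses the model's guess entirely, which is why tagging improves accuracy.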


 

User onboarding


The AI is not magic, and it has its limitations. To calibrate users' expectations about the functionality and output, it's important to communicate up front what the tool can and cannot do.


The Getting Started Guide was designed to help users onboard with the AI Word Importer.


In addition, the Getting Started Guide includes an example questionnaire for users to download. It provides questions for a test run of the tool, as well as examples showing how to improve accuracy by tagging the questionnaire before uploading.

