Survey Builder Redesign

Designing a new survey building experience by creating order out of a messy system of tools, and laying a UX architecture in place for future features.

Project Overview

The goal of this project was to redesign the Survey Builder tool to be more usable, intuitive, and accessible, so that all users could take advantage of its deep and powerful feature set.
My Role
Lead UX Designer: I led the end-to-end design process, and fostered strong collaboration between engineering and UX research. I created systems for efficient delivery, as well as systems for intuitive interactions. I designed empathetic user transitions, and introduced data analytics to measure success and guide improvements.

Our experience gaps

Every product has an iconic, primary tool or user interface: Facebook has News Feed, Google Search has the Results Page, and Qualtrics has the Survey Builder. It was a tool adored for its robust features, but equally infamous for its cluttered interface.

Our users, mostly data professionals and researchers, carried the cognitive burden of learning how to navigate this maze of functionalities. They needed to perform complex survey building tasks but were losing their way in the labyrinth of inconsistent and overly complex controls. Imagine trying to find a needle in a haystack, but the needle keeps changing its position.

The old Survey Builder experiences

I started by synthesizing all known UX research in this space into actionable recommendations for our team.  My summary confirmed:

  • The survey editor was difficult to navigate and not intuitive for first-time users
  • Users felt overwhelmed when first logging in
  • Users needed to watch tutorial videos or read support pages to get started on their projects
  • The editor was inaccessible to users who relied on assistive technology
There were a lot of places on my first time through where I was confused by what to do...it felt like I was having to make a lot of guesses
Research Suite UX Baseline Q4 '16 | Academic participant

Understanding the problem space

Constraints and challenges

We weren’t building a product from scratch, so any solutions needed to be feasible within the existing code structure, and they required extra consideration since this was Qualtrics’ first major change-management effort on a business-critical UI. We had to balance the discoverability needs of novice users with efficiency and feature parity for advanced users who relied on even the most obscure of features. Feature parity was essential: any new design had to support all existing functionality.

This was also the first large-scale redesign of a product area within our growing UX team, which meant we were establishing our processes and approach for the first time. It also meant we had limited UXR resources and would have to build our quantitative data tracking from the ground up.

Objectives

The goal was to offer an accessible and intuitive survey-building experience that allows first-time users to easily get started with common tasks, such as adding questions or configuring basic settings.  I sought to achieve this without compromising the efficiency of our advanced users in the long term. This included:

  • Scalable interface
  • Accessible survey building experience
  • Trackable product metrics
  • High quality bar for the MVP, and a short timespan to a Minimal Lovable Product (MLP)
Success Indicators
  • Increase CSAT score of Survey Building Tool
  • VPAT Certification for accessibility

Does this spark joy?

After gathering all this context, I knew that in order to offer a more intuitive experience with these features, a thoughtful structure was crucial. I began to ideate on a better information architecture for the survey builder by applying a Marie Kondo-like approach to its features, using a new interaction model from our platform team. I assessed whether each feature's placement and functionality met user expectations and reorganized as needed. “Does this spark joy?” became “Does this fit here or there based on its form and function?”

I iterated on this exercise several times and exchanged feedback with the platform UX team on how this model might affect the future IA of other product areas as the platform interaction model evolved. Without this parallel work, infusing consistency across the different product areas would have been incredibly difficult.
When in doubt, card sort it out.

Validate the structure

After the card sort on the interaction model, I had a blueprint for low-fidelity wireframes. This made it easy to split the detailed application of the structure between myself and another designer. To ensure that we weren’t designing in a silo and were actually addressing users’ problems, we created higher-fidelity wireframes and collaborated with UX Research. This allowed us to test and confirm that our structural UI changes to the interaction model were indeed creating a more intuitive and enjoyable experience for users.
Mid-fidelity wireframes tested with users
It was hard to teach the students with the other one… I have to make a template, like a picture of it, and then point to each thing… But this is way easier, I think, to navigate.
UXR Usability study 2020 - PSYCHOLOGY PROFESSOR
Some of the tasks users were asked to complete in testing the structure and layout changes.
This seems like a cleaner layout and easier to navigate what’s going on. More modern.
UXR Usability study 2020 - HEAD OF INSIGHT AT A SMALL RESEARCH FIRM

Make a plan of action

Now that we had a better understanding of what we were planning to deliver, we needed to scope a delivery. We resisted the urge to "boil the ocean" and instead prioritized core experiences, working closely with our product manager to define a phased execution. With a clear plan for delivery, I started by reimagining the primary builder canvas to make element configuration more contextual and user-friendly. The prior version had a wealth of customization options, but their organization was inconsistent and often inaccessible to users with assistive technology.
A building canvas that’s intuitive and contextual:
For the configuration panel that housed all the survey building features, I had two main goals. First, the panel should always be contextually relevant to what the user is currently working on in the canvas (the survey question design area). Second, the panel needed to be intuitive and structured, particularly when editing a question. To do this, I made blocks selectable and aligned the organization of controls for both blocks and questions.  

I investigated the parent-child relationships between each control and designed a system for how they build upon each other as needed. By adopting the principles of progressive disclosure, I turned this once chaotic panel into an intuitive guide, flexible enough to elegantly handle all 23 types of questions in the builder.
Organizing the system of interrelated configuration enabled me to deliver this complex panel to my engineer partners in a systematic, more efficient way.
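The parent-child control system described above can be sketched as a small control tree, where child controls are revealed only when their parent is enabled (progressive disclosure). All control names below are illustrative, not the actual Qualtrics controls:

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    """A configuration control; children render only when the parent is enabled."""
    name: str
    enabled: bool = False
    children: list["Control"] = field(default_factory=list)

    def visible_controls(self) -> list[str]:
        """Walk the tree, descending only into enabled branches."""
        visible = [self.name]
        if self.enabled:
            for child in self.children:
                visible.extend(child.visible_controls())
        return visible

# Hypothetical controls for a question's configuration panel
randomization = Control("Randomization", children=[Control("Seed"), Control("Fixed positions")])
panel = Control("Question behavior", enabled=True, children=[randomization])

print(panel.visible_controls())  # ['Question behavior', 'Randomization']
```

Because each question type simply supplies its own tree, one rendering pass handles every configuration panel the same way, which is what made the systematic handoff to engineering possible.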
Rethinking the anatomy of a question for survey builders
On the canvas—the stage where our users craft their surveys—the goal was simplicity and inclusivity. I dissected each question type's anatomy to better understand its components, then reorganized them to prioritize a more appropriate visual and information hierarchy, allowing users to quickly scan their survey or navigate through each element with their keyboard.
Question anatomy iterations included categorizing features and organizing placement based on importance and relationships to each other.
A multiple choice question using the improved question anatomy.
Defining Qualtrics’ first event tracking for data collection
As I delivered the specs for the redesign, I teamed up with our in-house data scientist to align a problem space map to tracking events and put some solid product analytics in place. This marked the first time our organization had implemented event tracking in the product, particularly at this level of detail and complexity. Our approach became the blueprint for how other teams across the organization would build their quantitative data tracking.
Problem space map for Survey Builder which provided a roadmap for event taxonomy. Each question was mapped to a dashboard or chart with the metrics to track the answer.
This system not only provided us with a measurable way to gauge the success of our efforts but also helped us identify gaps and implement iterative improvements during the early customer previews, like leaving certain menus expanded according to usage. It was through this tracking that we learned just how popular page breaks were with users, and we moved them out of an overflow menu and directly into the question canvas.
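The usage counting behind decisions like promoting page breaks can be sketched in a few lines. Event names and figures here are hypothetical, standing in for the real taxonomy that mapped each problem-space question to a tracked metric:

```python
from collections import Counter

class EventTracker:
    """Minimal sketch of usage-event counting for a product analytics pipeline."""

    def __init__(self) -> None:
        self.counts = Counter()

    def track(self, event: str) -> None:
        """Record one occurrence of a named user action."""
        self.counts[event] += 1

    def top_features(self, n: int = 3) -> list[str]:
        """Most-used features, e.g. to decide which menus stay expanded."""
        return [event for event, _ in self.counts.most_common(n)]

tracker = EventTracker()
# Simulated usage: page breaks dominate, so they earn a spot on the canvas
for _ in range(40):
    tracker.track("builder.page_break.added")
for _ in range(12):
    tracker.track("builder.display_logic.opened")
tracker.track("builder.carry_forward.opened")

print(tracker.top_features(1))  # ['builder.page_break.added']
```

In production this kind of counter would live behind an analytics service rather than in memory, but the shape of the decision is the same: rank actions by frequency, then surface the winners.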

Introducing change

When it was time to introduce this new experience to users, we knew it would require a lot of empathy and listening. This was not only one of the most high-visibility areas of the Qualtrics platform, but also one of the most significant UI changes our users would ever experience.

When you reorganize a room, even if everything is placed in its optimal spot, it still takes time to retrain your muscle memory and intuit where things moved. We wanted to help our users make this adjustment to the new survey building experience as seamlessly as possible, so change management of the user experience became a key component of a successful rollout.
Fine-tuning the launch
In collaboration with engineers, we developed a preview program to understand our users' needs during the transition to our redesigned interface. Rather than simply fixing known bugs, we set a high bar for the MVP and integrated both qualitative and quantitative feedback to evolve it into a Minimal Lovable Product.
Using product analytics to discover how users actually used the sections in the Config panel, in order to improve its predictability.
We took it a step further by partnering with UXR for a client study within the preview program, honing in on user proficiency, findability, and navigation. This study not only pinpointed key areas for improvement but also helped us craft a targeted video introduction and timeline for the broader rollout. The insights gathered influenced critical changes, making sure the new experience was as intuitive as possible.

Results

By Q2 of 2021, we successfully rolled out and transitioned all users (~400k MAU) to the new survey builder on time with minimal customer pain. Notably, we found that:

  • By Q3, the Survey Builder's CSAT score went from 3.9 (old experience) to 4.1, the highest CSAT score of any Qualtrics product. For comparison, the entire Qualtrics platform had a CSAT score of 3.8
  • By the end of 2021, we became the first product area to achieve VPAT certification for WCAG 2.0 accessibility
Customer feedback

Once users of the old experience spent time getting familiar with the new survey builder, their response was overwhelmingly positive. We also found we had delivered on an intuitive experience for brand new users as well.

Absolutely love the new ways to move questions and blocks - so much easier! Thanks for the update.
Anonymous in-product feedback
I never used the old version, but in comparison to other survey builders I have used, this one is pretty easy.
Anonymous in-product feedback
The current Survey Builder tool.
Recognition

Qualtrics was named the top leader in G2’s Fall 2022 Grid Report for Survey Platforms, up two points from the previous year (2021).

My takeaways

Feedback is gold
Creating systems and workflows where we could review feedback consistently and directly from customers themselves showed us blind spots we may not have otherwise seen. It’s much easier to know what customers value most (what sparks joy) when they tell you themselves.
Empathy is necessary
Listen, we were asking existing users to change the way they looked at survey building. We knew that in the long run the new experience would be more accessible and usable by a larger group of people, but we also knew that change is hard and that most people are generally averse to it. Employing empathy meant listening to what our users were telling us about their experience with change, and using that data to close experience gaps as they were identified.
Become BFFs with your engineers
The success of this project was largely due to how well the design solutions worked within our complex technical constraints. Building close partnerships with engineering from the start helped me to understand what was realistically possible, and also which constraints were a little more flexible. Having aligned on goals from the start, prioritizing solutions to implement never felt much like a negotiation, but rather a partnership in achieving shared goals.

Final thoughts

Though the journey to this point was long, I can say that I did experience the magic that Marie Kondo promises comes from tidying up. Using the survey builder now sparks joy for me, our customers, and the future teams that want to build new capabilities in Survey Builder.