Traditionally, UX researchers like myself have relied heavily on qualitative research methods to gather feedback from users. That means combing through hours of videos of users interacting with the product, making copious notes, and whittling the information down to actionable items for your coworkers.
The process is time-consuming, however, and makes it difficult to track common themes across multiple tests. That’s why UserTesting decided to incorporate survey-style questions into our arsenal of testing techniques for subscription clients. We call these survey-style questions Customer Experience Analytics, and there are three kinds:
- Multiple choice
An example multiple choice question
- Rating scale
An example rating scale question
- Written answer
An example written answer question
There are a lot of positives to using these questions in your usability tests, but here are the three big ones:
- They save you time
- They improve the quality of your research
- They make it easier to share study results with others
How Customer Experience Analytics save you time
Need to find out what words or phrases testers would use to describe your app? Written-response questions are your best friend. Want to settle a team debate over which version of your homepage appeals to more users? Multiple-choice answers make the answer quick and clear. Curious if users think a drop-down menu makes it easier to find something than a list? Have them try it both ways and rate the task’s ease or difficulty on a 5-point scale.
Customer Experience Analytics save you time by letting you quickly identify the most important clips to watch.
While the UserTesting Research Team still recommends watching the videos to gain a more detailed understanding of users’ choices, responses to these survey-style questions can be viewed almost instantly, which saves you time and helps your team make swift decisions. These answers also indicate which users struggled most and least with your product, so you can focus your qualitative analysis on those users instead of watching every video from start to finish.
How Customer Experience Analytics improve the quality of your research
Unlike the results you get from traditional surveys, which bring you lots of numbers and very little context, asking these survey-style questions during a user test recording gives your users a chance to explain WHY they gave a low rating or chose a negative multiple-choice response.
Their answers often concisely highlight certain problem areas on the site, like when a user says, "I'm only going to give this a three, because even though I found the product I was looking for, it took me three tries to get the right category."
Their answers also might highlight confusing questions, for example: “I’m not sure if you’re asking about the colors of the icons or the whole site, but…”
Either way, your research will flourish with the addition of Customer Experience Analytics. And you won’t be the only one who sees it! Let’s talk about...
How Customer Experience Analytics make it easier to share the results of your study with others
Most of the time, our clients are responsible not only for running user studies, but for passing on the insights they’ve gained to their team of developers, designers, and so on. While clips and notes can be powerful illustrations of usability issues, the Research Team has learned to supplement our qualitative findings with charts and graphs that provide “big picture” information in an easily digestible way. For the chart below, we used multiple-choice questions to let users indicate which of two versions of a site they preferred.
In this example, there was a clear preference for the live site.
So if you’re having trouble displaying results to your team (or your boss), a few survey-style responses can improve your report and impress upon your team just how actionable an issue is.
Guidelines for use
Since their implementation, these survey-style questions have proven to be valuable and versatile tools for the Research Team at UserTesting. As we began to put them to use, however, we realized that the feedback they collected from users was most constructive when we followed some basic guidelines. Today we’d like to share some of those guidelines with you.
Guideline 1: Set specific objectives for each study, and keep them close by
As with any usability study, we recommend that you create concrete questions and objectives to focus and streamline your study. Write them down before you craft a single task, and keep them visible throughout the test process to make sure your task writing, analysis, and report remain focused on the same issues.
Keeping your testing objectives handy will keep your study focused.
Guideline 2: Use analytics questions in moderation
Space your questions out with simple instructions. While it’s tempting to lean on the shiny survey-style questions, steering the user successfully through the test is still important. Signposts like “the Support button at the bottom right of the page” or the URLs of specific pages ensure that users are providing feedback on the right element of your product. It’s also tempting to skip the videos when quantitative answers are generated for you, but those answers aren’t worth much if the user was reviewing the wrong thing.
Guideline 3: Keep users engaged
When these questions were first implemented, the Research Team went to town with them; we worked them into tests as often as possible! But we quickly found that it's important to keep your questions and tasks balanced. For example, we noticed that if we put too many rating-scale questions into our tests, users became focused on providing the required answers and stopped talking out loud about why they selected them, which is what we really want in a usability study!
You can avoid this user fatigue by alternating the question types within each test. For example, if your objective is to determine how helpful users consider a specific feature of your app, have them:
- Use the feature (Basic task)
- Describe this feature in 3 words or phrases (Written-response question)
- Indicate if the tool is extremely helpful, somewhat helpful, or not at all helpful (Multiple-choice question)
- Rate how likely they would be to use the feature if they needed help (Rating-scale question)
It never hurts to have more than one style of answer to work with, and it keeps the user thinking carefully about what’s being asked of them.
Guideline 4: Keep the questions simple
The easiest way to avoid frustrating the user is to keep the questions simple. Again, this is something the Research Team recommends for all tests, but it’s especially important for survey-style questions.
Consider this scenario: You’re a tester who’s just been instructed to find an item that you would like to buy from Macys.com and then add it to the cart. You easily find an item, but have trouble with the size and color selector, which makes adding the item to the cart downright frustrating. You’re then confronted with a single question asking you to rate, on a 5-point scale, how easy it was to find an item and add it to your cart.
Now you’re in a pickle; you would rate finding the item as a 5, but adding the item to the cart as a 1. You can’t proceed with the test until you provide an answer, and the clock is ticking away.
Many users will split the difference and answer 3, but the person who made the test would have gotten much more value out of breaking up the question into two simpler tasks: first rating the ease of finding an item, then the ease of adding that item to the cart. Simple.
So there you have it: guidelines straight from the Research Team to help you make the most of UserTesting’s Customer Experience Analytics. They have proven to be valuable and versatile tools for us, and now you can confidently add them to your testing toolkit in future studies!
If you'd like to see this article broken down into an easy-to-digest deck, check out our SlideShare presentation: Streamline Your UX Research with Customer Experience Analytics from UserTesting.