UX Metrics & SUS Tests

Project Type: Usability Test

End User: Hedge fund clients

Project Time: 2 weeks

Product Type: Cloud-based investment application

Team: UX Designers, Product Managers, Software Engineers

“What are the KPIs of UX projects?”

“What’s the financial return for this new design?”

“How can you prove Design A is better than Design B?”

“Where is the UX/UI performance record?”

...

From time to time, the UX team at SS&C Eze heard the questions above from internal stakeholders. My supervisor (the Head of UX) and I shared the same vision: we should translate experience into numbers and measure our designs. Doing so would not only help the UX team communicate better with other stakeholders, but would also make it possible to measure, compare, and track the user experience of the product.

 

I led this project, and our team researched UX metrics and UX measurement solutions together. For the usability testing portion, we ran System Usability Scale (SUS) tests. We presented our research and the SUS test results internally. Below are the details*.

Index

1. Why Do We Test & Measure UX/UI?

 
  • Designs and experiences can be measured

  • Quantify user experience into measurable metrics

  • Benchmark study for comparisons and tracking purposes

  • UX/UI scores can be related to KPIs and financial returns

  • Discover new UX/UI issues & improvement opportunities

2. UX Metrics & Measurements

 

Based on our research, the following UX metrics would apply to the current product:

Quantitative Metrics

  • Task Success Rate

  • Time on Task

  • Use of Search vs. Navigation

  • User Error Rate

SUS

  • System Usability Scale

Qualitative Metrics

  • Expectations and Performance

  • Overall Satisfaction
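
To make the quantitative metrics concrete, here is a minimal sketch showing how Task Success Rate, Time on Task, and User Error Rate could be computed from per-session data. The session records below are hypothetical placeholders, not data from this project:

```python
# Illustrative sketch only: hypothetical session records, not data from this study.
sessions = [
    {"user": "P1", "success": True,  "seconds": 95,  "errors": 1},
    {"user": "P2", "success": False, "seconds": 140, "errors": 3},
    {"user": "P3", "success": True,  "seconds": 80,  "errors": 0},
    {"user": "P4", "success": True,  "seconds": 110, "errors": 2},
    {"user": "P5", "success": True,  "seconds": 102, "errors": 1},
]

n = len(sessions)
task_success_rate = sum(s["success"] for s in sessions) / n   # share of completed tasks
avg_time_on_task = sum(s["seconds"] for s in sessions) / n    # mean seconds per session
user_error_rate = sum(s["errors"] for s in sessions) / n      # mean errors per session

print(f"Task Success Rate: {task_success_rate:.0%}")                   # 80%
print(f"Time on Task:      {avg_time_on_task:.1f} s")                  # 105.4 s
print(f"User Error Rate:   {user_error_rate:.1f} errors per session")  # 1.4
```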

3. System Usability Scale (SUS)

 

Considering cost, ease of administration, and effectiveness, we decided to experiment with the SUS test first.**

  • The System Usability Scale (SUS) is designed to measure perceived usability.

  • It can be used to evaluate a wide range of products and services, including hardware, software, mobile devices, websites, and applications.

  • Its reliability is largely independent of sample size, so SUS can be used with very small samples (as few as two users) and still produce reliable results.

  • SUS has been shown to distinguish between unusable and usable systems as effectively as, or better than, proprietary questionnaires.

  • It has 10 questions, each with five response options ranging from Strongly Disagree to Strongly Agree.

  • Questions 1, 3, 5, 7, and 9 are positively worded; questions 2, 4, 6, 8, and 10 are negatively worded. The wording of every question and the order of the questions are fixed.

[Image: SUS survey questionnaire]

How to calculate and interpret a SUS score:**

  • For odd-numbered items: subtract 1 from the user’s response.

  • For even-numbered items: subtract the user’s response from 5.

  • Add up the converted responses for each user and multiply that total by 2.5.

  • A SUS score can range from 0 to 100, but it is not a percentage.

  • A SUS score above 68 is considered above average; anything below 68 is below average.
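
As an illustration of the scoring rules above, here is a minimal Python sketch (our own calculations were done in Excel; the example responses are made up):

```python
def sus_score(responses):
    """Compute a SUS score from 10 responses on a 1-5 scale.

    responses[0] corresponds to question 1, responses[9] to question 10.
    Odd-numbered questions (1, 3, 5, 7, 9) contribute (response - 1);
    even-numbered questions (2, 4, 6, 8, 10) contribute (5 - response).
    The converted total is multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:          # odd-numbered (positively worded) item
            total += r - 1
        else:                   # even-numbered (negatively worded) item
            total += 5 - r
    return total * 2.5


# Example with a fairly positive (made-up) set of responses
print(sus_score([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]))  # 85.0
```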

4. Usability Test Plan


3 Design Versions of the “Analytics” Feature

  • Version A: Existing design in the live product

  • Version B: New concept I as a clickable prototype

  • Version C: New concept II as a clickable prototype

15 Internal Participants

  • Each version is tested with 5 different participants 

  • The 5 participants should include: 

    • 2 advanced-level users

    • 2 intermediate-level users

    • 1 beginner-level user

  • Tests are run separately with each participant

Same Set of Tasks

  • Each participant needs to finish the same set of tasks

  • Tasks are the goals each participant needs to reach

  • Detailed instructions for tasks should not be provided

Screen Recording 

  • Each participant uses the same laptop for the test

  • Recording starts after the participant has read the tasks

  • Recording ends when all tasks are finished

SUS Questions

  • Created with Microsoft Forms

  • Participants are required to complete the survey immediately after finishing all tasks

Analyze Test Results

  • Collect survey responses

  • Convert survey responses into SUS scores

  • Compare SUS scores

  • Watch and analyze screen recording videos

Conclusion

  • Summarize test results

  • List issues and opportunities

  • Make the design decision

 

5. SUS Survey Results

 

Above are the responses to the SUS survey questions. We translated the responses into numbers and calculated the SUS scores in Excel.

[Images: SUS score calculations for Version A, Version B, and Version C]

We used analysis of variance (ANOVA) to compare the SUS scores. The analysis showed that the SUS scores of the three designs are significantly different.

[Image: Final comparison of SUS scores]
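
The same one-way ANOVA can be sketched in Python as shown below. The scores here are illustrative placeholders rather than our actual data, and the real analysis was done in Excel:

```python
# One-way ANOVA comparing the SUS scores of the three designs.
# The scores below are illustrative placeholders, not the study's data.
from scipy import stats

sus_a = [42.5, 50.0, 47.5, 55.0, 45.0]   # Version A: existing design
sus_b = [85.0, 90.0, 82.5, 87.5, 92.5]   # Version B: new concept I
sus_c = [70.0, 65.0, 72.5, 67.5, 75.0]   # Version C: new concept II

f_stat, p_value = stats.f_oneway(sus_a, sus_b, sus_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A p-value below 0.05 would indicate that the mean SUS scores of the
# three designs are significantly different.
```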

6. Notes from Observations

 
  • Version A (existing design): users said things like “Is there an issue?” and “I hate that. My biggest complaint.” when they did not know how to adjust the Analytics settings.

  • Version A (existing design): 3 out of 5 users refreshed the browser in order to run a new Analytics model; only 2 users knew about the new model button.

  • Version B (new design concept I): one user wished the “+ New” button were more prominent.

  • In both new design versions, advanced users still tried to use the keyboard shortcuts they were familiar with from the existing design.

7. Conclusions

 
  • The current Analytics feature’s usability is poor and should be improved.

  • Version B (new design concept I) is the best UX/UI solution, with usability rated Excellent. The “+ New” button should be made easier to see.

  • Keep the keyboard shortcut behavior that users are familiar with.

*To comply with the non-disclosure agreement, confidential information and details have been replaced or omitted.

** References: 

Copyright © 2020 Xiaonan Ma.