Benchmarking Clinician Data Access

Evaluating usability of a multi-factor, single sign-on solution.

Summary

A security tech company provides healthcare facilities with a single sign-on solution for securely accessing electronic clinical data through multiple authentication methods (e.g., fingerprint). Our team’s goal was to quantitatively benchmark the product's usability and uncover areas to improve its end-to-end experience.

Our approach involved a pluralistic heuristic evaluation and a summative usability test. The in-depth evaluation generated actionable findings and recommendations, established an overall usability grade, and set critical benchmarks to measure the success of future improvements.

Context

The Problem

In healthcare settings, efficient and secure access to clinical data and information systems is crucial. Yet enrolling in and using advanced security protocols to protect clinical data and electronic medical records (EMRs) requires significant resources, along with help documentation for training and troubleshooting support.

This is a challenge for healthcare practices because it adds another hurdle to EMR adoption and clinical workflows, and adds IT and onboarding costs.

It’s also a challenge for the business, because its associates are too involved throughout adoption and clinical staff must rely too heavily on on-site IT for training and troubleshooting support.

For clinicians, the enrollment experience is frustrating and documentation-heavy, and trouble getting into the system disrupts their ability to do their jobs.

Target Audience

Our audience and testing participants were clinicians or healthcare professionals, such as doctors, nurses, and hospital administrators, who must access clinical information systems and EMRs multiple times a day.

Our Team

We were a consulting group of three UX researchers working with an in-house UX team at a security tech company.

Vision

Our vision was to optimize the usability of a multi-factor, single sign-on product, so that healthcare professionals could more self-sufficiently enroll multiple authentication methods and seamlessly sign in and out with all of them.

By establishing UX benchmarks through usability testing, we aimed to pave the way for future iterations and for measuring how they simplify onboarding and secure data access.

Process

First, we met with the in-house UX team and stakeholders to absorb their knowledge of the product space. Next, we prepared a usability testing proposal outlining problem and vision statements, research questions, methodologies, data and measures, timeline, and deliverables.

Then, we executed our two-phase usability evaluation approach, involving a pluralistic heuristic evaluation followed by a summative usability test.

Throughout, we assessed critical aspects of the user experience while enrolling authentication methods, and signing in and out of a system.

Heuristic Evaluation

Each researcher independently reviewed the sign-in and sign-out flows using all authentication methods, including a proximity card, fingerprint, password, and mobile app code.

We applied Nielsen's usability heuristics and severity rating scale to collaboratively analyze and share findings with the larger team in a report presentation.

Our review of the system helped structure the tests to follow, and our findings served as hypotheses for what we might uncover.

Usability Testing

Planning

We prepared a screening questionnaire for a recruiter to recruit more than 10 testing participants. As participants were being recruited, we planned the study and prepared the moderator’s guide.

Execution

We conducted 11 in-person tests with clinicians (including 1 pilot test) in a controlled usability lab environment. Participants were asked to:

  • Answer pretest questions to better understand who they were.

  • Complete 5-7 tasks with the product.

  • Respond to post-test difficulty ratings and explain their ratings.

  • Answer post-test questions and respond to the system usability scale (SUS) via a survey.

Our team rotated moderating, observing, and notetaking. We captured several performance and preference metrics to set usability and UX benchmarks, including:

  • Task completion within a set time.

  • Time to successfully complete a task.

  • Errors per task.

  • Taps per completed task.

  • Task difficulty rating.

Qualitative data captured through observation and questions added fidelity to these quantitative measures.
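As an illustration, performance benchmarks like those above reduce to straightforward aggregation. The sketch below uses entirely made-up per-participant results and an assumed time limit, not the study's actual data:

```python
from statistics import mean, median

# Hypothetical results for one task across four participants
# (illustrative only, not the study's actual data).
task_results = [
    {"completed": True,  "time_s": 42,  "errors": 0, "taps": 7},
    {"completed": True,  "time_s": 65,  "errors": 1, "taps": 9},
    {"completed": False, "time_s": 120, "errors": 3, "taps": 14},
    {"completed": True,  "time_s": 51,  "errors": 0, "taps": 8},
]

TIME_LIMIT_S = 90  # assumed threshold for "completion within a set time"

# Share of participants who finished within the time limit.
completion_rate = mean(
    r["completed"] and r["time_s"] <= TIME_LIMIT_S for r in task_results
)

successful = [r for r in task_results if r["completed"]]
median_time = median(r["time_s"] for r in successful)  # time to complete
avg_errors = mean(r["errors"] for r in task_results)   # errors per task
avg_taps = mean(r["taps"] for r in successful)         # taps per completed task
```

Median time is often preferred over the mean here because a single struggling participant can skew task times heavily.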

Analysis and Reporting

Our team used a rainbow spreadsheet to analyze our qualitative findings, and collaboratively generate recommendations to address the most significant concerns.

We prepared charts based on calculated benchmarks, such as average errors, average taps, and median time per task, and calculated and visualized a SUS score and grade.
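The SUS score itself follows a standard formula: each of the ten 1-5 Likert items contributes (response − 1) for odd-numbered items and (5 − response) for even-numbered items, and the sum is scaled by 2.5 to a 0-100 score. A minimal sketch, using hypothetical responses rather than our participants' data:

```python
def sus_score(responses):
    """Score one participant's SUS questionnaire (ten 1-5 Likert items).

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response). The sum is scaled by 2.5 to 0-100.
    """
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical responses from one participant (not the study's data):
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 5, 1]))  # → 85.0
```

Individual scores are then averaged across participants; a mean around 68 is commonly cited as average, which is how a raw score maps to a letter grade.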

Our findings and recommendations were shared with the rest of the team in a detailed report presentation.

Result

The full usability evaluation yielded valuable insights into the single sign-on solution's strengths and weaknesses, and benchmarks to measure success of future iterations.

By diving deep into the product’s UX with and without clinical staff, we were able to pinpoint many areas for usability improvements and shed light on opportunities to remedy concerns impacting clinicians’ experience with the product. The in-house UX team used our insights and recommendations to improve that experience in the next iteration.

Lessons Learned

  • Heuristic evaluations are truly an efficient usability method, especially with multiple evaluators, and an effective warm up to in-depth usability testing. The combination is excellent for a full-scale usability and UX check-up.

  • Always recruit more participants than you expect to use; someone will cancel, and having backups at the ready is critical to maintaining forward momentum in the midst of testing.

  • When moderating a session with a participant, not breaking character is extremely difficult, especially when you can’t give hints and something funny occurs.
