In-Person, Moderated, Benchmark Usability Testing
Skills Used: Protocol Writing, Usability Test Moderating, Research Analysis, Video Editing, Report Writing
Tools Used: Webex, Camtasia, IPEVO camera
The client, a regional healthcare system, recently launched a redesigned website. They had completed baseline usability testing to inform the redesign and wanted follow-up benchmark testing to evaluate the new site for ease of use, navigation, and aesthetic appeal.
I moderated 8 in-person usability tests and supported research analysis and report writing.
Immediate recommendations and forward-looking considerations to improve the website's usability, navigation, and feedback.
What does the client want to know?
Writing Research Questions and a Research Protocol
In order to get the most out of usability testing, I had to understand the research goals. The client’s goal was to improve marketability of their healthcare services through an updated, easy-to-use website.
My research questions included:
How usable is the website?
Are the navigation and layout easy for users to understand?
Is the website aesthetically pleasing?
Does the website include features that would attract new patients to the health system?
I updated the baseline usability protocol with the revised research questions, and ensured the tasks and probing questions aligned to answer those questions.
Show me how you do that
The tasks were designed around the four most important, redesigned features including:
- Find a Doctor
- Make an Appointment
- Pay a Bill
- Find a Location
The tasks included a brief setup of the context and a request, such as:
Say you're a patient at Clinic B and you need to refill a prescription. Show me how you would find the phone number to do that.
Conducting the tests
I conducted in-person usability testing in the regional area covered by the client. Participants represented:
- A mix of genders
- A mix of ages: 30-65
- A mix of education levels: high school diploma through master's degree
- A mix of household incomes: $35,000 to over $100,000/year
I moderated the sessions while two team members and the client watched and took notes remotely through Webex. Participants completed tasks on both desktop and mobile. I used an IPEVO camera to record participants completing tasks on their mobile phones and the entire test was recorded on Camtasia.
Analyzing the Results
My team organized our notes by key findings and takeaways. Our key takeaways document was organized into three sections:
- What Went Well
- Success Matrix
- Areas for Improvement
Better Usability but Room for Improvement
What We Found
Overall, participants found the website easy to use and attractive. Most participants were able to complete the majority of tasks. Users said the website was "clean," "easy to use," and "user friendly." While user feedback was positive, there were still areas that needed improvement.
What did not work
Users were unable to find menu items that were listed in a second layer of a "supermenu."
Two of the site's search functions did not give users enough feedback as the results updated. Users also struggled because the search did not return results or suggestions for misspelled words or plain-language search terms.
Users did not recognize that many elements, including buttons, links, and profile pictures, were clickable.
We used common pain points and task success rates to make recommendations:
The final report recommended that the search features be redesigned to include above-the-fold feedback to show users that the search results had changed.
Outcomes and Implementation
Our recommendations are being implemented as time and budget allow:
- Include hover states for icons and buttons to show users they are clickable.
- Include feedback above the fold to tell users when search results are updating.
- Include autofill features and search results that account for misspelled or plain-language searches.
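To illustrate the misspelling-tolerant search we recommended, here is a minimal sketch using Python's standard-library difflib; the page titles and the similarity cutoff are hypothetical stand-ins, not the client's actual implementation:

```python
from difflib import get_close_matches

# Hypothetical index of page titles a site search might cover
PAGES = ["Find a Doctor", "Make an Appointment", "Pay a Bill", "Find a Location"]

def suggest(query: str, cutoff: float = 0.5) -> list[str]:
    """Return page titles that closely match a possibly misspelled query.

    get_close_matches ranks candidates by similarity ratio and drops any
    below the cutoff, so typos still surface relevant suggestions.
    """
    return get_close_matches(query, PAGES, n=3, cutoff=cutoff)

# A misspelled query still surfaces the intended page as the top suggestion
print(suggest("Fnd a Docter"))
```

In practice a production search would pair fuzzy matching like this with synonym handling for plain-language terms (e.g., mapping "heart doctor" to "cardiologist"), which simple string similarity alone cannot do.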
What I Learned
Prior to usability testing, my team and I discussed Nielsen Norman Group’s article, Talking with Participants During a Usability Test. While moderating, I focused on minimizing my affirmations, giving participants time to talk through their thought process, and asking them to elaborate and explain their actions.
My key takeaways:
- Take notes as you watch participants move through the site. Notes on their task path make it easier to recreate their experience and to ask additional probing questions.
- Do not lead participants. As participants move through the site, give them space to ask questions and try things. If they ask a question about the site or features, reflect that question back to them.
- Note things remote observers won't notice. Because I was moderating in person, I could note participants' motions, such as leaning in toward the screen to find something or shaking their head while looking for a page. Additionally, because I was viewing the screen from a distance, I was able to identify issues with contrast and layout. These notes proved helpful in writing the final report.