Context: Microsoft's Accessibility Insights for Web is a tool that helps web developers evaluate and improve webpage accessibility through comprehensive manual and automated tests.
Team: Microsoft Product Manager, 3 UX Designers, 1 UX Researcher
Valuable takeaways: I gained insights into conducting user testing with visually impaired users and learned about web accessibility and screen reader considerations.
Role: UX Research/Design
Timeline: 13 weeks
Our industry partner, acting as the product manager, provided us with high-level insights from internal Microsoft research. Drawing on this research, we defined our project scope, established why our target users were the right focus, and prioritized the features to design.
For each target user group, we documented their characteristics, their motivations, and the significance of selecting these users.
I analyzed online reviews and existing literature to understand usability issues within the tool. Our objective was to pinpoint the task flows and features most prone to these issues.
Leveraging my background in front-end development, I walked through the overall assessment tool, ran assessments using the main assessment flow, and annotated our findings in Figma. Our goal was to generate hypotheses about the task flows critical for novice developers and anticipate the challenges they might face.
I assisted in formulating survey questions aimed primarily at recruiting developers with diverse levels of accessibility knowledge for interviews. We also sought to learn which accessibility evaluation tools developers currently use for web assessment.
We conducted a competitive analysis of assessment tools across various platforms and pricing models to understand the strengths and limitations of other tools on the market.
We conducted a learnability study with novice accessibility developers. Each participant completed three 30-minute trials, performing the same three tasks in every trial; these tasks corresponded to the task flows most critical for novice developers. The three tasks were:
Onboarding onto the tool and exploring its features
Conducting an assessment to get started
Reporting the accessibility issues
We recorded time to complete each task, the number of errors, and participants' think-aloud comments, and collected System Usability Scale (SUS) responses at the end of each session. A post-study semi-structured interview helped clarify the observed behavior. Our aim was to measure and plot the tool's learnability across trials: using this mixed methodology, we plotted time-on-task and error graphs alongside SUS scores and interpreted them against the qualitative feedback, which allowed us to pinpoint the specific usability issues affecting learnability.
Learnability considers how easy it is for users to accomplish a task the first time they encounter the interface and how many repetitions it takes for them to become efficient at that task.
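To illustrate how this kind of quantitative data can be analyzed, the sketch below shows one way to score SUS responses and plot a simple learnability curve of mean time on task per trial. The scoring follows the standard SUS formula; the plotting library, task names, and all numbers are assumptions for illustration only, not the study's actual data or analysis code.

```python
# Hypothetical analysis sketch: SUS scoring and a simple learnability curve.
# All sample numbers below are made up for illustration, not study data.
import matplotlib.pyplot as plt

def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled
    by 2.5 to yield a 0-100 score.
    """
    assert len(responses) == 10
    contributions = [
        (r - 1) if (i % 2 == 0) else (5 - r)  # items 1, 3, 5, ... sit at even indices
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: one participant's responses after the final trial (made-up numbers).
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0

# Mean time on task (minutes) per trial for the three tasks (made-up numbers).
trials = [1, 2, 3]
mean_times = {
    "Onboarding": [9.0, 6.5, 5.0],
    "Assessment": [12.0, 9.0, 7.5],
    "Reporting": [8.0, 6.0, 5.5],
}

for task, times in mean_times.items():
    plt.plot(trials, times, marker="o", label=task)
plt.xticks(trials)
plt.xlabel("Trial")
plt.ylabel("Mean time on task (min)")
plt.title("Learnability curve across trials")
plt.legend()
plt.show()
```

Plotting per-task curves like this makes the learnability trend visible at a glance: a flattening curve suggests participants have become efficient, while a task whose times stay high across trials points to a persistent usability issue worth investigating in the qualitative data.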