Toolkit Support for Usability Evaluation 05-830 Spring 2013 – Karin Tsai
Overview • Motivation • Definitions • Background from Literature • Examples of Modern Tools
Motivation • Improve or validate usability • Compare products (A/B tests, etc.) • Measure progress over time • Verify adherence to guidelines or standards • Discover features of human cognition
Usability Attributes • Learnability – easy to learn • Efficiency – efficient to use • Memorability – easy to remember how to use • Errors – low error rate; easy to recover • Satisfaction – pleasant to use and likable
Evaluation Categories • Predictive • psychological modeling techniques • design reviews • Observational • observations of users interacting with the system • Participative • questionnaires • interviews • “think aloud” user-testing
Challenges and Tradeoffs • Quality vs. Quantity • “Quality” here means abstraction, interpretability, etc. • User testing – high quality; low quantity • Counting mouse clicks – low quality; high quantity • Observing context • Abstraction • Event reporting in applications places a burden on developers • Complicates software evolution
CogTool • Evaluation Type: Predictive • Description: Uses a predictive human performance model (“cognitive crash dummy”) to evaluate designs.
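CogTool's predictions build on the Keystroke-Level Model (KLM) of Card, Moran, and Newell, which sums standard operator times to estimate how long a skilled user needs for a task. A minimal sketch of the idea (the operator times are the classic published estimates; the task sequence is a made-up example, not CogTool's actual output):

```python
# Classic KLM operator time estimates in seconds
# (Card, Moran & Newell). A real tool like CogTool uses a far
# richer cognitive model; this only illustrates the principle.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke or button press (average skilled typist)
    "P": 1.10,  # point with the mouse at a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(sequence: str) -> float:
    """Predicted task time for a sequence of KLM operator codes."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: think, point at a field, home to the
# keyboard, then type four characters.
print(round(predict_time("MPHKKKK"), 2))  # → 3.97
```

The appeal of such a predictive model is that it needs no users at all: a design can be scored before anything is built.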
CogTool Overall Score: 6.5/10
Mixpanel • Evaluation Type: Observational • Description: Aggregates developer-defined event data in useful ways.
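The core of the Mixpanel approach is that developers instrument their code with named events, and the tool aggregates them into usable metrics. A toy self-contained sketch of that pattern (the class, event names, and users are hypothetical, not Mixpanel's API):

```python
from collections import Counter

class EventLog:
    """Toy aggregator for developer-defined events,
    illustrating the Mixpanel-style observational approach."""

    def __init__(self):
        self.events = []

    def track(self, user, name, **props):
        # Developers call this at interesting points in the app.
        self.events.append({"user": user, "name": name, **props})

    def counts(self):
        # How many times did each event fire?
        return Counter(e["name"] for e in self.events)

    def unique_users(self, name):
        # How many distinct users triggered this event?
        return len({e["user"] for e in self.events if e["name"] == name})

log = EventLog()
log.track("u1", "signup", plan="free")
log.track("u1", "upload")
log.track("u2", "upload")
print(log.counts()["upload"], log.unique_users("upload"))  # → 2 2
```

This also makes the "abstraction" tradeoff from earlier concrete: every `track` call is code the developers must write and keep in sync as the application evolves.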
Mixpanel Overall Score: 9.5/10
Chartbeat • Evaluation Type: Observational • Description: Real-time data visualization.
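A typical real-time metric behind dashboards of this kind is "visitors active in the last N seconds", computed over a sliding window of pings. A minimal sketch under that assumption (the class name, window length, and timestamps are hypothetical, not Chartbeat's actual mechanism):

```python
from collections import deque

class ConcurrentVisitors:
    """Sliding-window count of distinct visitors seen in
    the last `window` seconds -- a sketch of one real-time
    metric a dashboard like Chartbeat might display."""

    def __init__(self, window=15.0):
        self.window = window
        self.hits = deque()  # (timestamp, visitor_id), in time order

    def ping(self, t, visitor):
        # Each page view / heartbeat from a visitor lands here.
        self.hits.append((t, visitor))

    def current(self, now):
        # Evict pings that fell out of the window, then count
        # the distinct visitors that remain.
        while self.hits and self.hits[0][0] < now - self.window:
            self.hits.popleft()
        return len({v for _, v in self.hits})

cv = ConcurrentVisitors(window=15.0)
cv.ping(0.0, "a"); cv.ping(5.0, "b"); cv.ping(12.0, "a")
print(cv.current(14.0))  # → 2 (both "a" and "b" pinged recently)
print(cv.current(28.0))  # → 0 (every ping has aged out)
```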
Chartbeat Overall Score: 7/10
User Testing • Evaluation Type: Participative • Description: Watch a user complete a task on your system while thinking aloud.
User Testing Overall Score: 8.5/10