CHI99 Panel Comparative Evaluation of Usability Tests


Presentation Transcript


  1. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Rolf Molich, DialogDesign, Denmark, molich@dialogdesign.dk

  2. CHI99 Panel Comparative Evaluation of Usability Tests. Take a web-site. Take nine professional usability teams. Let each team usability test the web-site. Are the results similar?

  3. What Have We Done? • Nine teams have usability tested the same web-site • Seven professional teams • Two student teams • Test web-site: www.hotmail.com, a free e-mail service

  4. Panel Format • Introduction (Rolf Molich) • Five-minute statements from five participating teams • The Customer’s point of view (Meeta Arcuri, Hotmail) • Conclusions (Rolf Molich) • Discussion - 30 minutes

  5. Purposes of Comparison • Survey the state of the art in professional usability testing of web-sites • Investigate the reproducibility of usability test results

  6. NON-Purposes of Comparison • To pick a winner • To make a profit

  7. Basis for Usability Test • Web-site address: www.hotmail.com • Client scenario • Access to client through intermediary • Three weeks to carry out test

  8. What Each Team Did • Run standard usability test • Anonymize the usability test report • Send the report to Rolf Molich

  9. Problems Found • Total number of different usability problems found: 300 • Found by seven teams: 1 • Found by six teams: 1 • Found by five teams: 4 • Found by four teams: 4 • Found by three teams: 15 • Found by two teams: 49 • Found by one team only: 226 (75%)
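
  To make the overlap figures concrete, here is a quick arithmetic check (a Python sketch, not part of the original slides) that reproduces the slide's totals from the per-team counts:

    # Distribution of the 300 usability problems by how many of the
    # nine teams reported each one (figures from the slide above).
    found_by = {7: 1, 6: 1, 5: 4, 4: 4, 3: 15, 2: 49, 1: 226}

    total = sum(found_by.values())
    print(total)                         # -> 300 distinct problems
    print(f"{found_by[1] / total:.0%}")  # -> 75% reported by a single team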

  10. Comparative Usability Evaluation 2 • Barbara Karyukina, SGI (USA) • Klaus Kaasgaard & Ann D. Thomsen, KMD (Denmark) • Lars Schmidt and others, Networkers (Denmark) • Meghan Ede and others, Sun Microsystems, Inc., (USA) • Wilma van Oel, P5 (The Netherlands) • Meeta Arcuri, Hotmail, Microsoft Corp. (USA) (Customer) • Rolf Molich, DialogDesign (Denmark) (Coordinator)

  11. Comparative Usability Evaluation 2 • Joseph Seeley, NovaNET Learning Inc. (USA) • Kent Norman, University of Maryland (USA) • Torben Norgaard Rasmussen and others, Technical University of Denmark • Marji Schumann and others, Southern Polytechnic State University (USA)

  12. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Barbara Karyukina, SGI, Wisconsin, USA, barbarak@sgi.com

  13. Challenges: Twenty functional areas + user-preference questions

  14. Possible Solutions: • Two usability tests • Surveys • User notes • Focus groups

  15. Results: 26 tasks + 10 interview questions; 100 findings

  16. Challenges: Twenty functional areas + user-preference questions

  17. Problems Found • Total number of different usability problems found: 300 • Found by seven teams: 1 • Found by six teams: 1 • Found by five teams: 4 • Found by four teams: 4 • Found by three teams: 15 • Found by two teams: 49 • Found by one team only: 226 (75%)

  18. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Klaus Kaasgaard, Kommunedata, Denmark, kka@kmd.dk

  19. Slides currently not available

  20. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Lars Schmidt, Framtidsfabriken Networkers, Denmark, ls@networkers.dk

  21. Team E Framtidsfabriken Networkers Testlab, Denmark

  22. Key learnings from CUE-2 • Setting up the test: insist on dialogue with the customer; secure a complete understanding of user groups and user tasks; narrow down test goals • Writing the report: use screen dumps; state conclusions - skip the premises; test the usability of the usability report

  23. Improving Test Methodology • Searching for usability and usefulness • Combine with other methodologies (e.g. interviews) • Focus on web-site context • Test against e.g. Yahoo Mail • Test against software-based e-mail clients

  24. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Meghan Ede, Sun Microsystems, California, USA, meghan.ede@sun.com

  25. Hotmail Study Requests • 18 Specific Features • e.g. Registration, Login, Compose... • 6 Questions • e.g. "How do users currently do email?" • 24 Potential Study Areas

  26. Usability Methods • Expert Review • 6 Reviewers • 6 Questions • Usability Study • 6 Participants (3 + 3) • 5 Tasks (with sub-tasks)

  27. Report Description • 1. Executive Summary: 4 main high-level themes; brief study description • 2. Debriefing Meeting Summary: 7 areas (e.g. overall, navigation, power features, ...) • 3. Findings: 31 sections - study requests, extra areas, bugs, task times, study Q & A • 4. Study Description • Total: 36 pages, 150 findings

  28. Lessons Learned • Importance of close contact with product team • Consider including: • severity ratings • more specific recommendations • screen shots

  29. Discussion Issues • How can we measure the usability of our reports? • How should we deal with the difference between the number of problems found and the number included in the report?

  30. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Wilma van Oel, P5, The Netherlands, w.vanoel@p5-adviseurs.nl

  31. Wilma van Oel, P5 adviseurs voor produkt- & kwaliteitsbeleid (quality & product management consultants), Amsterdam, the Netherlands

  32. Structure of Presentation • 1. Introduction • 2. Deviations in approach • Test design • Results and recommendations • 3. Lessons for the future • Change in approach? • Was it worth the effort?

  33. Introduction • Company: P5 Consultants • Personal background: psychologist

  34. Test design • Subjects: n=11, pilot, ‘critical users’, 1-hour session • Data collection: logging software, video recording • Methods: lab evaluation + informal approach • Techniques: exploration, task execution, think-aloud, interview, questionnaire • Tool: SUS
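
  The SUS questionnaire named above has a fixed, widely published scoring rule. As a minimal illustration (a Python sketch, not part of the original slides; the example responses are invented), the standard computation looks like this:

    def sus_score(responses):
        # SUS: ten Likert items, 1 (strongly disagree) .. 5 (strongly agree).
        # Odd-numbered items are positively worded and contribute (r - 1);
        # even-numbered items are negatively worded and contribute (5 - r).
        # The summed contributions are scaled by 2.5 onto a 0-100 range.
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS needs ten responses in the range 1-5")
        contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                         for i, r in enumerate(responses)]  # index 0 = item 1
        return sum(contributions) * 2.5

    # Example: one fairly positive (invented) respondent
    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0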

  35. A Test Session

  36. Results and recommendations

  37. Lessons for the future • Change in approach? • Methods: add a usability inspection method • Procedure: extensive analysis, add session time • Results: less general, severity? • Was it worth the effort? • Company: to get experience & benchmarking • Personally: to improve skills, knowledge

  38. CHI99 Panel Comparative Evaluation of Usability Tests. Presentation by Meeta Arcuri, Microsoft Corporation, California, USA, meeta@hotmail.com

  39. CUE - 2 The Customer’s Perspective Meeta Arcuri User Experience Manager Microsoft Corp., San Jose, CA

  40. Customer Summary of Findings • New findings: ~4% • Validation of known issues: ~67% (previous findings from our lab tests; findings from ongoing inspections) • Remainder: beyond Hotmail usability (business reasons for not changing; out of Hotmail’s control - partner sites; problems generic to the web)

  41. Report Content: Positive Observations • Quick-and-dirty results • Recommendations for problem fixes • Participant quotes - convey the tone/intensity of feedback • Exact number of participants who encountered each issue • Background of participants • Environment (browser, connection speed, etc.)

  42. Additional Strengths of Reports • Fresh perspectives • Lots of data on non-US users • Recommendations from participants • Trend reporting • Report of outdated material on site (some help files) • Appreciate positive findings, comments

  43. Report Content: Weaknesses • Some recommendations not sensitive to web issues (performance, security) • At least one finding irreproducible (fields not preserved in the registration form) • Reported issue frequency was sometimes vague • Some descriptions terse or vague - had to be deciphered

  44. How Hotmail Will Use Results • Cross-validate new findings with Hotmail Customer Service reports • Lots of good data to cite in planning meetings • Some good recommendations given by labs and participants
