

  1. Heuristic Evaluation IS 485, Professor Matt Thatcher

  2. Agenda • Administrivia • Heuristic evaluation

  3. Heuristic Evaluation • Helps find usability problems in a UI design • Can be performed on a working UI or on sketches • Small set (3-5) of evaluators examines the UI • each evaluator independently goes through the UI several times • inspects various dialogue/design elements • compares them with a list of usability principles (heuristics of good interface design) • identifies any violations of these heuristics • evaluators communicate only afterwards (i.e., no interaction during evaluation) and their findings are aggregated • Usability principles --> Nielsen’s heuristics • Use the violations to redesign / fix problems

  4. Heuristics • H2-1: Visibility of system status • H2-2: Match between system and real world • H2-3: User control and freedom • H2-4: Consistency and standards • H2-5: Error prevention • H2-6: Recognition over recall • H2-7: Flexibility and efficiency of use • H2-8: Aesthetic and minimalist design • H2-9: Help users recognize, diagnose, and recover from errors • H2-10: Help and documentation
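As an illustration, these heuristic codes can be kept in a small lookup table so that each recorded problem references its code. The sketch below is a minimal Python example; the NIELSEN_HEURISTICS table and the Finding record are hypothetical names invented here, not part of Nielsen's method.

```python
from dataclasses import dataclass

# Nielsen's ten heuristics, keyed by the codes used in these slides.
NIELSEN_HEURISTICS = {
    "H2-1": "Visibility of system status",
    "H2-2": "Match between system and real world",
    "H2-3": "User control and freedom",
    "H2-4": "Consistency and standards",
    "H2-5": "Error prevention",
    "H2-6": "Recognition over recall",
    "H2-7": "Flexibility and efficiency of use",
    "H2-8": "Aesthetic and minimalist design",
    "H2-9": "Help users recognize, diagnose, and recover from errors",
    "H2-10": "Help and documentation",
}

@dataclass
class Finding:
    """One usability problem noted by one evaluator (hypothetical structure)."""
    heuristic: str    # e.g. "H2-4"
    description: str  # specific problem, listed separately from other problems

finding = Finding("H2-4", 'First screen says "Save", second screen says "Write file".')
print(f"[{finding.heuristic} {NIELSEN_HEURISTICS[finding.heuristic]}] {finding.description}")
```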

  5. Phases of Heuristic Evaluation 1) Pre-evaluation training • give evaluators a list of principles with which to evaluate • give evaluators needed domain knowledge • give evaluators information on the scenario 2) Evaluation • individuals evaluate and then aggregate results 3) Severity rating • determine how severe each problem is (priority) 4) Debriefing • discuss the outcome with the design team

  6. How to Perform Evaluation • At least two passes for each evaluator • first to get a feel for the flow and scope of the system • second to focus on specific elements • If the system is walk-up-and-use or the evaluators are domain experts, then no assistance is needed • otherwise, you might supply evaluators with scenarios • Each evaluator produces a list of problems • explain why each is a problem, with reference to a heuristic or other information • be specific and list each problem separately

  7. Examples • Can’t copy info from one window to another • violates “Recognition over recall” (H2-6) • fix: allow copying • Typography uses a mix of upper/lower-case formats and fonts • violates “Consistency and standards” (H2-4) • slows users down • probably wouldn’t be found by user testing • fix: pick a single format for the entire interface

  8. Aggregate the Results • Take all the lists and aggregate the results into a single list of violations • Eliminate redundancies and make clarifications • You will end up with entries in the following format: Problem # [Heuristic violated] Brief description of the problem found
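As an illustration of this step, the sketch below merges per-evaluator problem lists and drops exact duplicates. In practice the de-duplication and rewording is a judgment call made by a person; the example problems here are invented, and the variable names are assumptions.

```python
# Merge per-evaluator problem lists into one numbered list, dropping exact
# duplicates. Real aggregation also rewords near-duplicates by hand.
evaluator_lists = [
    [("H2-4", 'Says "Save" on screen 1 but "Write file" on screen 2'),
     ("H2-6", "Cannot copy info from one window to another")],
    [("H2-4", 'Says "Save" on screen 1 but "Write file" on screen 2'),
     ("H2-5", "No confirmation before overwriting an existing file")],
]

aggregated = []
seen = set()
for problems in evaluator_lists:
    for heuristic, description in problems:
        key = (heuristic, description)
        if key not in seen:          # eliminate redundancies
            seen.add(key)
            aggregated.append((heuristic, description))

for number, (heuristic, description) in enumerate(aggregated, start=1):
    print(f"{number}. [{heuristic}] {description}")
```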

  9. An Example of Aggregated Results Aggregated List of Violations 1. [H2-4 Consistency and Standards] The interface used the string “Save” on the first screen for saving the user’s file, but used the string “Write file” on the second screen. Users may be confused by this different terminology for the same function. 2. [H2-5 Error Prevention] ...

  10. Severity Ratings • Used to allocate resources to fix the most serious problems • Estimate of the need for further usability efforts • Combination of • frequency, impact, persistence • Should be calculated after all evaluations are in • Should be done independently by all judges

  11. Severity Ratings • 0 - don’t agree that this is a usability problem • 1 - cosmetic problem only • 2 - minor usability problem; fixing this should be given low priority • 3 - major usability problem; important to fix • 4 - usability catastrophe; imperative to fix

  12. Example of Severity Ratings Evaluator #1 1. [H2-4 Consistency and Standards] [Severity 3] The interface used the string “Save” on the first screen for saving the user’s file, but used the string “Write file” on the second screen. Users may be confused by this different terminology for the same function. 2. [H2-5 Error Prevention] [Severity 4] ... Problem # [Heuristic violated] [Severity rating] Problem description

  13. Summary Report Summary Report 1. [H2-4 Consistency and Standards] [Severity 2.7] The interface used the string “Save” on the first screen for saving the user’s file, but used the string “Write file” on the second screen. Users may be confused by this different terminology for the same function. 2. [H2-5 Error Prevention] [Severity 3.3] ... Problem # [Heuristic violated] [Average severity] Problem description
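The averages above come straight from the independent ratings: if, say, three evaluators rated problem 1 as 3, 3, and 2, its average severity is (3 + 3 + 2) / 3 ≈ 2.7. The short sketch below computes the same averages; the specific ratings are made up for illustration.

```python
# Independent severity ratings (0-4) from each evaluator, keyed by problem
# number. The numbers are invented to match the example report.
ratings = {
    1: [3, 3, 2],   # averages to 2.7
    2: [4, 3, 3],   # averages to 3.3
}

for problem, scores in ratings.items():
    average = sum(scores) / len(scores)
    print(f"Problem {problem}: average severity {average:.1f}")
```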

  14. Debriefing • Conduct with evaluators, observers, and development team members • Discuss general characteristics of UI • Suggest potential improvements to address major usability problems • Add ratings on how hard things are to fix • e.g., technological feasibility, time issues, etc. • Make it a brainstorming session • little criticism until end of session

  15. Fix Ratings • Together the team should also assign a fix rating to each usability problem identified in the summary report • How much time, resources, and effort would it take to fix each usability problem? • programmers and techies are crucial here • Fix the important ones (see severity ratings) • Fix the easy ones (see fix ratings) • Make a decision about the rest

  16. Fix Ratings • 0 - Very easy to fix; only takes a few minutes • 1 - Relatively simple to fix; takes a few hours • 2 - Difficult to fix; takes a few days or more • 3 - Impossible to fix

  17. Final Adjustment Final Report for the Heuristic Evaluation 1. [H2-4 Consistency and Standards] [Severity 2.7] [Fix 1] The interface used the string “Save” on the first screen for saving the user’s file, but used the string “Write file” on the second screen. Users may be confused by this different terminology for the same function. 2. [H2-5 Error Prevention] [Severity 3.3] [Fix 0] … Problem # [Heuristic violated] [Avg severity rating] [Fix rating] Problem description
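One reasonable (but not prescribed) way to act on the final report is to order problems by severity first and fix cost second, matching the rule of thumb on the Fix Ratings slide: fix the important ones, fix the easy ones, and decide about the rest. The sketch below shows that ordering with the example values; the sort rule itself is an assumption, not part of the method.

```python
# Each entry: (problem number, heuristic, average severity, fix rating).
# Values mirror the example final report; the ordering rule is an assumption.
final_report = [
    (1, "H2-4", 2.7, 1),
    (2, "H2-5", 3.3, 0),
]

# Highest severity first; ties broken by the cheapest fix.
for problem, heuristic, severity, fix in sorted(
        final_report, key=lambda row: (-row[2], row[3])):
    print(f"{problem}. [{heuristic}] severity {severity}, fix rating {fix}")
```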

  18. Independent Evaluations → Aggregated List of Violations → Independent Severity Ratings → Summary Report with Avg Severity Ratings (SR) → Final HE Report with SR and Fix Ratings

  19. Some Summary Statistics • Number of violations for the entire interface • For each heuristic, list the number of violations • For each evaluator, list the % of violations found • For each evaluator and severity rating, give the % of total violations of that rating found by that evaluator
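A short script like the one below can produce these statistics from the raw findings. Every name and number in it is a placeholder, and "found" here means the evaluator independently listed a problem that appears in the aggregated report.

```python
from collections import Counter

# Aggregated problems (numbered, with the heuristic each violates) and,
# per evaluator, which problem numbers that evaluator found independently.
# All data here is invented for illustration.
problems = {1: "H2-4", 2: "H2-5", 3: "H2-6", 4: "H2-4"}
found_by = {
    "Evaluator 1": {1, 3},
    "Evaluator 2": {1, 2, 4},
    "Evaluator 3": {2, 3, 4},
}

print("Total violations:", len(problems))

# Number of violations per heuristic.
for heuristic, count in Counter(problems.values()).items():
    print(f"{heuristic}: {count} violation(s)")

# Percentage of all violations found by each evaluator.
for evaluator, found in found_by.items():
    print(f"{evaluator}: {100 * len(found) / len(problems):.0f}% of violations found")
```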

  20. Summary • Expert reviews are discount usability engineering methods • Heuristic evaluation is very popular • have evaluators go through the UI twice • ask them to see if it complies with heuristics • note where it doesn’t and say why • combine the findings from 3 to 5 evaluators • have evaluators independently rate severity • discuss problems with design team • alternate with user testing

  21. TRAVELweather Example
