
Code Inspections and Heuristic Evaluation



  1. Code Inspections and Heuristic Evaluation

  2. Objectives • Today • The inspection process • Practice inspection • Heuristic evaluation process • Practice evaluation • Next time • Rationale behind why inspections and heuristic evaluation work so well • Normally we would do this in the opposite order, but this way you should be able to better prepare any materials over the weekend

  3. Presentation Order • Email me your materials (code to review, or, if doing the heuristic evaluation, software to run if possible) by the Friday prior to your presentation if you are going on Tuesday, or by the Monday prior if you are going on Thursday • Tuesday 3/25: Curtis Conner, Devin Lyons, Matt Grimm, Scott Mahar • Thursday 3/27: Devin Homan, Dean Sawyer, Hector Sanchez, Joshua Tester • Tuesday 4/1: Richard To, Britny Herzog, Bobby Porter, Robert Bailey

  4. Code Inspection / Fagan Inspection • Definition • A formal review of a work product by peers. A standard process is followed with the purpose of detecting defects early in the development lifecycle. • Can inspect many different kinds of documents • We will focus on just the code

  5. Defects • Inspections are used to find defects • A defect is a deviation from specified or expected behavior: something wrong, missing information, a common error, a standards violation, ambiguity, inconsistency, a perception error, or a design error

  6. A defect is a defect • A defect is based on the opinion of the person doing the review • This means that any defect that is found IS a defect • Not open to debate • Not all defects are necessarily bugs • Many defects may not be “fixed” in the end • No voting or consensus process on what is a defect • How to fix a defect should be debated later, not when the defects are logged

  7. What should be inspected? • For existing code or documentation, select • The most critical piece to the program’s operation • Most used section • Most costly if defects were to exist • Most error-prone • Least well-known • Most frequently changed • For new code or documentation • 20% <= inspect <= 100%

  8. Our Inspection Exercise • Individual work: owner planning (30-60 mins) • Individual work: inspectors review code (20-60 mins) • Group in class: owner introduction (2-5 mins) • Group in class: inspect and log defects (10-15 mins) • Individual work: owner rework (? mins)

  9. Owner Planning • Owner decides what code/documents to review • Copy of the code listing for everyone • Send me the code by the class prior to the inspection date and I'll post it on the calendar page for everyone to get • Code should be numbered by line • Not all the code, just the selected code (see the previous slide on “What should be inspected?”) • Up to the owner's discretion as to what/how much, but we will stop after 20 minutes • Probably about 2-3 pages

  10. Preparation • Each inspector should have the materials to inspect in advance • Identify defects on their own to ensure independent thought • Note defects and questions • Complete a defect log, rating each defect High/Medium/Low • Without this preparation, a group review might find only 10% of the defects that could otherwise be found (Fagan) • Rule of thumb: 2 hours for 10 full pages of text

  11. Common Defects • Mistakes you’ve made in the past • Anything we discussed in class • Code techniques • E.g. variable names, location, initialization, refactoring, defensive programming, error checking, magic numbers, loop length, etc. • Security • Usability • Etc. • Similar issues apply to other languages
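  A short C fragment can illustrate several of these defect categories at once (hypothetical code, not from any posted review):

    int proc(int *d)                    /* vague names: proc, d */
    {
        int s = 0;
        for (int i = 0; i < 100; i++)   /* magic number: why 100? */
            s += d[i];                  /* no defensive check that d is non-NULL
                                           or actually holds 100 elements */
        return s / 100;                 /* repeated magic number; integer
                                           division silently truncates */
    }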

  12. Inspection Day • Prior to inspection • Code has already been posted • Inspectors have prepared by inspecting the code and noting their defects • Inspection process • Owner provides brief introduction for code • Round-robin where each inspector describes a defect found or passes if no defects noted • Might find new defects during the inspection exercise • Total of 10-20 minutes in our exercise • Scribe writes down defects in the defect log

  13. Defect Logging • Rate each entry High, Medium, Low, or Question • Keep the description brief: about seven words, or just enough that the owner understands • If possible, resolve questions on the spot: defect or not • Also log defects found in • The parent document, e.g. requirements • The common errors list • Work product guidelines • It will be up to the work owner whether or not to fix a defect
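  A sketch of what a few logged entries might look like (the column layout here is assumed, not prescribed):

    Severity  Where                 Description
    H         code, line 31         index from user input never validated
    M         code, line 54         duplicated logic; candidate for refactoring
    L         requirements, §2.3    “fast response” given no numeric target
    Q         code, line 78         is locale-dependent sorting intended?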

  14. Inspection Example • Requirement: Support authentication based upon user@host using regular expressions

     1 /*********************************************************
     2  * Returns a 1 if the user is on the ops list, and
     3  * returns a 0 if the user is not on the ops list.
     4  *********************************************************/
     5 int Authorized(char *user)
     6 {
     7   FILE *f;
     8
     9   f=fopen(OPSPATH,"r"); /* open authorized file */
    10   while (fgets(tempstr,80,f)!=NULL)
    11   {
    12     tempstr[strlen(tempstr)-1]='\0'; /* annoying \r at end */
    13     if (!fnmatch(tempstr,user,FNM_CASEFOLD)) { fclose(f); return(1); }
    14   }
    15   fclose(f);
    16   return(0);
    17 }

  (Slide callouts: line 9 opens the file containing the operators; line 13 returns true if the wildcards match.)
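  An inspector preparing this fragment might log entries like the following (one reading of the code; the severities are judgment calls):

    H  line 9    return value of fopen() never checked; f may be NULL
    H  line 10   tempstr is not declared in the listing; does its size match the 80?
    M  line 12   last character is stripped even when it is not a newline; an empty string would underflow the index
    M  line 13   fnmatch() does wildcard matching, not regular expressions (mismatch with the requirement)
    Q  line 13   is case-insensitive matching (FNM_CASEFOLD) intended?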

  15. Defect Log

  16. In-Class Exercise • Take 5-10 minutes to find defects in the code posted online • http://www.math.uaa.alaska.edu/~afkjm/cs470/handouts/CodeReview.pdf • This is C# code that highlights the location of my pen on the tablet screen • We will then do a short round-robin to note defects

  17. Defect Log

  18. Heuristic Evaluation: “Discount” Usability Testing

  19. Heuristic Evaluation • Developed by Jakob Nielsen • Helps find usability problems in a UI design • Small set (3-5) of evaluators examine UI • independently check for compliance with usability principles (“heuristics”) • Can also refer to any of the GUI Bloopers we covered • different evaluators will find different problems • evaluators only communicate afterwards during meeting • findings are then aggregated • Can perform on working UI or on sketches

  20. Jakob Nielsen’s Heuristics • Aesthetic and minimalist design • Match between system and real world • Recognition rather than recall • Consistency and standards • Visibility of system status • User control and freedom • Flexibility and efficiency of use • Help users recognize, diagnose, and recover from errors • Error prevention • Help and documentation

  21. Evaluation Day • Similar to code inspection • Prior to evaluation • Ideally, the program has already been posted (by class prior to the inspection) and inspectors have prepared by running the program and noting issues • This may not be possible depending upon the nature of your project. If so, you may give an in-class “demo” and do evaluation on the fly • Evaluation process • Owner provides brief introduction for the program • Round-robin where each evaluator describes an issue found or passes if no defects noted • Might find new issues during the exercise • Total of 15-20 minutes in our exercise • Scribe writes down issues in the issue log

  22. Example Problem Descriptions • Have to remember command codes • Violates “Minimize the users’ memory load” (H3) • Fix: add drop down box with selectable codes • Typography uses mix of upper/lower case formats and fonts • Violates “Consistency and standards” (H4) • Slows users down • Probably wouldn’t be found by user testing • Fix: pick a single format for entire interface Adapted from slide by James Landay

  23. Severity ratings • Used to allocate resources to fix problems • Should be assigned after all evaluations are done • Should be done independently by all evaluators • Based on • Frequency the problem will occur • Impact of the problem (hard or easy to overcome) • Persistence (will users learn a workaround or will they be bothered every time?) • 1 – cosmetic problem • 2 – minor usability problem • 3 – major usability problem; important to fix • 4 – usability catastrophe; must fix • For example, an inconsistent label that confuses users on every visit but is easy to overcome might rate a 2, while a dialog that silently loses the user's work rates a 4

  24. Heuristic Evaluation Issue Log • Columns: Heuristic, Issue, Severity, Description • Example: Heuristic “Consistency”, Severity 3, Description: The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function • Example: Heuristic “Error Message”, Severity 4, Description: Entering invalid input into the dialog box on the first form results in “Error 103” • …

  25. Exercise • Evaluate an application using heuristic evaluation • Use your computer or share with a neighbor • Java Required: http://www.math.uaa.alaska.edu/~afkjm/GeoPixelCounter/ • Program to compute % of a mineral in a thin section of rock • Refer back to slide with the 10 heuristics • Fill out issues found on paper • We will discuss and debrief after you are done

  26. Heuristic Evaluation Issue Log • Columns: Heuristic, Issue, Severity, Description

  27. Theory – Code Inspections

  28. Why inspections? • Inspections can be applied to many different things by many different groups • Inspections are a “Best Known Method” (BKM) for increasing quality • Developed by Michael Fagan at IBM, paper published 1976 • Estimates: Inspections of design and code usually remove 50-90% of defects before testing • Very economical compared to testing • Formal inspections are more productive than informal reviews

  29. Formal Inspections • By formalizing the process, inspections become systematic and repeatable • Each person in the inspection process must understand their role • Checklists focus attention on detecting defects that have historically been problematic (a sample checklist follows) • Metrics • Feedback and data collection are quantifiable • Metrics feed into future inspections to improve them • Designers and developers learn to improve their work through inspection participation
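  A few sample checklist items, drawn from the defect categories on slide 11 (illustrative only, not an official checklist):

    • Are all variables initialized before use?
    • Are return values of I/O and allocation calls checked?
    • Are there magic numbers that should be named constants?
    • Do identifiers describe their purpose?
    • Are buffer sizes consistent with the data written into them?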

  30. More reasons to use inspections • Inspections are measurable • Ability to track progress • Reduces rework and debug time • Cannot guarantee that a deadline will be met but can give early warning of impending problems • Information sharing with other developers, testers

  31. Definition • What is an inspection? • A formal review of a work product by peers. A standard process is followed with the purpose of detecting defects early in the development lifecycle. • Examples of work products • Code, Specs, Web Pages • Presentations, Guides, Requirements • Specifications, Documentation

  32. When are inspections used? • Possible anytime code or documents are complete • Requirements: Inspect specs, plans, schedules • Design: Inspect architecture, design doc • Implementation: Inspect technical code • Test: Inspect test procedure, test report

  33. Defects • Inspections are used to find defects • A defect is a deviation from specified or expected behavior: something wrong, missing information, a common error, a standards violation, ambiguity, inconsistency, a perception error, or a design error

  34. Other Review Methods

  35. Other Defect Detection Methods

  36. Why a formal review? • Provides a well-defined process • Repeatability, measurement • Avoids some scenarios that arise with less formal processes • “My work is perfect” • The point is not to criticize the author • “I don't have time” • The formal process proceeds only when everyone is prepared and has inspected the code in advance

  37. Walkthrough vs. Inspection

  38. Typical Inspection Process • We are using a shortened process in class, but it is essentially the same as the “normal” process • Planning (45 mins) → Preparation (15-120 mins) → Log Defects (60-120 mins) → Causal Analysis and Rework → Follow-Up

  39. Roles • Moderator • Inspectors • Work Owner • Scribe

  40. Causal Analysis Meeting • Purpose: a brainstorming session on the root cause of specific defects • Takes place sometime after the inspection has been completed • This meeting supports continuous improvement • Initiates thinking and action about the most common or severe defects • Can help prevent future defects from occurring • Specific action items may be assigned to achieve this goal

  41. Rework • Purpose: Address defects found during the logging process • Rules • Performed by product owner • All defects must be addressed • Does not mean they are fixed, but that sufficient analysis/action has taken place • All defects found in any other documents should be recorded • Owner should keep work log
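  As an illustration, here is one way the owner might rework the Authorized() example from slide 14 to address the defects logged earlier. This is a sketch, not the official solution: the OPSPATH value and the fail-closed behavior when the file cannot be opened are assumptions.

    #define _GNU_SOURCE   /* for FNM_CASEFOLD, a GNU extension */
    #include <stdio.h>
    #include <string.h>
    #include <fnmatch.h>

    #define OPSPATH  "/etc/ops.list"   /* assumed; the original defines OPSPATH elsewhere */
    #define LINE_LEN 80                /* named constant replaces the magic number */

    /* Returns 1 if user matches a pattern on the ops list, 0 otherwise
       (including when the list cannot be opened: fail closed). */
    int Authorized(const char *user)
    {
        char line[LINE_LEN];           /* declared locally; was an undeclared tempstr */
        FILE *f = fopen(OPSPATH, "r");

        if (f == NULL)                 /* the original never checked fopen() */
            return 0;

        while (fgets(line, sizeof(line), f) != NULL) {
            size_t len = strlen(line);
            while (len > 0 && (line[len-1] == '\n' || line[len-1] == '\r'))
                line[--len] = '\0';    /* strip newline/CR only when present */
            if (fnmatch(line, user, FNM_CASEFOLD) == 0) {
                fclose(f);
                return 1;
            }
        }
        fclose(f);
        return 0;
    }

  Note that this still uses fnmatch() wildcards rather than true regular expressions; whether the requirement's wording or the implementation should change is exactly the kind of question rework has to resolve.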

  42. Follow-Up • Purpose: Verify resolution of defects • Work product redistributed for review • Inspection team can re-inspect or assign a few inspectors to review • Unfixed defects are reported to the team and discussed to resolution • We’re skipping these last few phases for the class • I would like to see how you addressed defects in your final writeup

  43. Theory - Heuristic Evaluation Adapted from material by Marti Hearst, Loren Terveen

  44. Evaluating UI Designs • Usability testing is a major technique • Formal techniques require users, controlled experiments, and statistical analysis • “Discount” methods don't require users • Heuristic Evaluation • Cognitive Walkthrough

  45. Heuristic Evaluation • Developed by Jakob Nielsen • Helps find usability problems in a UI design • Small set (3-5) of evaluators examine UI • independently check for compliance with usability principles (“heuristics”) • different evaluators will find different problems • evaluators only communicate afterwards • findings are then aggregated • Can perform on working UI or on sketches

  46. Phases of Heuristic Evaluation 1) Pre-evaluation training • give evaluators needed domain knowledge and information on the scenarios 2) Evaluation • individuals evaluate and then aggregate results 3) Severity rating • determine how severe each problem is (priority) 4) Debriefing • discuss the outcome with design team Adapted from slide by James Landay

  47. Jakob Nielsen’s heuristics

  48. Pros / Cons • + Cheap (no special lab or equipment) • + Easy • + Fast (about 1 day) • + Cost-effective • + Detects many problems without users • + Complementary to task-centered approaches • + Coverage • + Catches cross-task interactions • - Requires subjective interpretation • - Does not specify how to fix problems • - Results depend on evaluator knowledge (performance improves as it increases)

  49. Procedure • A set of evaluators (3-5 is about optimal) evaluates a UI (some training may be needed) • Each one independently checks for compliance with the heuristics • Different evaluators find different problems • Individually rate the severity of the problems • Evaluators then get together and merge their findings • Debriefing/brainstorming: how to fix the problems (and point out what's really good)
