
Overview


Presentation Transcript


  1. Overview • Critical concepts • Value of an accurate estimate • Where does estimation error come from? • Estimation techniques • Count, compute, judge • Structured expert judgment • Estimation by analogy • Wisdom

  2. Estimating vs planning • Estimation - an unbiased, analytical process • The goal is accuracy; the goal is not to seek a particular result • Planning - a biased, goal-seeking process • The goal of planning is to seek a particular result • If estimate != plan, the project has to account for that gap as a risk

  3. Value of accuracy • Underestimate • Bad stuff happens: the project overruns its schedule and budget • Overestimate • Parkinson's Law will kick in: work expands to fill the available time.

  4. Industry data

  5. Sources of error: The Cone of Uncertainty

  6. It's not automatic: the Cone narrows only if the project actively removes sources of variability.

  7. Sources of error: Chaotic development process • Requirements that weren't investigated very well in the first place • Lack of end-user involvement in requirements validation • Poor designs and code • Abandoning planning under pressure • Developer gold-plating • Lack of automated source code control

  8. Sources of error: Omitted activities • One study found that developers estimated fairly accurately the work they remembered to estimate, but they tended to overlook 20% to 30% of the necessary tasks, which led to a 20% to 30% estimation error (van Genuchten 1991). • Omitted work falls into three general categories: • missing requirements • missing software-development activities • missing non-software-development activities.

  9. Sources of error: Optimism • In a study of 300 software projects, Michiel van Genuchten reported that developer estimates tended to contain an optimism factor of 20% to 30% • Applies to management as well (U.S. Department of Defense study)
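
The omission and optimism findings above suggest a simple correction that can be applied to a raw bottom-up estimate. The Python sketch below is illustrative only: the function name and the 25% rates are assumptions drawn from the 20% to 30% ranges cited in these slides, not figures from the deck.

# Illustrative sketch: fold historically observed omitted work and optimism
# into a raw bottom-up estimate. The 0.25 rates are assumptions taken from
# the 20-30% ranges cited above, not calibrated values.

def adjusted_estimate(raw_staff_days, omitted_work_rate=0.25, optimism_factor=0.25):
    """Inflate a raw estimate for work that history says gets overlooked and
    for the typical optimism bias in the per-task numbers."""
    covered_share = 1.0 - omitted_work_rate   # the raw estimate covers only this share of the real work
    return raw_staff_days * (1.0 + optimism_factor) / covered_share

raw = 100.0                                   # sum of the tasks the team remembered to estimate
print(f"Raw estimate: {raw:.0f} staff-days")
print(f"Adjusted estimate: {adjusted_estimate(raw):.0f} staff-days")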

  10. Sources of error: Off-the-cuff estimates • Avoiding off-the-cuff estimates is one of the most important points in this book. • Don't give off-the-cuff estimates. Even a 15-minute estimate will be more accurate.

  11. Sources of error: Others • Unfamiliar business area • Unfamiliar technology area • Overstated savings from new development tools or methods • Budgeting processes that undermine effective estimation (especially those that require final budget approval in the wide part of the Cone of Uncertainty)

  12. Estimation technique: Count, compute, judge • If you can count the answer directly, you should do that first. • If you can't count the answer directly, you should count something else and then compute the answer by using some sort of calibration data. • Use judgment alone only as a last resort, when you can neither count nor compute (a worked example follows the counting tips on slide 14).

  13. What to count • Early in the development life cycle, you can count marketing requirements, features, use cases, and stories, among other things. • In the middle of the project, you can count at a finer level of granularity—engineering requirements, Function Points, change requests, Web pages, reports, dialog boxes, screens, and database tables, just to name a few. • Late in the project, you can count at an even finer level of detail—code already written, defects reported, classes, and tasks

  14. Counting Tips • Find something to count that’s available sooner rather than later in the development cycle • Find something you can count with minimal effort • Don’t discount the power of simple, coarse estimation models such as average effort per defect, average effort per Web page, average effort per story, and average effort per use case.
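
Slides 12 through 14 describe the count-then-compute approach. A minimal Python sketch of how that might look in practice is below; the story count, the effort-per-story figure, and the variable names are assumptions for illustration, not data from the deck.

# "Count, then compute" sketch: count an artifact that is available early
# (user stories here) and compute effort from historical calibration data.
# All numbers below are illustrative assumptions.

effort_per_story = 3.5     # staff-days per story, from past projects (assumed)
counted_stories = 140      # what can be counted directly today

effort_estimate = counted_stories * effort_per_story
print(f"{counted_stories} stories x {effort_per_story} staff-days/story "
      f"= {effort_estimate:.0f} staff-days")

The same pattern works for any of the coarse models named on slide 14: swap in average effort per defect, per Web page, or per use case and count the corresponding artifact.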

  15. Estimation technique 2: Structured expert judgment • By far the most common estimation approach used in practice • To create the task-level estimates, have the people who will actually do the work create the estimates.

  16. Work breakdown • Is the estimate broken down into enough detail to expose hidden work? • Decompose estimates into tasks that will require no more than about 2 days of effort • Decompose large estimates into small pieces so that you can take advantage of the Law of Large Numbers: the errors on the high side and the errors on the low side cancel each other out to some degree.
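
The Law of Large Numbers claim on slide 16 can be illustrated with a small simulation. This is a sketch under a simplified assumption (each task's actual effort is its estimate times an independent random factor between 0.6x and 1.4x); it is not a model of real estimation error, only a demonstration that independent errors partially cancel as the work is decomposed.

# Sketch: why decomposing a project into many small tasks reduces the error
# of the total. Assumption (illustrative): each task's actual effort is its
# estimate times an independent uniform factor in [0.6, 1.4].

import random

random.seed(1)

def average_total_error(num_tasks, trials=10_000):
    """Average absolute error of the project total when the 100-unit estimate
    is split into num_tasks equally sized, independently mis-estimated tasks."""
    task_estimate = 100.0 / num_tasks
    total_error = 0.0
    for _ in range(trials):
        actual = sum(task_estimate * random.uniform(0.6, 1.4) for _ in range(num_tasks))
        total_error += abs(actual - 100.0) / 100.0
    return total_error / trials

for n in (1, 10, 100):
    print(f"{n:>3} tasks -> average error of the total ~ {average_total_error(n):.0%}")

With one task the total is off by about 20% on average; with 100 independently estimated tasks the high and low errors largely cancel and the error of the total drops to a few percent.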

  17. Does the estimate include all the functionality areas needed to complete the task?

  18. Best case vs. worst case • Is the worst case really the worst case? Does it need to be made even worse?
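
One common way to combine the best case and worst case from slide 18 into a single expected case is a PERT-style weighted average. The 1-4-1 weighting below is a widely used convention, offered here as a sketch; it is not necessarily the formula this deck has in mind, and the task numbers are hypothetical.

# PERT-style expected case from best / most-likely / worst case estimates.
# The (best + 4*most_likely + worst) / 6 weighting is a common convention.

def expected_case(best, most_likely, worst):
    return (best + 4.0 * most_likely + worst) / 6.0

# Hypothetical task: best case 3 days, most likely 5 days, worst case 12 days.
print(f"Expected case: {expected_case(3, 5, 12):.1f} days")   # pulled toward the worst case

If the "worst case" is not really the worst case, as the slide warns, an expected case computed this way will come out too low, which is one reason to ask whether the worst case needs to be made even worse.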

  19. COCOMO II rating factors for Effort Multipliers (EMs).
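
For context on where the rating factors on slide 19 fit: in COCOMO II, each Effort Multiplier scales the nominal effort equation. The sketch below uses the published nominal coefficients (A = 2.94, B = 0.91); the size, scale-factor ratings, and multiplier values are placeholders, not calibrated project data.

# COCOMO II nominal effort equation:
#   Effort (person-months) = A * Size^E * product(EMs), with
#   E = B + 0.01 * sum(scale factors).
# A and B are the published nominal coefficients; all sample inputs are placeholders.

from math import prod

A, B = 2.94, 0.91

def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers):
    exponent = B + 0.01 * sum(scale_factors)
    return A * ksloc ** exponent * prod(effort_multipliers)

sample_sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # five scale-factor ratings (placeholders)
sample_em = [1.10, 0.87, 1.00, 1.15]         # a few of the 17 EMs; 1.00 = nominal rating
print(f"Estimated effort: {cocomo_ii_effort(50, sample_sf, sample_em):.0f} person-months")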

  20. Expressing uncertainty • The key issue in presenting an estimate is documenting its uncertainty in a way that communicates that uncertainty clearly and maximizes the chances that the estimate will be used constructively and appropriately.
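
One way to act on the point in slide 20 is to present the estimate as a range with an explicit confidence statement rather than a single number. The percentile labels, numbers, and wording below are illustrative choices, not a format prescribed by the deck.

# Sketch: express an estimate as a range with explicit confidence rather than
# a single point value. Percentiles, numbers, and wording are illustrative.

def present_estimate(p50, p90, unit="staff-months"):
    return (f"We estimate {p50:.0f} {unit} (50% confidence) and are 90% confident "
            f"the project will not exceed {p90:.0f} {unit}.")

print(present_estimate(p50=14, p90=20))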
