1. Software Process and Project Metrics (November 2, 1997). Outline:
2. Measure, Metrics, and Indicator. Measure: provides a quantitative indication of the extent, amount, dimensions, capacity, or size of some product or process attribute. People sometimes talk about the goal of metrics being to improve the quality of a software product; that is misleading. Metrics provide information with which one may be able to improve the process.
Example metrics: SLOC, number of errors found per day in test, number of errors occurring when the product is in the field, and SLOC produced per year by a programmer.
Often, metrics are combined with other information to help improve the process. For example, finding that teams that used formal design reviews had fewer errors in test might lead one to require formal design reviews.
Recall the example of deciding how many new GPCs to use: we looked at the metric of the number of errors found per OI, together with the level of testing done.
3. In the Process and Project Domains: Process Indicator
4. Process Metrics and Software Process Improvement
5. Measurement: What to measure?
6. Privacy Issues: Should metrics be used for personnel evaluation?
7. Use of Software Metrics: Use common sense and organizational sensitivity.
8. Typical Causes of Product Defects
9. Example of Defect Analysis
10. Project Metrics: Software Project Measures Are Tactical
used by a project manager and a software team
to adapt project work flow and technical activities
11. Software Metrics: Direct measures
Cost and effort applied (in the software engineering process)
Lines of code (LOC) produced
Execution speed
CPU utilization
Memory size
Defects reported over a certain period of time
12. Software Measurement: Size-Oriented Metrics
13. Typical Size-Oriented Metrics: Errors per KLOC
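Size-oriented metrics normalize other project measures by program size in KLOC (thousands of lines of code). A minimal sketch of the idea; the project figures below are illustrative assumptions, not data from the lecture:

```python
# Size-oriented metrics: normalize raw measures by size in KLOC.
# All project figures below are illustrative assumptions.
loc = 12_100        # lines of code delivered
errors = 134        # errors found before delivery
defects = 29        # defects found after delivery
cost = 168_000      # project cost in dollars

kloc = loc / 1000.0
errors_per_kloc = errors / kloc
defects_per_kloc = defects / kloc
cost_per_kloc = cost / kloc

print(f"{errors_per_kloc:.2f} errors/KLOC, "
      f"{defects_per_kloc:.2f} defects/KLOC, "
      f"${cost_per_kloc:,.0f}/KLOC")
```

Because they depend on the programming language and counting conventions, such LOC-normalized numbers are most meaningful when compared across projects within one organization.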
14. Software Measurement: Function-Oriented Metrics
15. Function Point Calculation
16. Function Point Calculation (Cont’d)
17. Function-Oriented Metrics: FP = count_total * [0.65 + 0.01 * sum of Fi]
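In this formula, count_total is a weighted sum of the counted information-domain items (inputs, outputs, inquiries, internal files, external interfaces), and the fourteen Fi are value adjustment factors, each rated 0 to 5. A minimal sketch using the standard "average" complexity weights; the counts and ratings below are illustrative assumptions:

```python
# Function point calculation: FP = count_total * (0.65 + 0.01 * sum(Fi)).
# Weights are the standard "average" complexity weights for each
# information-domain type; counts and Fi ratings are illustrative.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def function_points(counts, value_adjustment_factors):
    count_total = sum(counts[k] * w for k, w in AVERAGE_WEIGHTS.items())
    assert len(value_adjustment_factors) == 14  # fourteen Fi, each 0..5
    return count_total * (0.65 + 0.01 * sum(value_adjustment_factors))

counts = {
    "external_inputs": 24,
    "external_outputs": 16,
    "external_inquiries": 22,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}
fi = [3] * 14  # every adjustment factor rated "average" (3)
fp = function_points(counts, fi)
print(f"FP = {fp:.2f}")
```

With all Fi rated 3, the adjustment multiplier is 0.65 + 0.42 = 1.07, so the adjusted count is 7% above the unadjusted total.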
18. Function Point Extensions: Function Points emphasize the “data dimension”
19. 3-D Function Point Calculation
20. Reconciling Different Metrics
21. Metrics for Software Productivity: LOC and FP Measures Are Often Used to Derive Productivity Metrics
22. Measures of Software Quality: Correctness
23. Measures of Software Quality (Cont’d): Integrity
24. Defect Removal Efficiency: A Quality Metric That Provides Benefit at Both the Project and Process Level
DRE = E / ( E + D )
E = # of errors found before delivery of the software to the end user
D = # of defects found after delivery
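The definition above translates directly into code; the error and defect counts below are illustrative assumptions:

```python
# Defect Removal Efficiency, per the definition above:
#   DRE = E / (E + D)
# E = errors found before delivery, D = defects found after delivery.
def defect_removal_efficiency(e: int, d: int) -> float:
    return e / (e + d)

# Illustrative counts (assumed, not from the slides):
dre = defect_removal_efficiency(242, 8)
print(f"DRE = {dre:.3f}")  # an ideal process would approach DRE = 1.0
```

A low DRE at the project level signals that pre-delivery filtering activities (reviews, testing) are letting defects through, which is exactly the kind of process feedback the slide's title promises.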
25. Summary View
26. Summary: Metrics are a tool that can be used to improve the productivity and quality of a software system.
27. METRICS: CLCS Metrics Philosophy
Phase 1: Provide a mandatory, nearly automated, metrics foundation to track lines of code and errors.
Phase 2: Provide additional high-return metrics with recognized value.
Schedule metrics (milestones)
Additional S/W Problem metrics (actuals, trends, prediction)
Defect correction metrics
Run-time analysis metrics (McCabe tools, automated, COTS)
Phase 3: Be driven to additional metrics only by absolute need.
28. METRICS