
16.842/16.355 Class 3 Notes



  1. 16.842/16.355 Class 3 Notes

  2. Factors That Influence Process

  3. Waterfall Model • Ideal or generic process that needs to be tailored for a specific project • Often used as a strawman, maligned, and mischaracterized (e.g., by omitting all the feedback loops)

  4. Features of the Waterfall Model • A phased development process with clearly identified milestones • Deliverables and baselines at each phase; results of each phase “frozen” (but can be changed later if necessary) • Reviews at each phase • Document-driven process (really deliverable-driven, but deliverables in early phases are often documents) • “Big Bang” testing vs. stubs vs. daily build and smoke test • “A Rational Design Process and How to Fake It” (Parnas and Clements) • Strict sequencing between activities not usually obeyed
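The “daily build and smoke test” alternative to “Big Bang” testing can be sketched in a few lines. This is an illustrative example only; the library-style functions and data below are hypothetical, not from the course material:

```python
# Hypothetical sketch of "daily build and smoke test": after each build,
# run a few fast checks that exercise the main paths end to end, instead
# of deferring everything to one "big bang" integration test at the end.

def checkout(inventory: dict, title: str) -> dict:
    """Remove one copy of a title from the inventory (illustrative)."""
    if inventory.get(title, 0) <= 0:
        raise ValueError(f"no copies of {title!r} available")
    inventory[title] -= 1
    return inventory

def smoke_test() -> str:
    # Happy path: a checkout decrements the count.
    inv = checkout({"SICP": 2}, "SICP")
    assert inv["SICP"] == 1
    # Error path: checking out from empty stock fails loudly.
    try:
        checkout({"SICP": 0}, "SICP")
    except ValueError:
        pass
    else:
        raise AssertionError("expected empty-stock checkout to fail")
    return "smoke test passed"

if __name__ == "__main__":
    print(smoke_test())
```

The point is not the tests themselves but the cadence: a small, always-green check run against every build keeps integration problems from accumulating until the end of the project.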

  5. Feasibility Study (used to be called System Analysis) • Definition of problem • Alternate solutions and expected benefits • Required resources, costs, delivery dates for each proposed solution • Economic feasibility and technical feasibility (can hardware and software deliver the performance required?) • Risk identification

  6. Requirements: • What, not how • Contract with customer • Used by engineers to develop solutions • Careful analyses: the goal is to prevent putting too much effort into constructing a system that does not satisfy the user’s requirements

  7. Requirements • Need to consider project goals • Minimize development costs? • Minimize development time? • Maximize quality? • Realism important: Faster, Better, Cheaper (choose two) • Separate essential features from nice-to-have features • Try to predict possible future requirements (product “families”) • Identify the environment and interactions with the environment

  8. “Coding” is a very small part of any development effort (about 7%) • Design (architecture) is important but often rushed through to get to coding • Testing is usually about 50% of the development effort

  9. “V” Model

  10. [Slide: 16.842 Fundamentals of Systems Engineering course topics mapped onto the “V-Model”: systems engineering overview, stakeholder analysis, requirements definition, system architecture, concept generation, tradespace exploration, concept selection, design definition, multidisciplinary optimization, interface management, system integration, verification and validation, commissioning, operations, lifecycle management, human factors, and system safety]

  11. “V” Model / Waterfall Model • Phases are delineated by review processes

  12. Air Force

  13. NASA Program & Project Life Cycles [Slide: NASA life-cycle chart]
  • NASA life-cycle phases: Formulation (Pre-Systems Acquisition), then approval for Implementation; Implementation covers Systems Acquisition, Operations, and Decommissioning
  • Project life-cycle phases: Pre-Phase A (Concept Studies), Phase A (Concept & Technology Development), Phase B (Preliminary Design & Technology Completion), Phase C (Final Design & Fabrication), Phase D (System Assembly, Integration & Test, Launch), Phase E (Operations & Sustainment), Phase F (Closeout)
  • Gates and major events: FAD, draft project requirements, preliminary project plan, Key Decision Points (KDP A through KDP F), baseline project plan, launch, end of mission, final archival of data
  • Human spaceflight project reviews: MCR, SRR, SDR, PDR, CDR/PRR, SIR, SAR, ORR, FRR, PLAR, CERR, PFAR, DR, plus agency reviews (ASM) and (PNAR)/(NAR); re-flights undergo inspections and refurbishment and re-enter the appropriate life-cycle phase if modifications are needed between flights; end of flight leads to closeout
  • Robotic mission project reviews: MCR, SRR, MDR, PDR, CDR/PRR, SIR, ORR, FRR, PLAR, CERR, DR, plus (ASP) and (PNAR)/(NAR); launch readiness reviews include SMSR, LRR (LV), and FRR (LV)
  • Supporting reviews: peer reviews, subsystem PDRs, subsystem CDRs, and system reviews
  • Footnotes: flexibility is allowed in the timing, number, and content of reviews as long as the equivalent information is provided at each KDP and the approach is fully documented in the Project Plan; these reviews are conducted by the project for the independent SRB; a PRR is needed for multiple (≥4) system copies; timing is notional; CERRs are established at the discretion of Program Offices; for robotic missions, the SRR and the MDR may be combined; the ASP and ASM are Agency reviews, not life-cycle reviews; re-flight handling includes recertification, as required; Project Plans are baselined at KDP C and are reviewed and updated as required, to ensure project content, cost, and budget remain consistent
  • Acronyms: ASP—Acquisition Strategy Planning Meeting; ASM—Acquisition Strategy Meeting; CDR—Critical Design Review; CERR—Critical Events Readiness Review; DR—Decommissioning Review; FAD—Formulation Authorization Document; FRR—Flight Readiness Review; KDP—Key Decision Point; LRR—Launch Readiness Review; MCR—Mission Concept Review; MDR—Mission Definition Review; NAR—Non-Advocate Review; ORR—Operational Readiness Review; PDR—Preliminary Design Review; PFAR—Post-Flight Assessment Review; PLAR—Post-Launch Assessment Review; PNAR—Preliminary Non-Advocate Review; PRR—Production Readiness Review; SAR—System Acceptance Review; SDR—System Definition Review; SIR—System Integration Review; SMSR—Safety and Mission Success Review; SRR—System Requirements Review

  14. NASA

  15. Evolutionary Model • Prototyping: “Do it twice” • To assess feasibility • To verify requirements • May be only a front end or an executable specification (“very high level” languages or a front end for the user interface) • Or develop a system with less functionality or weaker quality attributes (speed, robustness, etc.)
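A throwaway front-end prototype of the kind described above can be very small. This sketch is purely illustrative (the catalogue data and function names are hypothetical): just enough behavior for users to react to, with no robustness or performance work invested.

```python
# Hypothetical throwaway prototype: a mock-up of a library search built
# only to verify requirements with users before the real system is
# designed. Deliberately minimal: no index, no persistence, no error
# handling -- reduced functionality and quality attributes by design.

CATALOGUE = {
    "Software Engineering": "available",
    "Systems Engineering": "checked out",
}

def prototype_search(query: str) -> list[str]:
    """Return matching titles with their status, for users to evaluate."""
    q = query.lower()
    return [f"{title} ({status})"
            for title, status in CATALOGUE.items()
            if q in title.lower()]

if __name__ == "__main__":
    for line in prototype_search("engineering"):
        print(line)
```

Because it is cheap to build and cheap to discard, a prototype like this can answer the “is this what you wanted?” question before serious design effort is committed.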

  16. Differences with Hardware Prototyping?

  17. Evolutionary Model (2) • Three approaches: • Use prototyping as a tool for requirements analysis • Use it to accommodate design uncertainty, where the prototype evolves into the final product • Documentation may be sacrificed • May be less robust • Quality defects may cause problems later (during operations and maintenance); incorporating quality after the system is built is impossible or extremely costly • Use it to experiment with proposed solutions before large investments are made

  18. Evolutionary Model (3) • Drawbacks • Can be expensive to build • Can develop a life of its own and turn out to be the product itself • Hard to change basic decisions made early • Can be an excuse for poor programming practices

  19. Experimental Evaluation • Boehm: prototyping vs. waterfall for software • Waterfall: • Addressed product and process control risks better • Resulted in more robust product, easier to maintain • Fewer problems in debugging and integration due to more thought-out design • Prototyping • Addressed user interfaces better • Alavi: prototyping vs. waterfall for an information system • Prototyping: users more positive and more involved • Waterfall: more robust and efficient data structures

  20. Incremental Model [Diagram: three successive increments, each cycling through Communication → Planning → Modeling → Construction → Deployment, delivering Increment #1, Increment #2, and Increment #3]

  21. Incremental Model • Functionality produced and delivered in small increments • Focus attention first on essential features and add functionality only if and when needed • Systems tend to be leaner; fights the overfunctionality syndrome • May be hard to add features later • CLCS (tried to add fault tolerance later) • Need to be careful about what is put off; requires very complex analysis and deep knowledge to do this right • Does this work for large, complex systems?

  22. Incremental Model Variant • Incremental implementation only • Follow waterfall down to implementation • During requirements analysis and system design: • Define useful subsets that can be delivered • Define interfaces that allow adding features later smoothly • Different parts implemented, tested, and delivered according to different priorities and at different times
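“Define interfaces that allow adding features later smoothly” can be illustrated with a small sketch. The class and function names here are hypothetical, chosen only to show the idea: callers depend on a fixed interface, so later increments slot in without changing them.

```python
# Sketch: fix the interface during design, then deliver implementations
# in separate increments behind it. All names are illustrative.
from abc import ABC, abstractmethod

class SearchBackend(ABC):
    """Interface agreed on during system design, before implementation."""
    @abstractmethod
    def find(self, query: str) -> list[str]: ...

class LinearSearch(SearchBackend):
    """Increment 1: a simple, obviously-correct implementation."""
    def __init__(self, titles):
        self.titles = list(titles)

    def find(self, query):
        q = query.lower()
        return [t for t in self.titles if q in t.lower()]

class IndexedSearch(SearchBackend):
    """Increment 2: delivered later, behind the same interface."""
    def __init__(self, titles):
        self.index = {}
        for t in titles:
            for word in t.lower().split():
                self.index.setdefault(word, []).append(t)

    def find(self, query):
        return self.index.get(query.lower(), [])

def run_query(backend: SearchBackend, query: str) -> list[str]:
    # Client code written against the interface in increment 1 keeps
    # working unchanged when increment 2 replaces the backend.
    return backend.find(query)
```

The up-front cost is deciding the interface before any implementation exists; the payoff is that later increments are additions, not rewrites.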

  23. Spiral Model • Includes every other model • Risk-driven (vs. document-driven or increment-driven) • Radius of spiral represents cost accumulated so far

  24. Considerations • Do you need one uniform process over the whole project? • E.g., in requirements analysis, identify the aspects that are uncertain • Library system: checkout and check-in (inventory control) are relatively certain; the card catalogue and user search are relatively uncertain • Then use different processes for the separate parts

  25. Software Factory • Most software organizations strictly separate initial development from later maintenance • No incentive to produce a system that can be easily maintained • No incentive to produce reusable components • Project management vs. product management • Extend management responsibility to cover a family of products rather than an individual product (product families)

  26. What is CMMI? Consultant Money Making Initiative

  27. Critique of CMMI “The projects most worth doing are the ones that will move you DOWN one full level on your process scale” (Peopleware) [3]

  28. CMM (Capability Maturity Model) • Comes from Taylorism (“scientific management”, statistical quality control) • Maximize efficiency by rational planning procedures • Assembly lines, time and motion studies (Henry Ford) • Behavior of individual controlled by normative rules and preplanned by system design • Projects achieve control over their products and processes by narrowing the variation in the process performance to fall within acceptable quantitative boundaries

  29. CMM (Capability Maturity Model) (2) • Despite the rhetoric, does not follow TQM • TQM emphasizes flexibility and learning • A learning orientation seeks to increase variation in order to explore opportunities • CMM focuses on narrowing variation • Formal bureaucratic control undermines the intrinsic motivation needed for creative and flexible responses to uncertainty • Senge: humanistic values of caring and individual freedom are essential to building learning organizations • Carroll: “In too many TQM programs, it is the difficult-to-implement portions of the program that are being finessed or ignored and the rhetoric that is being retained.”

  30. CMM Criticisms • Treats people as assembly line workers, i.e., replaceable, unreliable • Humans are subordinated to defined processes • Why emphasis on “repeatable”? • Why five levels? Why a rigid order? • Peer reviews at level 3? Defect prevention at level 5? • Creates inflexible organizations and the illusion of control? • Places focus on the wrong things • Does this just ensure mediocrity? • Great software/hardware projects have done the opposite • Skunkworks, Apple (Mac), some Microsoft products

  31. CMM Criticisms (2) • Bollinger: • Process improvement is more than simply adding good measurement and controls to a project • Must address the much deeper and uniquely difficult issue of how to distribute creative, intelligent problem-solving across a group of heterogeneous individuals • Make group IQ improvement a major issue • Group stupefaction results when process metrics are structured around repetition • Use people for problem solving; don’t try to turn people into machines

  32. More • Experimental validation? • Industry experience?

  33. CMMI vs. Agile, XP, etc. [Diagram: contrasting relative emphasis on People, Process, and Technology under CMMI vs. Agile/XP]

  34. Readings?
