
Paul Byrnes, Principal and CEO, Integrated System Diagnostics




Presentation Transcript


  1. Lessons Learned Performing a Very Large Multi-Model High Maturity Appraisal. Paul Byrnes, Principal and CEO, Integrated System Diagnostics

  2. Background and Problem Statement
  Problem Statement: CMMI V1.3 appraisals are considered “too expensive”
  • Too much money achieving benchmark certifications
  • Too much effort conducting appraisals with overlapping content
  • Organizations tend to treat each model or framework as an independent workstream
  • Organizations treat each appraisal event as an independent end item and “regenerate” data for each event
  • Decomposition of certification tasks across frameworks artificially fragments integrated activities
  • Process groups are expected to do more with less
  ISD experience:
  • 5 HM appraisal events over the last 3 years
  • 45+ Class B and Class A equivalent HM appraisals performed since 1996
  • Roughly one third (11 of 27) of the organizations named in the 2007 SEI CMMI performance data (HM organizations that provided performance data) are or were ISD clients: http://www.sei.cmu.edu/cmmi/results/results-by-organization.html
  [Diagram labels: Systems Engineering, Supplier Management, Software Engineering, Service Delivery]

  3. Desired multi-model outcomes: effort reduced, quality higher, utility better.
  “Old” effort profile: models are treated independently, so the same effort is repeated during every disparate model event. Wash, rinse, repeat, with the same evidence used repetitively across Benchmark Model 1, Benchmark Model 2, and Benchmark Model 3.
  Multi-model approach: leverage effort and data across models through common mapping.

  4. Alternative Solution
  • Use multi-model events to save time and money and increase utility. Outcomes are significantly more efficient, and organizational understanding improves.
  • Summary of approach:
  • Reuse data throughout the appraisal lifecycle
  • Base evidence collection on internal organizational processes, not model practices
  • Use mapping between models, and with your own process, to manage
  • Perform integrated events
  • Use evidence “threads” to present the data; data are presented primarily by “topics”
  Multi-model implementation greatly increases efficiency, reducing overall effort and cost relative to independent events.

  5. CMMI DEV and SVC V1.3 Process Areas. Double the process areas is not double the evidence!
  Level 5, Optimizing (focus: Continuous Process Improvement). Core: Organizational Performance Management; Causal Analysis and Resolution
  Level 4, Quantitatively Managed (focus: Quantitative Management). Core: Organizational Process Performance; Quantitative Project/Work Management
  Level 3, Defined (focus: Process Standardization)
  • DEV-specific: Requirements Development; Technical Solution; Product Integration; Verification; Validation
  • SVC-specific: Service System Development; Capacity and Availability Management; Incident Resolution and Prevention; Service System Transition; Service Continuity; Strategic Service Management
  • Core: Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project/Work Management; Risk Management; Decision Analysis and Resolution
  Level 2, Managed (focus: Basic Project Management)
  • Core: Requirements Management; Project/Work Planning; Project/Work Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
  • SVC-specific: Service Delivery
  Level 1, Initial

  6. Most Models Have Overlapping Content
  • Most standards/models have content overlap, often based on Total Quality Management (TQM) and Deming’s plan-do-check-act principles
  • Some core topics show up in most models
  • Each industry standard/model has a “sweet spot” or particular area of focus. For example:
  • CMMI-DEV is particularly focused on systems development and maintenance
  • eSCM-SP is focused on IT-enabled sourcing
  • COPC is focused on customer care
  • ITIL is focused on IT Service Management
  [Diagram: standards arranged around a Quality Management System core, spanning Systems Integration and Outsourcing & BPO: ISO 9000 (process standards and management), eSCM-SP (IT-enabled sourcing), CMMI DEV (systems development/maintenance), ITIL (IT Service Management), Six Sigma (data-driven decision making/improvement). Source: Accenture. Used with permission]

  7. Integrated System Framework (ISF™)
  ISF is an operational example of how model, standard, and framework content overlap in consistent and manageable ways. ISF provides a “meta” structure that maps content from many global standards into a common architecture. In the original diagram, the “green” titles are “Classes” and the boxes are “Categories.”
  [Diagram categories: Market & Competitors; Organizational Leadership, Culture, Strategic Alignment; Human Capital; Infrastructure & Technology; Measurement, Innovation and Improvement; P&S Execution; P&S Sustainment; Support; Management; Suppliers; Customers (pre-contract and post-contract); Products & Services; Results (performance)]

  8. Multi-Model Mapping works!
  • Mapping purpose: tailor events to leverage overlap between models to minimize cost and disruption
  • The Data Collection Plan (DCP) leverages model content overlap
  • Leverage model “overlaps/dependencies” (e.g., mapping, threads)
  • Example mapping: ISO 9001:2015 to CMMI DEV V1.3 (via the NDIA Multi-Model Working Group; Mr. Byrnes was a member)
  • In my case, mappings are embedded and operationalized in Appraisal Wizard™
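The mapping idea above can be sketched as a small data structure: each internal process topic points at the model practices it satisfies, so one evidence set feeds the Data Collection Plan entries for every mapped practice in every model. This is a hypothetical illustration only, not the NDIA working group's mapping tables or Appraisal Wizard internals; the topic names and practice IDs shown are example choices.

```python
# Illustrative sketch: a minimal cross-model mapping used to build a
# Data Collection Plan (DCP). Topics and practice assignments are
# hypothetical examples, not an official mapping.
MAPPING = {
    "corrective-action": {
        "CMMI-DEV v1.3": ["PMC SP 2.1", "PMC SP 2.2"],
        "ISO 9001:2015": ["10.2"],
    },
    "internal-audit": {
        "CMMI-DEV v1.3": ["PPQA SP 1.1", "PPQA SP 1.2"],
        "ISO 9001:2015": ["9.2"],
    },
}

def build_dcp(evidence_by_topic):
    """Assign one evidence set per topic to every mapped model practice,
    so each artifact is collected once and reused across models."""
    dcp = {}
    for topic, models in MAPPING.items():
        evidence = evidence_by_topic.get(topic, [])
        for model, practices in models.items():
            for practice in practices:
                dcp.setdefault((model, practice), []).extend(evidence)
    return dcp

plan = build_dcp({
    "corrective-action": ["CAR log Q3"],
    "internal-audit": ["audit report 2024-01"],
})
# One CAR log now covers both the CMMI PMC practices and ISO clause 10.2.
```

The point of the sketch is the direction of the lookup: evidence is organized by the organization's own topics, and the mapping fans it out to model practices, rather than collecting separate evidence per model.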

  9. Common Goals – Multi-Model impacts Slide adapted and updated from presentations by Mr. Byrnes while managing the appraisal project at the SEI.

  10. Quality Attribute Comparison

  11. Reduced Team Effort
  Comparing separate HM events against a single multi-model HM event, the outcome is significant savings in all appraisal cost attributes.

  12. Multi-Model Best Practices – A Generic Practice Perspective

  13. Lessons Implemented - Planning
  • Organization coverage: large units have a real challenge showing institutionalization across the entire entity for full coverage of all models. How many instances is enough?
  • Model coverage: in high maturity multi-model events, the need to potentially bring in sampled data from additional projects increases.
  • Life cycle coverage: this affects all appraisals, but is exacerbated in high maturity multi-model events due to natural life cycle implementation workflows for differing work types.
  • Functional coverage: more groups likely need to be included.
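One way to make the "how many instances is enough" question concrete during planning is a simple coverage check over the sampling plan. This is an illustrative sketch only; the function name, tuple shapes, and the min_instances threshold are hypothetical conveniences, not SCAMPI sampling rules.

```python
# Illustrative coverage check for appraisal planning: verify that every
# (model, discipline) combination in scope has enough sampled projects.
# The threshold of 2 is an arbitrary example, not a method requirement.
def coverage_gaps(samples, in_scope, min_instances=2):
    """samples: list of (model, discipline, project) tuples.
    in_scope: set of (model, discipline) pairs the appraisal must cover.
    Returns the pairs that still lack min_instances distinct projects."""
    counts = {}
    for model, discipline, project in samples:
        counts.setdefault((model, discipline), set()).add(project)
    return {pair for pair in in_scope
            if len(counts.get(pair, set())) < min_instances}

gaps = coverage_gaps(
    samples=[("CMMI-DEV", "software", "P1"),
             ("CMMI-DEV", "software", "P2"),
             ("CMMI-SVC", "service delivery", "P3")],
    in_scope={("CMMI-DEV", "software"), ("CMMI-SVC", "service delivery")},
)
# gaps flags ("CMMI-SVC", "service delivery"): only one project is sampled.
```

Running a check like this early surfaces exactly where additional sampled projects would need to be brought in before the event.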

  14. Lessons Implemented - Tailoring
  Some key SCAMPI HM multi-model tailoring and variations:
  • Start organization preparation much sooner than for a typical appraisal
  • Allocate more preparation time to designing appropriate interview sessions (size, scope, type) to ensure coverage of all domains, disciplines, and groups
  • Select optimum team members: high maturity experience, SPC skills, inside/outside the org unit, specialized training, cross-model experience
  • Prepare a longer, integrated organization in-brief: discussion of goals, models/baselines, and subprocesses across each discipline is needed
  • Use automated tools: the expansion in required data elements increases the need for different approaches to recording data

  15. Lessons Implemented - Evidence
  • Organize objective evidence in a user-friendly manner
  • Provide guidance for interpreting objective evidence
  • Review the evidence for consistency before the event
  • HM multi-model events require more evidence to show full coverage of practices that are highly integrated with other areas
  • Develop “threads” to follow high maturity concepts in a more natural and flowing manner: present evidence by “topic” according to your process, rather than by model practice buckets
  • Use interim appraisals to incrementally “build” the appraisal database; HM events are typically not just a big-bang single event
  • Automated tooling is essential: it facilitates maximizing data reuse and collaboration and leveraging model mapping. (Full disclosure: I use Appraisal Wizard™ in all my events.)
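The "threads" idea, presenting evidence by topic in process order rather than by model practice bucket, can be sketched as a small grouping step. This is an illustrative sketch, not Appraisal Wizard internals; the tuple shape and topic names are hypothetical.

```python
# Illustrative sketch: group evidence items into topic "threads" ordered by
# where they occur in the organization's own process flow, so reviewers
# follow the process rather than jumping between model practice buckets.
from collections import defaultdict

def build_threads(evidence_items):
    """evidence_items: iterable of (topic, phase_order, artifact) tuples.
    Returns topic -> artifacts sorted by their position in the process."""
    threads = defaultdict(list)
    for topic, order, artifact in evidence_items:
        threads[topic].append((order, artifact))
    return {topic: [artifact for _, artifact in sorted(items)]
            for topic, items in threads.items()}

items = [
    ("quantitative-management", 2, "control chart review minutes"),
    ("quantitative-management", 1, "subprocess baseline report"),
    ("causal-analysis", 1, "defect CAR record"),
]
threads = build_threads(items)
# The quantitative-management thread lists the baseline report before the
# review minutes, matching the natural process flow.
```

Each resulting thread can then be walked end to end in an interview or evidence review, with the model mapping applied afterward to confirm practice coverage.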

  16. Lessons Implemented - Conduct
  • Provide all team members insight into the high maturity practice implementation
  • Increase team skill requirements; team members must meet method qualifications in each model/domain as if the models were separate
  • Leverage “overlaps” and “dependencies” in the models to assign mini-teams
  • Leverage commonality of tasks
  • HM multi-model events require more, and different, interview participants to cover all disciplines
  • Maximize team time for HM sessions and tasks; conduct parallel interview sessions to optimize team member skill sets across disciplines

  17. Addressing Common Risks Slide adapted from Paul Byrnes’ 2nd ISD Customer Conference presentation

  18. Some Multi-Model Take-Aways
  • Sponsor understanding of event complexity is needed
  • CMMI V2.0 structure should help facilitate “multi-model” events
  • Automated tooling is foundational to increasing appraisal efficiency
  • Beware of model interpretation issues relative to evidence sufficiency
  • Model integration and scope expansion increase the need for experienced team members in each discipline
  Key challenges addressed:
  • Addresses the “Frameworks Quagmire”
  • Increases efficiency of both internal improvement and external appraisal efforts
  • Acknowledges senior management drivers for reduced costs
  • Supports a more holistic improvement methodology

  19. Benefits of Using a Multi-model Approach

  20. Questions and Answers

  21. Back Up Slides

  22. Problem Statement: CMMI appraisals are considered “too expensive”
  • Too much money achieving benchmark certifications
  • Too much effort conducting appraisals with overlapping content
  • Organizations tend to treat each model or framework as an independent workstream
  • Organizations treat each appraisal event as an independent end item and “regenerate” data for each event
  • Decomposition of certification tasks across frameworks artificially fragments integrated activities
  • Process groups are expected to do more with less

  23. Common Misconceptions in Multi-Model Appraisals
  • “It will take twice as much time!”
  • “It will take half as much time!”
  • “We can use the same team and save 50%.”
  • “Since we used this Lead Appraiser on the last single model event, this HM multi-model event will be easy.”
  • “We can use one process performance model for both models and be fully covered!”
  • “We have one good example of SPC in engineering, why would you need to see more?”
  • “We do causal analysis, but we can only show you that in one discipline.”

  24. ISF Multi-Model Critical Process Performance (CPP) Areas: Examples
  Critical Process Performance (“CPP”) areas are analogous to the concept of “process areas.”
  Category: Market and Competitors
  • Benchmarking
  • Brand Management
  • Market Knowledge
  • Stakeholder Management
  Category: Measurement, Analysis and Improvement
  • Measurement and Analysis
  • Performance Management
  • Continuous Improvement Management
  • Process Assets Management
  • Innovation and Performance Management
  • Causal Analysis and Resolution
  • Knowledge Management
  [ISF framework diagram repeated from slide 7]

  25. Increased Team Member Skills Needed
  • Integration, articulation, and expression of information: increased need for specialized communication skills; ability to describe behavior with examples/scenarios/stories
  • Understanding and adapting to organizational context: understanding business goals and concerns; understanding organization structure, context, environment, and culture
  • Examining high maturity organizational behavior: knowing what to look for and what to ask about (at the organization, project, and support group levels)
  • Understanding an array of quantitative and statistical management metrics/techniques that may be applicable depending on the context; what is the answer to “how much is enough”?
  • HM application in different work settings
  Bullet points adapted from SCAMPI BoK HM Working Group assets. Mr. Byrnes was a member.
