
Systemic Analysis of Software Findings


Presentation Transcript


  1. Systemic Analysis of Software Findings Scott Lucero Office of the Deputy Undersecretary of Defense (Acquisition and Technology) Software Engineering and System Assurance

  2. Approach
  Question: Are the findings from Program Support Reviews consistent with the NDIA top software issues?
  • Used keywords to pull findings from the systemic analysis database and binned them against the top issues
  • Questions about the binning methodology led us to examine the totality of findings and allocate them to new affinity groups, based on the SWEBOK
  • Conducted two one-day workshops with the authors of the findings to provide overall context; the authors had first-hand experience with over 90 percent of the findings
  • Developed a summary statement of the issues associated with each affinity group
  • Began examining relationships between associated affinity groups

  3. Top Software Issues*
  • The impact of requirements upon software is not consistently quantified and managed in development or sustainment. "Requirements"
  • Fundamental systems engineering decisions are made without full participation of software engineering. "SE/SW Integration"
  • Software life-cycle planning and management by acquirers and suppliers is ineffective. "SW Sustainment"
  • The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry. "Human Capital"
  • Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems. "SW Testing"
  • There is a failure to assure correct, predictable, safe, secure execution of complex software in distributed environments. "SW Assurance"
  • Inadequate attention is given to total life-cycle issues for COTS/NDI impacts on life-cycle cost and risk. "SW COTS/NDI/Reuse"
  *NDIA Top Software Issues Workshop, August 2006

  4. Program Support Review Methodology
  [Diagram: SME insight, program reference material, and PSR planning questions feed the Program Support Review (PSR)]
  • Repeatable, tailorable, exportable process
  • Trained workforce with in-depth understanding of PMs' program issues
  PSR Evaluation Areas:
  1. Mission Capabilities/Requirements
  2. Resources
  3. Management
  4. Technical Process
  5. Technical Product
  6. Environment
  PSR Reference Materials:
  • Templates
  • Sample Questions
  • Documented Processes
  • Training Materials
  • Execution Guidance
  "…PSR team serves as 'disinterested 3rd party' that allows [the PM] to approach leadership armed with powerful program truths, reinforce issues." (PM)
  PMs report the process is insightful, valuable, and results oriented, with better than 95% acceptance of recommendations.

  5. Source of the Findings
  • 68 reviews of 38 different acquisition programs
  • Conducted from early 2004 to present
  • Primarily ACAT 1D programs
  • Findings of these reviews were placed into the Systemic Analysis Database (SADB), a formal repository for all review findings
  • Data extracted from SADB using the following keywords: Software, Systems-of-Systems (SoS), Assurance, Architecture, Security
  • The keyword search returned 600+ findings
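The keyword pull described above can be sketched as a simple filter over finding records. This is a minimal illustration, not the real SADB query: the field names and sample findings are hypothetical, and the briefing does not describe the database schema or matching rules.

```python
# Hypothetical sketch of the keyword pull from the Systemic Analysis
# Database (SADB); field names and sample data are illustrative only.
KEYWORDS = ["software", "systems-of-systems", "sos", "assurance",
            "architecture", "security"]

def matches_keywords(finding_text, keywords=KEYWORDS):
    """Return True if any keyword appears in the finding text
    (simple case-insensitive substring match)."""
    text = finding_text.lower()
    return any(kw in text for kw in keywords)

findings = [
    {"id": 1, "text": "Software schedule estimates lack historical basis"},
    {"id": 2, "text": "Hardware integration lab is underfunded"},
    {"id": 3, "text": "System architecture not baselined before CDR"},
]

extracted = [f for f in findings if matches_keywords(f["text"])]
print([f["id"] for f in extracted])  # → [1, 3]
```

Substring matching this naive would over-match in practice (e.g., "sos" inside other words), which is consistent with the validation pass the briefing describes next.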

  6. Data Validation
  [Chart: three groups of findings, 164, 73, and 47 (284 total)]
  • Data validation was conducted to:
  • Remove findings unrelated to software
  • Ensure that positive, neutral, and negative findings were identified properly
  • Resulted in 284 findings directly related to software
  • The keyword search probably missed some software-related findings
  We examined these software findings without a predefined taxonomy, allowing issue areas and recurring trends to emerge.

  7. Affinity Groups for Negative Findings
  Definitions of the affinity groups draw on sources such as the Software Engineering Body of Knowledge (SWEBOK) to bring consistency to the methodology.

  8. Analysis of Findings
  • Conducted workshops to provide context for the findings:
  • Examined findings to identify related issues, based on the experience of the review participants
  • Characterized the strength of the relationship between each finding and its affinity group
  • Added issues beyond the originally identified affinity groups
  • Results were transferred to a graph-editing tool (yEd) for further analysis
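The last step, handing the affinity-group relationships to yEd, could be done via GraphML, a format yEd reads natively. A minimal sketch follows; the group names and edges are illustrative placeholders, not the study's actual results.

```python
# Sketch: write affinity-group relationships as GraphML for the yEd
# graph editor. Edge data below is illustrative, not the study's results.
import xml.etree.ElementTree as ET

edges = [("Management Oversight", "Process Planning"),
         ("Management Oversight", "Knowledge Sharing"),
         ("Human Capital", "Knowledge Sharing")]

NS = "http://graphml.graphdrawing.org/xmlns"
root = ET.Element("graphml", xmlns=NS)
graph = ET.SubElement(root, "graph", id="affinity", edgedefault="undirected")

# One node per affinity group, one edge per observed relationship.
nodes = sorted({n for e in edges for n in e})
for name in nodes:
    ET.SubElement(graph, "node", id=name)
for i, (src, dst) in enumerate(edges):
    ET.SubElement(graph, "edge", id=f"e{i}", source=src, target=dst)

ET.ElementTree(root).write("affinity_groups.graphml",
                           xml_declaration=True, encoding="utf-8")
```

The resulting file opens directly in yEd, where automatic layout can expose clusters of strongly related groups.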

  9. Description of Issues
  Management Oversight
  • Insufficient tracking of the program against plans throughout the lifecycle
  • Underestimation of system complexity
  • Failure to manage "the big picture" (e.g., focusing on short-term vs. long-term goals, management of SoS and GFE)
  Process Planning
  • Lack of mature software processes, impacting management oversight

  10. Description of Issues (2)
  Human Capital
  • Staff lacks software skills and experience, hindering delivery
  • Insufficient availability of software leads and other key software personnel
  Knowledge Sharing
  • Poor communication on software issues within the program office and between organizations, resulting in poorly synchronized plans and oversight

  11. Initial Analysis of Relationships between Affinity Groups

  12. Path Forward
  • Develop issue statements for the remaining affinity groups
  • Continue to examine findings for relationships between affinity groups
  • Periodically query the systemic database for software findings from additional reviews
  • Conduct the analysis about once a year
  The systemic analysis of software findings is consistent with the NDIA top software issues and with the overall systemic analysis findings.

  13. Back-up Slides

  14. Negative Software Trends [2]

  15. Relationships between Issues

  16. Common Threads
  [Diagram: 1st and 2nd threads]
  Thread definition: in arguments about specific events, a reason for seeing X as the cause of Y. X must be the only factor common to more than one example of Y, and the examples of Y should not be linked by chance.
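The thread definition above can be sketched as a set intersection over the contributing factors behind each example of an outcome Y. The factor sets below are illustrative examples, not actual review data.

```python
# Sketch of the "common thread" test: a candidate thread X is a factor
# present in every example of outcome Y. Factor sets are illustrative.
examples_of_Y = [
    {"staffing shortfall", "immature process", "schedule pressure"},
    {"staffing shortfall", "requirements churn"},
    {"staffing shortfall", "immature process"},
]

# Factors common to all examples are candidate threads; the definition
# additionally requires ruling out chance linkage, which needs judgment.
common = set.intersection(*examples_of_Y)
print(common)  # → {'staffing shortfall'}
```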

  17. Affinity Group Definitions [1] *See SADB Affinity Group Definitions Word Document for complete set of definitions

  18. Affinity Group Definitions [2] *See SADB Affinity Group Definitions Word Document for complete set of definitions

  19. Challenge
  Define a consistent and flexible SSA Software Systemic Analysis Process that will be used to identify the top positive, neutral, and negative recurring software trends within Acquisition Category (ACAT) 1D programs.
