
Metrics for Computer-aided Validation



Presentation Transcript


  1. Metrics for Computer-aided Validation
  Presented by Bret Michael
  Joint work with Doron Drusinsky, Tom Otani, and Man-Tak Shing
  Naval Postgraduate School, Monterey, CA
  NASA IV&V Facility Workshop on Validation, Morgantown, WV

  2. Disclaimer
  • The views and conclusions in this talk are those of the author and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the U.S. Government.

  3. System Reference Model Framework
  • Incorporates advanced computer-aided validation techniques into the IV&V of software systems
  • Allows the IV&V team to capture both
    • Its own understanding of the problem
    • The expected behavior of any proposed system for solving the problem, via an executable system reference model
  • Temporal requirements not covered by the developer (see the sketch below)
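The slide describes the SRM as executable but does not show what an executable temporal requirement looks like. A minimal sketch in Python, assuming a hypothetical requirement ("every command is acknowledged within five time units") checked over a recorded event trace; the event names, the bound, and the checker itself are illustrative and are not the SRM tooling referenced in the talk:

```python
# Hypothetical temporal requirement: every "command" event must be followed
# by an "ack" event within MAX_LATENCY time units (illustrative only; not
# the assertion language or tooling referenced in the talk).
MAX_LATENCY = 5

def requirement_holds(trace):
    """trace: list of (timestamp, event_name) pairs in time order."""
    pending = []  # timestamps of commands still awaiting acknowledgment
    for t, event in trace:
        if event == "command":
            pending.append(t)
        elif event == "ack" and pending:
            if t - pending.pop(0) > MAX_LATENCY:
                return False  # acknowledged, but too late
    return not pending  # unacknowledged commands also violate the requirement

# One timely acknowledgment, one late acknowledgment: the requirement fails.
trace = [(0, "command"), (3, "ack"), (10, "command"), (20, "ack")]
print(requirement_holds(trace))  # False
```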

  4. Terminology as Used in the Framework
  • Developer-generated requirements
    • The requirements artifacts produced by the developer of a system
  • System reference model (SRM)
    • The artifacts developed by the IV&V team’s own requirements effort

  5. A process for formal specification and computer-aided validation

  6. Product-Line Approach to IV&V
  (Diagram: IV&V product (service) families, System Reference Models, Independent Validation, and Independent Verification, mapped to project domains Orion, Juno, and JWST)

  7. Role of Metrics
  • Providing feedback on long-term trends (not short-term results)
  • Measuring the ROI of conducting IV&V
  • Improving independent IV&V
  • Backing up claims about the goodness of the SRM
  • Assessing the quality of the developer’s products
  • Determining the corporate impact
  • Key measures of success at achieving goals
    • Productivity (e.g., from using product line, reuse)
    • Product quality
    • Customer satisfaction
    • Risk incurred by introducing new IV&V technology

  8. Setting the Stage for the Metric Program
  • Need to establish a new baseline
    • Prior to 2008, validation within the NASA IV&V Facility was for the most part performed using manual techniques
    • Not worthwhile comparing results of computer-aided IV to IV conducted manually
    • The baseline will provide for comparison of developer V&V to IV&V results
  • Identifying which metrics will be used
    • Internally (i.e., within the IV&V Facility)
      • Example: Measures of the “goodness” of SRM products
    • Externally (i.e., to be reported out to the customer)
      • Example: Effectiveness of IV&V of the developer’s product

  9. Categories of Metrics
  • Four categories
    • Process
    • Program
    • Project
    • Product
  • For each category there can be quantitative and qualitative metrics

  10. Sample SRM Metrics
  • Namespace checking
    • No. of inconsistencies in naming (see the sketch below)
  • Finding gaps in the set of assertions
  • No. of iterations needed to complete the SRM
  • Key metrics for internal use:
    • No. or percentage of assertions that need to be changed or augmented
    • Coverage afforded by the assertions
    • SRM test-suite coverage
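The slide lists namespace checking (no. of inconsistencies in naming) without defining how the count is obtained. A minimal Python sketch, assuming identifiers are harvested from the developer requirements and the SRM, and that spellings differing only in case or separators denote the same concept; the normalization rule and the example names are assumptions:

```python
import re
from collections import defaultdict

def naming_inconsistencies(identifiers):
    """Count groups of identifiers that appear to name the same concept but
    are spelled differently (case, underscores, hyphens, spaces)."""
    groups = defaultdict(set)
    for name in identifiers:
        key = re.sub(r"[\s_\-]", "", name).lower()  # crude normalization rule
        groups[key].add(name)
    # Each normalized key with more than one distinct spelling is one inconsistency.
    return sum(1 for spellings in groups.values() if len(spellings) > 1)

# Hypothetical identifiers gathered from the requirements and the SRM.
names = {"solarArrayDeploy", "solar_array_deploy", "GroundStation", "groundstation", "thrusterFire"}
print(naming_inconsistencies(names))  # 2
```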

  11. Example Metrics for Validation of Assertions
  • Testing each assertion using scenario-based test cases (see the sketch below):
    • No. of assertions found to have the wrong logical or temporal meaning
  • Testing each assertion subject to constraints imposed by the objects in the SRM:
    • No. of defective assertions
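The slide does not show what a scenario-based test of an assertion looks like. A minimal Python sketch using the same hypothetical command/acknowledgment requirement as above: each scenario pairs an event trace with the verdict a domain expert expects, and a disagreement exposes an assertion whose temporal meaning is wrong (here, deliberately too weak):

```python
def ack_assertion(trace):
    """Deliberately flawed assertion: 'every command is eventually acknowledged'.
    It ignores timing, so its temporal meaning is weaker than intended."""
    commands = sum(1 for _, e in trace if e == "command")
    acks = sum(1 for _, e in trace if e == "ack")
    return acks >= commands

# Scenario-based test cases: (event trace, verdict the expert expects).
scenarios = [
    ([(0, "command"), (3, "ack")], True),   # timely acknowledgment: should pass
    ([(0, "command"), (9, "ack")], False),  # late acknowledgment: should fail
    ([(0, "command")], False),              # no acknowledgment: should fail
]

disagreements = sum(1 for trace, expected in scenarios if ack_assertion(trace) != expected)
print(f"{disagreements} of {len(scenarios)} scenarios expose a wrong temporal meaning")  # 1 of 3
```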

  12. Measuring the Effectiveness of the Assertions on Developers’ Products
  • Testing the developer’s products (formal requirements and target systems) via executable model checking
  • Some examples of metrics are:
    • No. of assertions that conflict with the formal spec.
    • No. of severe programming errors
    • No. of test cases which violate temporal assertions
    • No. of input sequences that lead the statechart under test to particular states of interest (see the sketch below)
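As an illustration of the last metric, a minimal Python sketch that enumerates bounded input sequences over a small statechart, here flattened into a transition table, and counts those that drive it to a state of interest; the valve-controller states, events, and the FAULT state are all assumptions made for the sketch:

```python
from itertools import product

# Hypothetical statechart for a valve controller, flattened into a
# transition table: (state, input event) -> next state.
TRANSITIONS = {
    ("CLOSED", "open_cmd"): "OPEN",
    ("OPEN", "close_cmd"): "CLOSED",
    ("OPEN", "overpressure"): "FAULT",
    ("CLOSED", "overpressure"): "CLOSED",  # overpressure ignored while closed
}
INPUTS = ["open_cmd", "close_cmd", "overpressure"]

def run(sequence, start="CLOSED"):
    state = start
    for event in sequence:
        state = TRANSITIONS.get((state, event), state)  # undefined pairs: stay put
    return state

def count_sequences_reaching(target, max_len):
    """No. of input sequences of length <= max_len that lead the statechart to `target`."""
    return sum(
        1
        for n in range(1, max_len + 1)
        for seq in product(INPUTS, repeat=n)
        if run(seq) == target
    )

print(count_sequences_reaching("FAULT", max_len=2))  # 1 (open_cmd followed by overpressure)
```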

  13. Adequacy of the Assertions for Driving Automated Verification
  • To be discussed on September 11 at the Workshop on Verification
