This presentation summarizes the progress and schedule of the SDS testing process as reported at the CERES Delta Design Review. It covers completed, in-progress, and planned test activities, as well as the use of test proxy and synthetic data.
SDS Testing / Verification @ the CERES Delta Design Review Victor Buczkowski
Agenda
• Progress & Schedule
• Testing Process
• Progress Accounting & Reporting
• Requirements Verification Progress
• Score Card
Test Activities Completed, In-progress & Planned (part 1 of 2)
• Test proxy and synthetic data: Ongoing
  • Currently have one orbit of test data
  • Awaiting 6 orbits of data
  • Need 72+ hours of continuous data, or duplicate the 6 orbits
• System Test Plan: Ongoing
  • Test Plan completed
  • Refining the number of requirements assigned to each element (in final stages)
  • Test procedures provided by the element for each test
• SDS Functional Test #1 – 3/2008 thru 12/2008: Ongoing
  • Test Readiness Reviews 3 days prior to the test
  • Test Execution (usually on Thursdays)
  • Test Results Report (a week after the test)
• SDS NCT3 Readiness Test – 3/2009 – All Elements: In Planning
  • Test Readiness Reviews ? days prior to the test
  • Test Execution (3 days ?)
  • Test Results Report (a week after the test)
Test Activities Completed, In-progress & Planned (part 2 of 2)
• SDS Functional Test #2 – 3/2008 thru 10/2009: In Planning
  • Test Readiness Reviews 3 days prior to the test (usually on Monday)
  • Test Execution (usually on Thursdays)
  • Test Results Report (a week after the test)
• SDS Capacity/Performance Test – 10/2009 – All Elements: In Planning
  • Test Readiness Reviews 3 days prior to the test (usually on Monday)
  • Test Execution (usually on Thursdays)
  • Test Report (a week after the test)
• SDS 3 Day in the Life Test – 10/2009 – All Elements: In Planning
  • Test Readiness Reviews 3 days prior to the test (usually on Monday)
  • Test Execution (usually on Thursdays)
  • Test Results Report (a week after the test)
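The recurring cadence in these test events (a TRR three days before execution, execution usually on a Thursday, and a results report a week later) is simple date arithmetic. A minimal sketch, assuming a hypothetical Python helper that is not part of the SDS tooling:

```python
from datetime import date, timedelta

def test_milestones(execution_date: date) -> dict:
    """Derive the TRR and report dates from a planned test execution date,
    using the cadence described above: TRR 3 days prior (a Monday for a
    Thursday test) and the results report one week after the test."""
    return {
        "test_readiness_review": execution_date - timedelta(days=3),
        "test_execution": execution_date,
        "test_results_report": execution_date + timedelta(weeks=1),
    }

# Example: a test executed on Thursday 2008-03-06 (an illustrative date)
print(test_milestones(date(2008, 3, 6)))
# TRR on Monday 2008-03-03, execution 2008-03-06, report due 2008-03-13
```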
Testing Process (1 of 2)
• Identify when an element is ready to test (single point of coordination = Vic)
• Obtain the build schedule (1, 2 or more deliveries?)
• Identify the testable requirements being delivered (what is testable based on the element's capabilities and its interfaces' capabilities)
• Schedule a test date
• Convene a test discussion meeting
• Determine what test data is available (real or fabricated); SD3E can help identify what's available, as well as what other elements have available
• Develop the test procedure (the steps used to verify a particular requirement; this becomes your SOP)
Testing Process (2 of 2)
• Conduct the Test Readiness Review (TRR, usually three days prior to the test, to ensure everything is ready)
• Execute the test
• Provide the test results/report
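Taken together, the two slides above describe an ordered, per-element checklist. A minimal sketch of how that checklist could be tracked, assuming a simple Python helper (the class and step strings are illustrative, not the actual SOP format):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Ordered checklist of the testing-process steps from the two slides above
# (hypothetical representation; the real SOPs are the written test procedures).
TESTING_PROCESS_STEPS = [
    "Identify when the element is ready to test",
    "Obtain the build schedule",
    "Identify the testable requirements being delivered",
    "Schedule a test date",
    "Convene a test discussion meeting",
    "Determine what test data is available (real or fabricated)",
    "Develop the test procedure (becomes the SOP)",
    "Conduct the Test Readiness Review (TRR)",
    "Execute the test",
    "Provide the test results/report",
]

@dataclass
class ElementTestProgress:
    """Tracks how far a given element has moved through the checklist."""
    element: str
    completed: List[str] = field(default_factory=list)

    def complete(self, step: str) -> None:
        if step in TESTING_PROCESS_STEPS and step not in self.completed:
            self.completed.append(step)

    def next_step(self) -> Optional[str]:
        remaining = [s for s in TESTING_PROCESS_STEPS if s not in self.completed]
        return remaining[0] if remaining else None

# Example usage (element name taken from the score card slide)
progress = ElementTestProgress("Ocean PEATE")
progress.complete("Schedule a test date")
print(progress.next_step())
```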
Accounting Process (1 of 2) Test Accounting
• Conduct Test Readiness Reviews (TRRs)
• Identify what is being tested
• Schedule, coordinate and execute the tests
• Collect, review & maintain the test report(s)
  • What requirements were verified and how
  • Pass, Fail, Partial status assessment
• Issues and discrepancies must be identified, tracked and retested as necessary
• These materials are maintained by the SDS Test Coordinator
• The test verification matrix and a score card will be maintained regularly
Accounting Process (2 of 2)
• Test Results: as testing proceeds, results will be identified as:
  • Verified (worked as expected and planned)
  • Failed (discrepancies will be identified by generating a Test Discrepancy Report (TDR) and tracked via Bugzilla)
  • Workaround (verified, but needed help; not all functionality available)
  • Deferred (some may not be verified until after launch)
  • Waived (no longer needed to support the mission at this time)
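These five result categories map naturally onto a small enumeration. The sketch below is a hypothetical Python representation (class and field names are assumptions, not SDS software); it also encodes the rule that a Failed result should reference a Test Discrepancy Report tracked in Bugzilla:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class VerificationStatus(Enum):
    """Result categories used when scoring a requirement (from the slide above)."""
    VERIFIED = "Verified"      # worked as expected and planned
    FAILED = "Failed"          # TDR generated and tracked via Bugzilla
    WORKAROUND = "Workaround"  # verified, but needed help; not all functionality available
    DEFERRED = "Deferred"      # may not be verified until after launch
    WAIVED = "Waived"          # no longer needed to support the mission at this time

@dataclass
class RequirementResult:
    requirement_id: str
    status: VerificationStatus
    tdr_reference: Optional[str] = None  # hypothetical field: TDR/Bugzilla reference

    def __post_init__(self):
        # Enforce the accounting rule that failures are tracked by a TDR.
        if self.status is VerificationStatus.FAILED and self.tdr_reference is None:
            raise ValueError("Failed requirements must reference a Test Discrepancy Report")
```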
Reporting (page 1 of 2): Sample Procedure
Reporting (page 2 of 2): Sample Procedure
Reporting Flow: Test Execution → Test Report & Verification Results → Test Manager Updates Test Matrix → Score Card Updated → Score Card Results Report Delivered to Management & Review Boards
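Read left to right, the flow is a hand-off chain from test execution to the delivered score card. A minimal Python sketch of that chain follows; the function names, matrix layout, and the test/requirement IDs in the usage example are hypothetical illustrations, not SDS tooling:

```python
def execute_test(test_id, verified_requirements):
    """Test execution: collect the requirements exercised by the test."""
    return {"test_id": test_id, "verified": list(verified_requirements)}

def write_test_report(execution):
    """Test report & verification results (delivered about a week after the test)."""
    return {"test_id": execution["test_id"], "verified": execution["verified"]}

def update_test_matrix(matrix, report):
    """Test manager folds the report into the requirements verification matrix."""
    matrix[report["test_id"]] = report["verified"]
    return matrix

def update_score_card(matrix):
    """Recompute the score card totals from the verification matrix."""
    return {"requirements_verified": sum(len(reqs) for reqs in matrix.values())}

def deliver(score_card):
    """Score card results report delivered to management & review boards."""
    print("Score card delivered:", score_card)

# Usage example with made-up test and requirement identifiers
matrix = {}
report = write_test_report(execute_test("SDS-FT1", ["L3-REQ-010", "L3-REQ-023"]))
deliver(update_score_card(update_test_matrix(matrix, report)))
```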
Verification Progress as of 8/12/2008 (1 of 3): Green = Passed
Verification Progress as of 8/12/2008 (2 of 3): Green = Passed
Verification Progress as of 8/12/2008 (3 of 3)
• The previous two slides show the following:
  • The total number of requirements for each element
  • The Level 3 requirement ID numbers
  • The number of requirements verified to date, highlighted in green
  • Other colors not yet used: Red = Failed, Blue = Deferred, Peach = Waived
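If the progress matrix were generated programmatically, the color legend amounts to a simple status-to-color mapping. A minimal sketch, assuming a hypothetical Python rendering helper (the mapping values come from the legend above; the function and names are assumptions):

```python
from typing import Optional

# Hypothetical status-to-color mapping matching the legend on the
# verification-progress slides (Green = Passed; other colors reserved).
STATUS_COLORS = {
    "Passed":   "green",
    "Failed":   "red",
    "Deferred": "blue",
    "Waived":   "peach",
}

def cell_color(status: Optional[str]) -> str:
    """Color for a requirement's cell; uncolored if not yet verified."""
    return STATUS_COLORS.get(status, "none")

print(cell_color("Passed"))  # green
print(cell_color(None))      # none (not yet tested)
```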
SDS Score Card as of 08/12/2008 (Level 3 Requirements)
[Score card diagram: one box per element (Atmosphere PEATE (U of Wisconsin), Land PEATE, Ocean PEATE, Ozone PEATE, Sounder PEATE (JPL), NICSE, SD3E, I&TSE, PSOE, CERES, and the NPP Science Team), each listing (T) Total Requirements, (F) Failed, (V) Verified, (D) Deferred or Workaround, (W) Waived, and (%) Percent Verified, with data flows (RDR, SDR, EDR, IP, TDR, algorithm enhancements, IP generation requests, IDPS S/W updates, quality control, Cal. LUT and S/W updates, EDR assessment, status reports, management direction) connecting the elements. Per-element verified percentages on this date range from 0% (elements not yet tested) to 58.3%.]
Total SDS Level 3's: (T) Total Requirements = 411, (F) Failed = 0, (V) Verified = 124, (D) Deferred or Workaround = 0, (W) Waived = 0, (%) Percent Verified = 30.2%
Not all data flows are shown here.
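The per-element and total percentages on the score card are simply verified divided by total. A minimal sketch of that calculation; the helper function is hypothetical, and the figures are the Total SDS numbers from the slide (124 of 411 requirements verified, 30.2%):

```python
def percent_verified(verified: int, total: int) -> float:
    """Percent of Level 3 requirements verified, as reported on the score card."""
    return round(100.0 * verified / total, 1) if total else 0.0

# Total SDS figures from the 08/12/2008 score card.
assert percent_verified(124, 411) == 30.2
print(f"{percent_verified(124, 411)}% of SDS Level 3 requirements verified")
```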
SDS Score Card (page 2 of 2)
• The previous slide reports the Level 3 progress by element and the total SDS progress.
• Requirement statuses are scored as:
  • Verified (worked as expected and planned)
  • Failed (discrepancies will be identified and tracked using Bugzilla)
  • Workaround (verified, but needed help; not all functionality available)
  • Deferred (some may not be verified until after launch)
  • Waived (no longer needed to support the mission at this time)