
ASW METOC Metrics: NOAT Committee Report

Bruce Ford, Clear Science, Inc. (CSI), bruce@clearscienceinc.com; Tom Murphree, Naval Postgraduate School (NPS), murphree@nps.edu. Brief for ASW METOC Metrics Symposium Two, 02-04 May 2007.


Presentation Transcript


  1. ASW METOC Metrics: NOAT Committee Report
     Bruce Ford, Clear Science, Inc. (CSI), bruce@clearscienceinc.com
     Tom Murphree, Naval Postgraduate School (NPS), murphree@nps.edu
     Brief for ASW METOC Metrics Symposium Two, 02-04 May 2007

  2. NOAT Focus Committee Report
     • Scope
     • Customers
     • METOC inputs to mission phases
     • METOC performance metrics
     • Customer performance metrics
     • Operational performance metrics
     • Proxy operational metrics
     • Other metrics
     • Data collection systems
     • Data analysis process
     • Operational modeling
     • Funding levels

  3. NOAT Focus Committee Members
     • Lead: Clear Science - Mr. Bruce Ford
     • NOAC SSC - LT Tim Campo
     • NOAC Yoko - LCDR Joel Feldmeier
     • NRL - Jim Dykes, Josie Fabre
     • SPA - Paul Vodola, Matt McNamara, Luke Piepkorn

  4. NOAT Focus Committee Report – What Makes Surface ASW Different
     • One METOC product supports multiple events (ASW/Commodore's brief)
       • Unlike MPRA or strike, where one MEP supports one mission
     • Major changes to plans often result from METOC information presented at the brief

  5. NOAT Focus Committee Report – Scope
     • The brick from which NOAT metrics will be built is the data surrounding the Commodore's brief:
       • Commodore's brief
       • Verification of key discrete elements
       • Mission objectives
       • Mission outcomes
     • An expanded scope will be proposed for additional metrics

  6. NOAT Focus Committee Report – Customers
     • Primary
       • DESRON or CTF Commanders – the primary focus
       • a.k.a. "ASW warfighters"
     • Secondary
       • Carrier Strike Groups (CSGs)
       • Other units benefiting from ASW

  7. NOAT Focus Committee Report – Planning Timeline (1 month)
     • Climo/advisory inputs at IPC/MPC for large-scale exercises – sometimes informal
     • Climo/area brief by the NOAT common at FPC
     • Pre-sail brief by the NOAT common near COMEX
       • Contains the best information on existing conditions in the exercise area

  8. NOAT Focus Committee Report – Planning Inputs (graphic not reproduced in the transcript)

  9. NOAT Focus Committee Report – Exercise Timeline
     • Exercise timing and pacing differ significantly from exercise to exercise
     • ASW briefs provided on a regular basis (6-, 8-, or 12-hour increments are most common)
     • Additional information provided for specific exercise events

  10. NOAT Focus Committee Report – Exercise Information Flow (graphic not reproduced in the transcript)

  11. NOAT Focus Committee Report – Debrief Timeline
     • NOAT input sometimes included regarding METOC impacts on the exercise
     • The R&A effort is a potential source of whole-exercise data

  12. NOAT Focus Committee Report – Potential METOC Performance Metrics
     • List may be expanded – recommend that all verifiable elements be collected and verified
     • A verification scheme needs to be developed
     • The verification BT should be launched during the analysis valid time
     • Many ranges may be forecast, but few verified; for each verified range, strive to record:
       • Sensor
       • Mode (active/passive)
       • Sensor depth
       • Target depth
       • Propagation path
       • Environmental data source (MODAS, MODAS LT, BT)
       • Predicted range
     • NOATs tend to use RBC products without changes. If a NOAT modifies an RBC product, the change and the reasons for it need to be documented, and the NOAT-modified product needs to be verified.
     • Collect NOAT data via the NOAT event-by-event battle watch log.
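  As an illustration of how these verification elements could be captured for each range prediction, here is a minimal sketch in Python; the record and field names are hypothetical and are not part of the brief.

  ```python
  from dataclasses import dataclass, asdict
  from datetime import datetime

  # Hypothetical record for one predicted-vs-verified range; fields mirror the
  # elements listed on this slide (sensor, mode, depths, path, data source, range).
  @dataclass
  class RangeVerification:
      brief_time: datetime        # ASW/Commodore's brief the forecast supported
      sensor: str                 # e.g., hull-mounted sonar, towed array
      mode: str                   # "active" or "passive"
      sensor_depth_m: float
      target_depth_m: float
      propagation_path: str       # e.g., direct path, convergence zone
      env_data_source: str        # "MODAS", "MODAS LT", or "BT"
      predicted_range_kyd: float  # forecast detection range
      verified_range_kyd: float   # range verified against the in situ (BT) data

      def error_kyd(self) -> float:
          """Signed forecast error: predicted minus verified range."""
          return self.predicted_range_kyd - self.verified_range_kyd

  # Example entry for a single brief element
  rec = RangeVerification(datetime(2007, 5, 2, 6, 0), "towed array", "passive",
                          90.0, 150.0, "convergence zone", "MODAS", 24.0, 19.5)
  print(asdict(rec), rec.error_kyd())
  ```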

  13. NOAT Focus Committee Report – Potential Customer Performance Metrics
     • Amount of time contact held
     • Number of contacts lost
     • Number of contacts gained
     • Number of contacts localized
     • Number of screen penetrations
     • Number of successful simulated submarine attacks on friendly units
     • Number of successful simulated submarine attacks on the high value unit (HVU)
     • Number of successful simulated attacks on a submarine target
     • Number of unsuccessful simulated attacks on a submarine target

  14. NOAT Focus Committee Report – Potential Operational Impacts Metrics
     • Draw correlations between METOC performance metrics and customer performance metrics
     • Proposed proxy metrics: SLD, BLG, COF
     • Those elements with high correlations over time are likely proxy operational impacts metrics
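  A minimal sketch of the correlation step, assuming paired per-event series for one METOC performance metric and one customer performance metric; the numbers and variable names are illustrative only.

  ```python
  import numpy as np

  # Hypothetical paired per-event series: a METOC performance metric (absolute range
  # forecast error) and a customer performance metric (hours contact was held).
  range_error_kyd    = np.array([2.1, 4.5, 1.0, 6.2, 3.3, 0.8, 5.1, 2.7])
  hours_contact_held = np.array([5.5, 3.0, 6.8, 1.9, 4.2, 7.1, 2.4, 5.0])

  # Pearson correlation; METOC elements whose errors correlate strongly with customer
  # performance across many events are candidates for proxy operational impacts metrics.
  r = np.corrcoef(range_error_kyd, hours_contact_held)[0, 1]
  print(f"correlation = {r:+.2f}")
  ```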

  15. NOAT Focus Committee Report – Potential Operational Impacts Metrics (directly recorded)
     • Record information about recommendations that result in a change to the standing search plan:
       • Search plan changes recommended to optimize environmental exploitation (yes/no)
       • Category of the recommendation:
         • Force positioning
         • Dynamic maneuver
         • Sensor employment
         • SONAR mode change (active vs. passive)
         • Negative impact mitigation
         • Sonobuoy pattern
       • Search plan changes implemented (yes/no)
       • Reason for rejection of a mitigation recommendation:
         • Maintenance
         • Event interference
         • Assets unavailable
         • Recommendation not believed
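  A minimal sketch of a directly recorded recommendation record, assuming hypothetical type and field names; the categories and rejection reasons are taken from the lists above.

  ```python
  from dataclasses import dataclass
  from enum import Enum
  from typing import Optional

  class RecommendationCategory(Enum):
      FORCE_POSITIONING = "force positioning"
      DYNAMIC_MANEUVER = "dynamic maneuver"
      SENSOR_EMPLOYMENT = "sensor employment"
      SONAR_MODE_CHANGE = "sonar mode change (active vs. passive)"
      NEGATIVE_IMPACT_MITIGATION = "negative impact mitigation"
      SONOBUOY_PATTERN = "sonobuoy pattern"

  class RejectionReason(Enum):
      MAINTENANCE = "maintenance"
      EVENT_INTERFERENCE = "event interference"
      ASSETS_UNAVAILABLE = "assets unavailable"
      NOT_BELIEVED = "recommendation not believed"

  @dataclass
  class RecommendationRecord:
      optimizes_environment: bool      # change recommended to optimize environmental exploitation
      category: RecommendationCategory
      implemented: bool                # search plan change implemented (yes/no)
      rejection_reason: Optional[RejectionReason] = None  # recorded only when not implemented

  # Example: a sonobuoy pattern change that was recommended but not implemented
  rec = RecommendationRecord(True, RecommendationCategory.SONOBUOY_PATTERN,
                             implemented=False,
                             rejection_reason=RejectionReason.ASSETS_UNAVAILABLE)
  print(rec)
  ```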

  16. NOAT Focus Committee Report – Other Metrics
     • Brief frequency
     • Number of detections by ship's sensors
     • Number of detections by off-ship sensors (e.g., MPRA, LAMPS)
     • Number of LAMPS detections
     • Number of detections by active SONAR
     • Number of detections by passive SONAR
     • Number of detections by visual sensors
     • Number of detections by IR sensors
     • Number of detections by RADAR sensors
     • Number of bathythermographs expended
     • Number of ambient noise buoys expended
     • Number of sonobuoys expended

  17. NOAT Focus Committee Report – Data Collection
     • Primary (brief-by-brief) data collection
       • SIPR web forms for entering source data
       • Some in situ inputs may be automated
       • Forecast/analysis and recommendation data entered following the brief
       • Other sources can be entered prior to the next brief
       • Paper forms as a backup, to facilitate later entry into the system
     • Training on the system must be provided and readily available
       • Multi-media training provided to NOATs
     • Change the support paradigm such that the job is not complete until all data required for metrics are entered
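  A minimal sketch of the "not done until the data are entered" check, with hypothetical field names standing in for whatever the SIPR web forms actually collect.

  ```python
  # Required metrics fields for one brief record; names are placeholders for the
  # actual form fields (forecast/analysis values, recommendations, verification data).
  REQUIRED_FIELDS = ("brief_time", "forecast", "analysis", "recommendation", "verification")

  def is_complete(record: dict) -> bool:
      """A brief record is complete only when every metrics-required field is filled."""
      return all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

  draft = {"brief_time": "2007-05-02T06:00Z", "forecast": "SLD 60 m", "analysis": "SLD 55 m"}
  print(is_complete(draft))  # False: recommendation and verification not yet entered
  ```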

  18. NOAT Focus Committee Report – Data Collection
     • Exercise-level data collection
       • Flag missions as part of an exercise (MPRA, surface ASW, etc.)
       • Collect the planning impacts of METOC information during the exercise planning process (e.g., IPC/MPC/FPC/pre-sail)
       • Collect outcomes data from post-exercise (hot wash) meetings
       • Prepare whole-exercise data for further analysis
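  A minimal sketch of rolling flagged brief/mission records up to whole-exercise data; the exercise names and fields are hypothetical.

  ```python
  from collections import defaultdict

  # Hypothetical brief/mission records, each flagged with the exercise it belongs to.
  records = [
      {"exercise": "EXERCISE ALPHA", "mission": "surface ASW", "contact_time_hr": 4.2},
      {"exercise": "EXERCISE ALPHA", "mission": "MPRA",        "contact_time_hr": 6.0},
      {"exercise": "EXERCISE BRAVO", "mission": "surface ASW", "contact_time_hr": 3.1},
  ]

  # Group by exercise so whole-exercise metrics can be prepared for further analysis.
  by_exercise = defaultdict(list)
  for r in records:
      by_exercise[r["exercise"]].append(r["contact_time_hr"])

  for name, hours in by_exercise.items():
      print(f"{name}: {len(hours)} missions, {sum(hours):.1f} hr total contact time")
  ```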

  19. NOAT Focus Committee Report – Data Analysis/Display: Multi-Level Access
     • Display determined by user permissions
     • Level 1 – single-brief metrics information
     • Level 2 – multiple briefs, single METOC unit; metrics displayed by combination of:
       • NOAT
       • Geographical region
       • Span of time
       • Exercise
     • Level 3 – multiple briefs, multiple METOC units; metrics displayable by:
       • METOC unit
       • Geographical region
       • Span of time
       • Exercise
     • Include directorate-level metrics for top-level users
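  A minimal sketch of permission-scoped queries over brief-level metrics records, assuming hypothetical fields (brief_id, unit, region, exercise, time); the three levels follow the slide above.

  ```python
  def query(records, level, **filters):
      """Level 1: one brief. Level 2: one METOC unit's briefs. Level 3: all units (directorate view)."""
      if level == 1:
          matched = [r for r in records if r["brief_id"] == filters["brief_id"]]
      elif level == 2:
          matched = [r for r in records if r["unit"] == filters["unit"]]
      else:
          matched = list(records)
      # Optional slicing shared by levels 2 and 3: region, exercise, span of time.
      for key in ("region", "exercise"):
          if key in filters:
              matched = [r for r in matched if r[key] == filters[key]]
      if "start" in filters and "end" in filters:
          matched = [r for r in matched if filters["start"] <= r["time"] <= filters["end"]]
      return matched

  records = [
      {"brief_id": 1, "unit": "NOAT A", "region": "WESTPAC", "exercise": "EX ALPHA", "time": 1},
      {"brief_id": 2, "unit": "NOAT B", "region": "WESTPAC", "exercise": "EX ALPHA", "time": 2},
  ]
  print(query(records, level=2, unit="NOAT A"))     # one unit's briefs
  print(query(records, level=3, region="WESTPAC"))  # cross-unit view, filtered by region
  ```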

  20. NOAT Focus Committee Report – Data Analysis/Display: Level 1 (example display; graphic not reproduced in the transcript)

  21. NOAT Focus Committee Report – Data Analysis/Display: Level 2 (example display; graphic not reproduced in the transcript)

  22. NOAT Focus Committee Report – Data Analysis/Display: Level 3 (example display; graphic not reproduced in the transcript)

  23. NOAT Focus Committee Report – Operational Modeling
     • The interoperability of the NOAT with the RBC and specific ASW missions suggests that operational modeling should be considered at the directorate-wide level
     • Model the extension of the ASW mission planning process back through the NOAT to capture the impact of NOAT products on ASW results / operational impacts
     • Modeling output used in the same manner as the MPRA modeling:
       • Identify sensitivities of the warfighter to METOC elements of information
       • Provide a basis for the metrics evaluation process
       • Inform future funding and R&D decisions
       • Improve metrics data collection methods
       • Align training and research to add value to and improve METOC information
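  A toy sketch, not the committee's model, of the kind of sensitivity screening that operational modeling could support: perturb one METOC element and observe the change in a modeled mission outcome. The outcome_model function and its coefficients are placeholders.

  ```python
  # Placeholder stand-in for an operational ASW model output (e.g., a detection-related
  # outcome) as a function of METOC-related inputs; coefficients are arbitrary.
  def outcome_model(sld_m: float, range_error_kyd: float) -> float:
      return max(0.0, 0.9 - 0.004 * sld_m - 0.03 * range_error_kyd)

  baseline = outcome_model(sld_m=60, range_error_kyd=2.0)

  # One-at-a-time perturbation of the SLD input shows how sensitive the modeled
  # outcome is to that METOC element of information.
  for delta in (-20, 0, 20):
      value = outcome_model(sld_m=60 + delta, range_error_kyd=2.0)
      print(f"SLD {60 + delta:>3} m -> outcome {value:.2f} ({value - baseline:+.2f} vs. baseline)")
  ```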

  24. NOAT Focus Committee Report – Funding Recommendations
     • 1. Bare bones – Initiate the primary collection system with analysis and display of metrics. Provide training to personnel who will enter data or administer the collection system.
     • 2. Adequate to complete the project – Same as 1 above, plus: (a) operational analysis studies to expand beyond exercise operations, identify sensitivities, and improve the data collection/display system; (b) initiate a feasibility study of collecting exercise-level data and computing exercise impact metrics.
     • 3. Completely funded – Implement an R&A-level metrics effort to collect exercise-level metrics that incorporate brief-by-brief and mission-by-mission metrics records from surface ships, submarines, MPRA, and other participating units.
