TAF Verification from a Customer Perspective


Presentation Transcript


  1. TAF Verification from a Customer Perspective. Dan Shoemaker, Aviation Curmudgeon, NWS FWD. From a 2005 study done with: Rick Curtis, Chief Meteorologist, SWA; Paul Witsaman, Southern Region RAM.

  2. Motivation (now) • Stats on Demand (SOD) verifies to five-minute intervals. • I can’t forecast to five-minute intervals! • Airlines can’t land in a five-minute window. • SOD verification numbers seem low (CSIs below 50%). • Is this the best way to measure performance? • How might an airline customer measure performance?

  3. TAF Study Overview • Pick a day when widespread weather was a factor. • Examine all SWA flights that landed in Southern Region (869 flights, 24 airports). • Use an airline perspective: • Look at “alternate fuel required” (2000/3, i.e., ceiling below 2,000 ft or visibility below 3 SM) performance. • Did each flight’s planning TAF accurately reflect the landing conditions…was the extra fuel needed?
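
A minimal sketch of that 2000/3 check, assuming 2000/3 means a ceiling below 2,000 ft or a visibility below 3 statute miles (the usual alternate-required reading); the function name is illustrative, not from the study:

```python
# Hypothetical helper: is a given ceiling/visibility pair below the
# 2000/3 alternate-fuel planning threshold?
def below_alternate_minima(ceiling_ft: float, visibility_sm: float) -> bool:
    # Below minima if the ceiling is under 2,000 ft OR visibility is under 3 SM.
    return ceiling_ft < 2000 or visibility_sm < 3.0
```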

  4. Dec 22, 2004 • Weather impacts included: • SN/PL/FZRA in ABQ AMA BHM BNA DAL LBB LIT MAF OKC TUL • FG in AMA CRP HOU IAH LBB LIT • +RA/SHRA/TSRA in BHM BNA HOU IAH JAN JAX MSY • MVFR/IFR ceiling/visibility at numerous other stations.

  5. Detailed Methodology • For each flight: • Examine the TAF valid 2 hours prior to each flight’s take-off. This is assumed to be the TAF the dispatcher used to compute the fuel load. • Were conditions below 2000/3 forecast at each flight’s landing time (includes prevailing and TEMPO)?

  6. Detailed Methodology (cont) • For each flight: • Examine all observations, including specials, within the hour bracketing each flight’s landing time. • If conditions below 2000/3 occurred at any time within the landing hour…assume alternate fuel was required (the TAF was “reasonable” since the weather occurred near landing time).

  7. More Methodology • Verify the TAF against the observations for each flight, creating a 2 x 2 contingency table. • Did the TAF forecast conditions below 2000/3? (Y/N) • Did conditions below 2000/3 occur at some time during the landing hour? (Y/N)
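
A minimal Python sketch of this per-flight classification; the flight-record layout and field names (taf_below_minima, obs_below_minima) are illustrative assumptions, not the study's actual data format:

```python
from collections import Counter

def score_flights(flights):
    """Build the 2 x 2 contingency table, one entry per flight.

    Each flight dict carries two booleans (assumed field names):
      taf_below_minima -- planning TAF (prevailing or TEMPO) forecast
                          conditions below 2000/3 at the landing time
      obs_below_minima -- some observation (including specials) in the hour
                          bracketing landing was below 2000/3
    """
    table = Counter(hits=0, misses=0, false_alarms=0, correct_negs=0)
    for flight in flights:
        fcst = flight["taf_below_minima"]
        obs = flight["obs_below_minima"]
        if fcst and obs:
            table["hits"] += 1           # alternate fuel planned and needed
        elif fcst and not obs:
            table["false_alarms"] += 1   # fuel planned but not needed
        elif obs:
            table["misses"] += 1         # fuel needed but not planned
        else:
            table["correct_negs"] += 1   # fuel correctly not planned
    return table
```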

  8. Results
     POD       .891   Hits / (Hits + Misses)
     FAR       .220   FA / (FA + Hits)
     CSI       .713   Hits / (Hits + Misses + FA)
     Accuracy  .852   (Hits + Correct Neg) / Total
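
Written as code, the slide's scores follow directly from the contingency-table counts; the function below simply restates the slide's formulas:

```python
def verification_scores(hits, misses, false_alarms, correct_negs):
    """Standard 2 x 2 scores: same formulas as listed on the slide."""
    total = hits + misses + false_alarms + correct_negs
    return {
        "POD": hits / (hits + misses),
        "FAR": false_alarms / (false_alarms + hits),
        "CSI": hits / (hits + misses + false_alarms),
        "Accuracy": (hits + correct_negs) / total,
    }
```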

  9. POD .891   FAR .220   CSI .713   Accuracy .852
     Hits: Extra fuel costs were required.
     Correct Neg: Extra fuel costs were saved.
     False Alarms: NWS cost airlines money.
     Misses: Diversions possible*.

  10. Stats on Demand vs. This Study • The SOD CSI for 2000/3 was unavailable (early 2005); the SOD CSI for 1000/3 was available. • I examined 3 comparable winter events from 2009 that produced similar 1000/3 CSIs (and had 2000/3 CSIs available). • The averaged 2000/3 CSI was used as an approximate value for the TAFs used in this study. • Yes, this is an “apples to oranges” comparison. • But…ballpark numbers still provide insight.

  11. Stats on Demand CSI vs Study CSI • 2005 study CSI: .713 • SOD ballpark CSI: .425

  12. Conclusion/Recommendations • Verifying to five-minute intervals makes verification scores look low. Real-world skill and value to customers are likely higher. • Concentrate on SOD scores vs. guidance: change focus to examine local improvement over the models. (Do we add value compared to an automated product?) • Use SOD results to look for weak areas. Negative forecasting will show up in the bias scores.

  13. Any questions so far? And now for something completely different… (Monty Python)

  14. Improving TAFs • Intended for Aviation Program Leaders/Forecasters. • Real change has to be made locally, where the TAFs are written.

  15. Improving TAFs • Biggest obstacle to change: office inertia. • You have to change your office culture. • …but APLs have responsibility with no authority. • We can’t “make” anybody do anything. • What can you do?

  16. Three-Hour TAFs: a “How To” for APLs • Decide to write them. You/the MIC are the advocates. • Your customers want them. • Decide which airports. • Widespread GA in North Texas, so we write them for all our sites. • Obtain union cooperation. • I took a poll of all 12 forecasters and got 12 “yes” votes. I did have to do some “lobbying”.

  17. Three-Hour TAFs: a “How To” for APLs • Let your customers/back-up sites know. • We issued a PNS, added a notice to Aviation AFDs, added a headline on the web page, and notified sister offices. • Decide on local procedures. • Make some AWIPS changes (alarms, etc.).

  18. Improving TAFs • Implement 3-hour TAF amendments. • Emphasize PPTAF best practices. • PPTAF minimizes bad TAF practices that hurt performance/verification. • If all forecasters use PPTAF practices, TAFs will be more consistent between forecasters/offices.

  19. Improving TAFs • Use the AVNFPS conditional climatology. • For operations and research. • Encourage “optimistic” forecasts. • FWD’s PROB30s “hurt” the TAF 99% of the time. • Coordinate/communicate with your CWSU.

  20. Improving TAFs • Make sure your TAF writers know how they affect their customers. • “TEMPO 2124 1SM TSRA BKN035CB” • That single TEMPO group (temporarily between 21Z and 24Z, 1 SM visibility in a thunderstorm with rain, broken 3,500-ft ceiling with cumulonimbus) puts conditions below 2000/3 and adds alternate fuel to every flight planned to land in that window.

  21. How do you make these “cultural” changes? • Use “force of personality” to get individual forecasters to improve. Group statements (emails to all, memos) don’t work. • Provide individual feedback. A number of FWD forecasters will tell you “I didn’t want to get a ‘Shoe’ talk, so I didn’t…” • Newton’s first law: Aviation programs at rest remain so. Change the comfort zone.

  22. Improving TAFs • Provide formal feedback; office and individual. You can’t change what you don’t measure. • FWD - annual IFR/MVFR/alt reqd stats. • Performance vs SR and LAMP guidance. • 2000/3 is more important than total IFR. • Individual annual IFR/alt reqd performance vs office average and LAMP. • Results are kept anonymous.
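
The next slides report “percent improvement over LAMP.” The talk does not spell out the exact formula, so the sketch below assumes a common definition: the relative change in a score such as CSI (where higher is better) versus the guidance value.

```python
# Assumed definition of "percent improvement over guidance"; not stated
# explicitly in the talk.
def percent_improvement(forecaster_score, guidance_score):
    # Positive: the forecaster added value over LAMP guidance.
    # Negative: the guidance scored better than the forecaster.
    return 100.0 * (forecaster_score - guidance_score) / guidance_score
```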

  23. FWD Annual Stats (percent improvement over LAMP)

  24. Annual Individual TAF stats (percent improvement over LAMP)

  25. Individual TEMPO Use 12-30 Hours (PPTAF No-No)

  26. Curmudgeon’s TAF Rules of Thumb • First, do no harm. • When in doubt, leave it out. • or -- optimism beats pessimism. • Airplanes will NOT fall out of the sky if you write optimistic TAFs. • Forecast the probable weather, not the worst possible weather.

  27. Curmudgeon’s TAF Rules of Thumb • The TAF is not a portrait of the atmosphere, it is a stick figure. As long as it is anatomically correct, it’s a good representation.

  28. Curmudgeon’s Forecaster Rule of Thumb “Constant abrasion produces the pearl…it’s a disease of the oyster.” Lenny Bruce

  29. You cannot wait for improvements to come from above, so make them locally. If you make TAF improvement a priority, TAFs will improve. They need to improve. Most local improvements you can make involve effort, but zero cost. You don’t have to do what I do, but DO SOMETHING!

  30. Questions/Discussion? Thanks for your time.
