YOPP Verification Task Team
The YOPP verification task team has been set up to help outline a YOPP verification strategy and to develop and support the YOPP verification effort. The task team therefore has a primarily advisory role; however, through this website, we also hope to provide a platform for exchange, connecting ongoing YOPP verification activities to ensure harmonization and avoid duplication.
The key deliverable of the YOPP verification task team is to outline a YOPP verification plan. The design of a verification strategy depends on the purpose of the verification: the YOPP verification task team focuses on verification for modellers and for general users. The main goal is to demonstrate the advancements associated with the YOPP modelling and observation efforts. Some of the expected outcomes are: i) quantify the accuracy of current numerical models in polar regions; ii) demonstrate the added value of enhanced observations in data assimilation, numerical prediction, and verification practices; iii) assess the impact of coupling (ocean-land-atmosphere and sea-ice) in dynamical models on environmental prediction; iv) identify systematic errors in numerical models and physical processes whose representation needs improvement; and v) quantify the impact of improved polar prediction on the accuracy of mid-latitude weather (and climate) prediction. User-oriented verification, which needs to be tailored to very specific end-user parameters and needs, is tackled as a joint topic with the PPP-SERA Task Team.
Verification within YOPP has two components: the first is to produce verification results which are informative for modellers and end-users; the second is to serve as a platform for the investigation of verification challenges and the development of new techniques. The YOPP verification task team aims to provide guidance for both.
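To illustrate the first component, a minimal sketch of the kind of model-oriented verification result meant here: computing bias and root-mean-square error of a forecast against observations. The function name, the station, and the values are hypothetical; these are simply the textbook definitions of the two most common continuous scores, not a prescribed YOPP method.

```python
import math

def continuous_scores(forecast, observed):
    """Bias (mean error) and RMSE of paired forecast/observation values.

    Illustrative sketch only: the scores actually recommended for YOPP
    are set out in the task team's documentation.
    """
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / len(errors)                       # systematic error
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))  # total error magnitude
    return {"bias": bias, "rmse": rmse}

# Toy example: 2 m temperature (deg C) at a hypothetical polar station
scores = continuous_scores([-12.0, -10.5, -8.0], [-11.0, -11.0, -9.0])
print(scores)  # bias ~ 0.17, rmse ~ 0.87
```

Separating the bias from the RMSE in this way already addresses one of the outcomes above: the bias isolates the systematic component of model error, which points to physical processes needing improved representation.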
Members and Contact
- Barbara Casati, PPP SG, Environment and Climate Change Canada
- Thomas Haiden, European Centre for Medium-Range Weather Forecasts (ECMWF)
- Morten Andreas Ødegaard Køltzow, Norwegian Meteorological Institute
- Helge Goessling, Alfred Wegener Institute (AWI)
- Zen Mariani, Environment and Climate Change Canada
- Gunilla Svensson, PPP SG, Stockholm University
- Taneil Uttal, PPP SG, NOAA/Earth System Research Laboratory
- Jonathan Day, PPP SG, European Centre for Medium-Range Weather Forecasts (ECMWF)
- Qizhen Sun, National Marine Environmental Forecasting Center (NMEFC), State Oceanic Administration
- Eugene Petrescu, NOAA/National Weather Service
- Stella Melo, Environment and Climate Change Canada
- Pertti Nurmi, Finnish Meteorological Institute (FMI)
- Matthew Shupe, NOAA/Earth System Research Laboratory
- Gregory Smith, Environment and Climate Change Canada
This webpage is under development and aims to serve as an exchange platform: constructive suggestions can be sent to barbara.casati(at)canada.ca or to the ICO.
Documentation relevant for planning the YOPP verification effort has been prepared:
1) The report on "Verification of Environmental Prediction in Polar Regions: recommendations for the Year of Polar Prediction" prepared by Casati and co-authors (2017) focuses mainly on the verification challenges and methods.
2) The presentation "YOPP verification goals, YOPP core-phase activities, and verification plans for the YOPP consolidation phase" outlines the YOPP verification priorities and ongoing activities.
3) A short presentation prepared by Casati and co-authors (Oslo, 10 Nov 2016) lists the needs of the verification community for planning the YOPP verification effort.
- The Forecast Verification FAQ web page, developed and maintained by the Joint Working Group on Forecast Verification Research (JWGFVR), includes a review of standard and advanced verification methods, links to verification websites, and a comprehensive list of references.
- The official webpage of the WWRP/WGNE Joint Working Group on Forecast Verification Research (JWGFVR) includes the verification recommendation reports.
- The Spatial Verification Methods Inter-Comparison Project is an international meta-verification project in which spatial verification methods are tested on a common set of cases and their diagnostic capabilities are analysed, in order to provide guidance on the optimal application of these techniques.
- The Model Evaluation Tools (MET) is a verification package developed at NCAR which includes traditional (continuous, categorical, probabilistic and ensemble) and spatial verification techniques.
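As a concrete illustration of the "categorical" techniques such packages implement, the sketch below computes three standard scores from a 2x2 contingency table (event forecast yes/no versus event observed yes/no). These are the textbook definitions; the function and the counts are hypothetical and this is not MET's API.

```python
def categorical_scores(hits, misses, false_alarms, correct_negatives):
    """Standard categorical scores from a 2x2 contingency table.

    hits             : event forecast and observed
    misses           : event observed but not forecast
    false_alarms     : event forecast but not observed
    correct_negatives: event neither forecast nor observed
    """
    pod = hits / (hits + misses)                          # probability of detection
    far = false_alarms / (hits + false_alarms)            # false alarm ratio
    freq_bias = (hits + false_alarms) / (hits + misses)   # frequency bias
    return {"pod": pod, "far": far, "frequency_bias": freq_bias}

# Toy example: 20 hits, 5 misses, 10 false alarms, 65 correct negatives
print(categorical_scores(20, 5, 10, 65))
# pod = 0.8, far ~ 0.33, frequency_bias = 1.2
```

A frequency bias above 1, as here, indicates the event is forecast more often than it is observed; spatial verification methods extend such scores to account for displacement and scale errors.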