FV3GFS Development Status
Vijay Tallapragada Chief, Modeling and Data Assimilation Branch
Environmental Modeling Center National Centers for Environmental Prediction
With inputs from Fanglin Yang, Jun Wang, Rahul Mahajan and Rusty Benson
WCOSS Science Quarterly, May 31, 2017.
Outline
FV3GFS Implementation Time Line
Installing Standalone FV3 in GFS Superstructure
NEMS FV3GFS – CAP, IPDv4, Write Component
Sensitivity Experiments – Dycore Precision, Model Vertical Extent, Hydrostatic versus Non-Hydrostatic
Forecast-Only Experiments – Comparison with Q3FY17 NEMS GFS
FV3GFS Development/ Implementation Schedule
FV3GFS DA Development/ Implementation Schedule
FV3GEFS Development/ Implementation Schedule
FV3GFS As a Community Model: Version 0 Code Release 05/15/17
Configuration: NEMS + FV3_CAP + FV3_Dycore + IPDv4 + GFS_Physics
Same model used for the Phase-2 dycore comparison, with physics upgraded to the Q3FY17 GFS configuration.
Resolution: C96 (~100 km), C384 (~25 km), C768 (~13 km)
Build the model: on WCOSS, Theia and Jet, with pre-installed libraries and utilities
Data: initial conditions for selected cases, and fixed fields
Release Date: May 15, 2017
Method of Release: VLab Git; EMC Subversion
Running the model: simple shell script and configuration files
Post Processing: Fregrid and Remap tools to convert 6-tile model output to a global lat-lon grid with user-defined resolution
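The cubed-sphere resolution names above can be sanity-checked with a little arithmetic: a CN grid has 6·N·N cells (six tiles of N×N), and the grid spacing is roughly one quarter of the Earth's circumference divided by N. The following sketch (illustrative, not part of the released code) reproduces the quoted ~100 km / ~25 km / ~13 km figures:

```python
# Back-of-envelope check of the cubed-sphere resolution names.
# A CN grid has 6*N*N cells; spacing is ~ (Earth circumference / 4) / N.

EARTH_CIRCUMFERENCE_KM = 40075.0

def cells(n):
    """Total number of grid cells on a CN cubed-sphere grid (6 tiles of n x n)."""
    return 6 * n * n

def spacing_km(n):
    """Approximate grid spacing in km for a CN grid."""
    return EARTH_CIRCUMFERENCE_KM / (4 * n)

for n in (96, 384, 768):
    print(n, cells(n), round(spacing_km(n)))
# C96 -> ~104 km (~100 km), C384 -> ~26 km (~25 km), C768 -> ~13 km
```

Note that cells(768) gives 3,538,944 points, matching the C768 point count quoted later in the run-time tests.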
NOAA Virtual Lab (VLab) to host FV3GFS Code Release
Access FV3GFS Project on VLab
https://vlab.ncep.noaa.gov/web/fv3gfs
Code repositories set up on VLab GIT & EMC Subversion
Community Wiki page, Forums and Developers Pages on VLab
Case Studies:
• Sept. 29, 2016 Hurricane Matthew
• Jan. 18, 2016 East Coast Blizzard
• Aug. 12, 2016 Louisiana Flooding
Model Resolutions: C96 (~100km), C384 (~25km) or C768 (~13km)
• Limited support from EMC to run FV3GFS forecast only experiments on WCOSS, Theia and Jet
• Unified Community Research and Operations Workflow (CROW) under development
Physics: Two-Stream Strategy
Physical Processes | Operational Physics (Q3FY17 GFS) | Advanced Physics* (using IPDv4)
Radiation | RRTMG/McICA | RRTMG (scale and aerosol aware, w/ sub-grid scale clouds)
Penetrative and shallow convection | Scale-Aware SAS & Mass-Flux Shallow Convection | Scale-aware Chikira-Sugiyama & Arakawa-Wu; Grell-Freitas
Turbulent transport (PBL) | Hybrid EDMF | CS+SHOC (unified convection & turbulence); TKE-based moist EDMF
Cloud microphysics | Zhao-Carr | Double-moment schemes (Morrison-Gettelman; Thompson)
Gravity wave drag | Orographic GWD; stationary convective GWD | Unified representation of GWD
Ozone physics | NRL simplified scheme | Modified NRL scheme
Land surface model (LSM) | Noah | Noah and LIS
SST | NSST | NSST
*Includes aerosol chemistry (NGAC) module
Connect the pieces together for operational implementation (Q2FY19)
End-to-end operational configuration for NGGPS implementation:
o Adopt Unified Post-Processing (UPP), product generation and verification software (VSDB/MET)
o Conduct thorough analysis of scientific and computational performance with full 3-year retrospective and real-time experiments (MEG, stakeholder evaluation and feedback)
o Test and evaluate the impact on upstream and downstream production suite dependencies
Initial implementation configuration for FV3GFS in FY19
• Planned/Projected FY19 FV3GFS configuration
• Resolution: ~9 km, 128 levels
• Physics: new physics options implemented in FY18, tuned for FV3
  • Scale- and aerosol-aware Chikira-Sugiyama convection scheme with Arakawa-Wu extension
  • Unified representation of turbulence and shallow convection (SHOC)
  • Double-moment microphysics
  • Upgraded LSM, radiation, GWD and ozone physics
• DA configuration: similar to FY18 NEMS/GSM GDAS, with additional developments required for the FV3 dynamic core and new datasets (GOES-R, JPSS, etc.)
• Run times optimized for production suite requirements
• End-to-end system testing for stability, robustness of scientific and technical solutions, and non-negative impact on downstream dependencies
• Modern workflow (CROW* for development and T&E; ecFlow for production)
FV3GFS CAP and Write Component: ESMF-Based NEMS FV3GFS Object-Oriented Design
• NEMS is based on ESMF and follows the NUOPC convention
• A numerical model in NEMS is implemented as an ESMF grid component
• Each ESMF grid component has its own internal state and internal methods
(Component stack: FV3 Dynamical Core; IPD; CCPP)
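The grid-component idea can be illustrated with a minimal sketch. This is not the actual NEMS/ESMF API (the real components are Fortran and follow the full NUOPC phase protocol); the class and method names below are purely illustrative of the pattern: a component exposes only initialize/run/finalize and keeps its fields in a private internal state.

```python
# Hypothetical sketch of an ESMF-style grid component (illustrative only;
# the real NEMS components are Fortran and use the NUOPC phase protocol).

class GridComponent:
    """Minimal stand-in for an ESMF grid component with private internal state."""

    def __init__(self, name):
        self.name = name
        self._internal_state = {}          # visible only to this component

    def initialize(self, clock):
        self._internal_state["step"] = 0
        self._internal_state["dt"] = clock["dt"]

    def run(self):
        # Advance the component's own state; the driver never touches it directly.
        self._internal_state["step"] += 1

    def finalize(self):
        return self._internal_state["step"]

# A driver (like the NEMS main component) only invokes the three phases:
fv3 = GridComponent("fv3gfs")
fv3.initialize({"dt": 225.0})              # 3.75-minute step, as in the C768 runs
for _ in range(4):
    fv3.run()
assert fv3.finalize() == 4
```

The design point is encapsulation: the driver schedules phases and exchanges fields through defined interfaces, while each model's internal state stays private to its component.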
IPDv4 -- Interoperable Physics Driver
• Designed to be lightweight and simple
• Works with different physics packages
• Name is a bit of a misnomer
• Not a driver, but an aliasing layer
• radiation and physics aliased to generic “steps”
• data grouped into containers based on purpose
• Self-describing data for I/O-related elements
• diagnostic data
• restart data
The physics package included in IPDv4 is the same as the Q3FY17 NEMS GFS physics, which is scheduled for operational implementation in July 2017. GFDL delivered IPDv4 to EMC in early March 2017. The code was subsequently updated to add NSST cold- and warm-start capability, make the output frequency flexible, and ensure warm-start reproducibility.
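The "aliasing layer" idea can be sketched in a few lines. This is not the real IPDv4 interface (which is Fortran); the names below are hypothetical, but they show the two design points from the bullets above: concrete schemes are aliased to generic steps, and data are grouped into purpose-based containers (state, diagnostics, restart).

```python
# Hypothetical sketch of the IPDv4 design (illustrative names, not the real API):
# physics routines are aliased to generic "steps"; data live in purpose-based
# containers so I/O-related elements (diagnostics, restart) are self-describing.

steps = {}

def register(step_name, func):
    """Alias a concrete scheme to a generic step name."""
    steps[step_name] = func

def run_step(step_name, containers):
    """The 'driver' does no science: it just dispatches to the aliased scheme."""
    return steps[step_name](containers)

# Purpose-based data containers (plain dicts here stand in for DDTs):
containers = {
    "statein": {"t": 288.0},   # model state handed to physics
    "diag":    {},             # diagnostic output fields
    "restart": {},             # fields needed for warm restart
}

def radiation(c):
    c["diag"]["heating_rate"] = 1.5   # fake diagnostic for illustration
    return c

register("radiation_step", radiation)
out = run_step("radiation_step", containers)
```

Because the layer only aliases names and groups data, swapping an operational scheme for an advanced one is a registration change, not a driver change.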
Resolution, Physics Grid, and Run-time on Cray
10-day forecast, 6-hourly output, 3.75-minute time step; C768 (~13 km), 3,538,944 points

Hydro/non-hydro | Precision | Threads | Nodes | CPU (min/10 days)
Non-hydro | 32-bit | 2 | 64 | 89
Non-hydro | 64-bit | 2 | 64 | 137
Non-hydro | 64-bit | 2 | 144 | 69
Non-hydro | 64-bit | 4 (Hyper-Thread) | 64 | 135
Hydro | 64-bit | 2 | 64 | 95
Hydro | 64-bit | 2 | 144 | 51

Reference: T1534 NEMS GFS (~13 km, 3072x1536), 61 nodes, 73 minutes
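Two numbers quoted on these slides can be reproduced directly from the table: the ~35% cost saving of the 32-bit dycore (89 vs. 137 min at 2 threads, 64 nodes), and the step count implied by a 3.75-minute time step over 10 days.

```python
# Reproducing the arithmetic behind the run-time table (C768, non-hydrostatic,
# 2 threads, 64 nodes): 89 min per 10-day forecast at 32-bit vs 137 min at 64-bit.
t32, t64 = 89.0, 137.0
saving = (1.0 - t32 / t64) * 100.0
print(round(saving))       # -> 35 (the "35% gain" cited in the Test #1 summary)

# Number of dynamics steps in a 10-day forecast with a 3.75-minute time step:
steps_per_forecast = 10 * 24 * 60 / 3.75
print(int(steps_per_forecast))   # -> 3840
```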
Test #1: Comparison of Forecasts with 32-bit and 64-bit Precision FV3 Dycore
32-bit dycore precision has:
• Lower HGT AC at 1000 hPa and 500 hPa in the SH
• Lower SLP AC in both NH and SH
• Higher wind RMSE near the SH and tropical tropopause
• Larger (colder) temperature RMSE in the NH and tropical stratosphere, and in the SH stratosphere and troposphere
• No impact on CONUS precipitation ETS and bias scores
• No impact on hurricane track error scores
The ~35% computing-cost saving must be weighed against the small accuracy gain of 64-bit; FV3GFS will probably need to run at different precision for different applications.
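For reference, the two verification metrics quoted throughout these tests can be written down concisely. The definitions below are the standard ones (centered anomaly correlation against a climatology, and root-mean-square error); the array values are made up for illustration, not taken from the experiments.

```python
import numpy as np

# Standard definitions of the scores used on these slides: anomaly
# correlation (AC) against a climatology, and RMSE against the analysis.

def anomaly_correlation(fcst, anal, clim):
    """Centered anomaly correlation of forecast vs analysis, relative to climatology."""
    fa = fcst - clim                 # forecast anomaly
    aa = anal - clim                 # analysis anomaly
    fa = fa - fa.mean()              # center the anomalies
    aa = aa - aa.mean()
    return (fa * aa).sum() / np.sqrt((fa ** 2).sum() * (aa ** 2).sum())

def rmse(fcst, anal):
    """Root-mean-square error of forecast vs analysis."""
    return np.sqrt(((fcst - anal) ** 2).mean())

# A perfect forecast scores AC = 1 and RMSE = 0 (values here are illustrative):
anal = np.array([5420.0, 5500.0, 5640.0, 5710.0])   # e.g. 500 hPa heights (m)
clim = np.array([5400.0, 5520.0, 5600.0, 5700.0])
assert abs(anomaly_correlation(anal, anal, clim) - 1.0) < 1e-12
assert rmse(anal, anal) == 0.0
```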
Test #2: Comparison of GFS Forecasts with 63 and 64 Model Vertical Layers
• Reduced wind and temperature RMSE in the upper stratosphere
• Stronger polar night jets, and improved meridional wind in the
stratosphere
• No significant impact on HGT AC at 500hPa and 1000hPa
• No significant impact on SLP AC
• No impact on CONUS precipitation ETS and Bias scores
• No impact on hurricane track error scores.
More tests are needed to make the model run stably with a longer time step
Test #3: Hydrostatic vs. Non-hydrostatic
Three experiments:
• Prfv3gfsb: non-hydrostatic, non-mono
• Prfv3gfsc: hydrostatic, non-mono
• Prfv3gfsd: hydrostatic, mono
All ran at 64-bit, C768 resolution, with operational GFS ICs for the 01 Oct 2016 – 06 Nov 2016 period.
Findings:
• Hydro better than non-hydro for 500 mb ACC
• Hydro is colder than non-hydro in the troposphere
• The difference between hydro and non-hydro in the stratosphere is small
• Hydro is too cold near the tropopause, while non-hydro is too warm
• Hydro is worse than non-hydro for short-range CONUS precip forecasts
• Non-hydro CAPE is ~35% less than ops GFS CAPE over the tropics
• Hydro CAPE is much larger than non-hydro CAPE
Benchmark Test of NEMS FV3GFS with IPDv4 and NSST
• Model: NEMS FV3GFS + CAP + IPDv4 + NSST, non-hydrostatic, non-mono, 32-bit
• Physics: Q3FY17 NEMS GSM physics, which is the same as the current operational GFS physics except for the IPDv4 driver, new NSST model, updated convection, new high-resolution MODIS land datasets and a few other minor updates
• Resolution: L63 (top at ~1.0 hPa), C768 (~13 km)
• Initial Conditions: Q3FY17 NEMS GSM (global spectral model) ICs, converted to the FV3 grid using CHGRES
• Period: JJA 2016 and DJF 2016/2017
• Forecast Length: 240 hours, 3-hourly output
• Control: NEMS GSM parallels
http://www.emc.ncep.noaa.gov/gmb/wx24fy/NGGPS/fv3ipd4/
http://www.emc.ncep.noaa.gov/gmb/wx24fy/NGGPS/fv3ipd4_jja/
Day-5 Zonal Mean Height (JJA and DJF; GSM and FV3 forecast minus analysis): compared to the analyses, GSM is too cold in the stratosphere while FV3 is too warm.
Day-5 Zonal Mean Cloud Water (JJA and DJF; GSM and FV3 forecast minus analysis): FV3GFS has more cloud water than GSM.
500 hPa HGT AC: FV3GFS slightly better than GSM
Precip Skill Scores
T2m verified against station obs over the Northern Great Plains (JJA and DJF)
Surface verification against obs showed that FV3GFS and NEMS GSM have similar skill over all sub-regions of CONUS and Alaska in both seasons.
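The precipitation scores referenced in these evaluations (ETS and frequency bias) come from a standard 2x2 contingency table. The sketch below uses the conventional definitions; the counts are invented for illustration, not from the FV3GFS verification.

```python
# Standard precipitation skill scores from a 2x2 contingency table
# (hits, misses, false alarms, correct negatives). Example counts are invented.

def ets(hits, misses, false_alarms, correct_negatives):
    """Equitable threat score: threat score corrected for random hits."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

def frequency_bias(hits, misses, false_alarms):
    """Forecast frequency / observed frequency; 1.0 means unbiased."""
    return (hits + false_alarms) / (hits + misses)

# Example: 50 hits, 20 misses, 30 false alarms, 900 correct negatives
print(round(ets(50, 20, 30, 900), 2))            # -> 0.47
print(round(frequency_bias(50, 20, 30), 2))      # -> 1.14 (slight over-forecast)
```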
Ongoing Activities
• Developing NEMS FV3GFS Write Component using ESMF regridding
• Developing tools for changing resolution on FV3 native grid
• Developing unified workflow for development and production
• Developing 64-L and 128-L FV3GFS
• Developing cycled data assimilation capability
• Developing nesting capability for regional high-resolution application
• Developing and testing advanced physics suites for FV3GFS – microphysics, convection, PBL, gravity wave drag, etc.
• Developing unified post-processing and product generation
• Continuous scientific evaluation
• Creating community modeling framework for collaborations and support
THANK YOU