Page 1

Development of neural network emulations of model physics

components for improving the computational performance of the NCEP seasonal climate forecasts

P.I.s: M. Fox-Rabinovitz, V. Krasnopolsky, Co-I.s: S. Lord, Y.-T. Hou,

Collaborator: A. Belochitski, CTB contact: H.-L. Pan

Acknowledgments: The authors would like to thank Drs. H.-L. Pan, S. Saha, S. Moorthi, and M. Iredell

for their useful consultations and discussions. The research is supported by the NOAA CPO CDEP CTB grant NA06OAR4310047.

NOAA CDEP CTB SAB Meeting, August 28-29, 2007, Silver Spring MD

Page 2

OUTLINE

• Neural Network applications to the CFS model: current methodological developments and development of NN emulations for the LWR of the CFS model

• Background information on the NN approach

• Development of NN emulations of LWR (Long-Wave Radiation) for the CFS model and evaluation of their accuracy vs. the original LWR

• Initial validation of NN emulations of LWR through the CFS model run using the LWR NN emulations vs. the control run using the original LWR

• Conclusions and plans

Page 3

Current Developments

– Development of the NN methodology for LWR NN emulations for CFS

– Development of the NN experimentation and validation framework

– Creation of training and validation data sets from the 2-year CFS model simulations

– Development of NN emulation versions for LWR and validation of their accuracy vs. the original LWR

– Initial validation of NN emulations through CFS model runs with the NN emulations vs. the control CFS model run with the original LWR

– Analysis of initial results for seasonal predictions, short- to medium-range forecasts, and climate simulations

Page 4

Background

• Any parameterization of model physics is a relationship or MAPPING (continuous or almost continuous) between two vectors: a vector of input parameters, X, and a vector of output parameters, Y:

  Y = F(X);  X ∈ R^n,  Y ∈ R^m

• An NN is a generic approximation for any continuous or almost continuous mapping given by a set of its input/output records:

  SET = {Xi, Yi},  i = 1, …, N
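The mapping and its training set can be sketched in NumPy with a toy stand-in parameterization (the function, dimensions, and sampling range here are purely illustrative, not the actual LWR code):

```python
import numpy as np

# Toy stand-in for an original physics parameterization F: R^n -> R^m.
# (Hypothetical example; the real LWR scheme has 591 inputs and 69 outputs.)
def original_parameterization(x):
    return np.array([np.sum(np.sin(x)), np.prod(np.cos(x[:3]))])

rng = np.random.default_rng(0)
n, N = 5, 1000                        # input dimension, number of records
X = rng.uniform(-1.0, 1.0, (N, n))    # sampled input vectors X_i
Y = np.array([original_parameterization(x) for x in X])  # outputs Y_i

# SET = {X_i, Y_i}, i = 1..N -- the training set for an NN emulation
assert X.shape == (N, n) and Y.shape == (N, 2)
```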

Page 5

Neural Network

Y = FNN(X)

Continuous Input to Output Mapping

  y_q = a_q0 + Σ_{j=1..k} a_qj · t_j,  q = 1, …, m        (linear output neurons)

  t_j = tanh(b_j0 + Σ_{i=1..n} b_ji · x_i)                (nonlinear hidden neurons)
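The two equations above can be written as a few lines of NumPy (the weights below are random placeholders, not a trained emulation; dimensions are illustrative):

```python
import numpy as np

def mlp_emulation(x, a0, a, b0, b):
    """y_q = a_q0 + sum_j a_qj * tanh(b_j0 + sum_i b_ji * x_i)."""
    t = np.tanh(b0 + b @ x)   # hidden neurons t_j, j = 1..k
    return a0 + a @ t         # linear output layer, q = 1..m

# Small random network: n = 4 inputs, k = 3 hidden neurons, m = 2 outputs.
rng = np.random.default_rng(1)
n, k, m = 4, 3, 2
b, b0 = rng.standard_normal((k, n)), rng.standard_normal(k)
a, a0 = rng.standard_normal((m, k)), rng.standard_normal(m)
y = mlp_emulation(rng.standard_normal(n), a0, a, b0, b)
assert y.shape == (m,)
```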

Page 6

Major Advantages of NNs:

• NNs are generic, accurate, and convenient mathematical (statistical) models able to emulate numerical model components that are complicated nonlinear input/output relationships (continuous or almost continuous mappings).

• NNs are robust with respect to random noise and fault-tolerant.

• NNs are analytically differentiable (useful for training, error, and sensitivity analyses): an almost free Jacobian!

• NN emulations are accurate and fast, but there is NO FREE LUNCH: training is a complicated and time-consuming nonlinear optimization task. However, training needs to be done only once for a particular application.

• NNs are well suited for parallel and vector processing.
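The "almost free Jacobian" follows from differentiating the MLP formula analytically: dy_q/dx_i = Σ_j a_qj · (1 − t_j²) · b_ji. A sketch with random placeholder weights (illustrative dimensions, not the actual emulation), checked against finite differences:

```python
import numpy as np

def mlp_with_jacobian(x, a0, a, b0, b):
    t = np.tanh(b0 + b @ x)           # hidden neurons
    y = a0 + a @ t                    # outputs
    # dy_q/dx_i = sum_j a_qj * (1 - t_j^2) * b_ji  (since tanh' = 1 - tanh^2)
    J = (a * (1.0 - t**2)) @ b
    return y, J

# Check the analytic Jacobian against finite differences.
rng = np.random.default_rng(2)
n, k, m = 4, 3, 2
b, b0 = rng.standard_normal((k, n)), rng.standard_normal(k)
a, a0 = rng.standard_normal((m, k)), rng.standard_normal(m)
x = rng.standard_normal(n)
y, J = mlp_with_jacobian(x, a0, a, b0, b)
eps = 1e-6
J_fd = np.empty((m, n))
for i in range(n):
    dx = np.zeros(n); dx[i] = eps
    J_fd[:, i] = (mlp_with_jacobian(x + dx, a0, a, b0, b)[0] - y) / eps
assert np.allclose(J, J_fd, atol=1e-4)
```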

Page 7

NN Emulations of Model Physics Parameterizations: Learning from Data

[Schematic: inside the GCM, the original parameterization F maps X → Y. Its NN emulation FNN maps the same X → Y, after being trained on the set {…, {Xi, Yi}, …} with Xi ∈ Dphys (the physical domain of the inputs).]

Page 8

CFS Model: LWR NN emulation

NN dimensionality and other parameters:

• 591 inputs: 12 variables (pressure, T, moisture, cloudiness parameters, surface emissivity, gases (ozone, CO2))

• 69 outputs: 6 variables (heating rates, fluxes)

• Number of neurons for NN versions: 50 to 150

• NN dimensionality for the complex system: 50,000 to 100,000

• Training and testing data sets are produced by saving inputs and outputs of LWR during 2-year T126L64 CFS simulations; half of the data is used for training and the other half for validation, i.e., NN accuracy estimation vs. the original LWR
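The half/half split described above might look like this in NumPy (synthetic records stand in for the saved LWR inputs/outputs; only the 591/69 dimensions come from the slide):

```python
import numpy as np

# Synthetic stand-ins for records saved during the 2-year CFS simulations.
rng = np.random.default_rng(3)
N = 10000
X = rng.standard_normal((N, 591))  # 591 LWR inputs per record
Y = rng.standard_normal((N, 69))   # 69 LWR outputs per record

# Shuffle, then use half for training and the other half for validation.
idx = rng.permutation(N)
train, valid = idx[: N // 2], idx[N // 2 :]
X_train, Y_train = X[train], Y[train]
X_valid, Y_valid = X[valid], Y[valid]
assert len(train) + len(valid) == N and not set(train) & set(valid)
```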

Page 9

NN Approximation Accuracy (on an independent data set) vs. the Original Parameterization (all in K/day)

Parameterization: LWR; Mean HR = -1.88, σ HR = 2.28

NN version   Bias      RMSE
NN75         1.·10⁻³   0.40
NN85         3.·10⁻³   0.39
NN95         4.·10⁻³   0.38

NN Computational Performance: LWR NN emulations are two orders of magnitude faster than the original LWR

Overall CFS model computational performance: ~25-30% faster when using LWR NN emulations vs. the original LWR
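The bias and RMSE statistics in the table above can be computed as in this sketch (synthetic heating-rate profiles, not the actual CFS data; the level count is an assumption):

```python
import numpy as np

def bias_and_rmse(hr_original, hr_emulated):
    """Bias and RMSE of emulated heating rates vs. the original scheme (K/day)."""
    err = hr_emulated - hr_original
    return err.mean(), np.sqrt((err ** 2).mean())

# Synthetic profiles standing in for original vs. NN-emulated heating rates.
rng = np.random.default_rng(4)
hr_orig = rng.standard_normal((1000, 64))   # e.g. 1000 profiles x 64 levels
hr_nn = hr_orig + 0.01 + 0.4 * rng.standard_normal(hr_orig.shape)
bias, rmse = bias_and_rmse(hr_orig, hr_nn)
```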

Page 10

Individual HR Profiles

Page 11

Top of Atmosphere Upward LWR Flux Global Seasonal and Daily Differences

Page 12

T-850 Global Seasonal & Daily Temperatures and their Differences

Max Difference = 0.06 K

Max Difference = 0.1 K

Page 13

Surface Downward LWR Flux Differences

Season 2: 0–5 W/m², max 10–20 W/m²
Season 4: 0–5 W/m², max 10–20 W/m²

Page 14

Top of Atmosphere Upward LWR Flux Differences

Season 2: 0–5 W/m², max 10–20 W/m²
Season 4: 0–5 W/m², max 10–20 W/m²

Page 15

T Zonal Mean Differences

Season 1: 0–0.5 K, max 1.5–2 K
Season 2: 0–0.5 K, max 1.5–2.5 K
Season 3: 0–0.5 K, max 1.5–2 K
Season 4: 0–0.5 K, max 2–3 K

Page 16

U Zonal Mean Differences

Season 1: 0–1 m/s, max 3–4 m/s
Season 2: 0–1 m/s, max 2 m/s
Season 3: 0–1 m/s, max 2 m/s
Season 4: 0–1 m/s, max 2 m/s

Page 17

V Zonal Mean Differences

Season 1: 0–0.1 m/s, max 0.2–0.4 m/s
Season 2: 0–0.1 m/s, max 0.2–0.3 m/s
Season 3: 0–0.1 m/s, max 0.2–0.4 m/s
Season 4: 0–0.1 m/s, max 0.2–0.4 m/s

Page 18

T-500 differences
Season 1: 0–1 K, max 2–3 K
Season 2: 0–1 K, max 2–4 K
Season 3: 0–1 K, max 2–3 K
Season 4: 0–1 K, max 2–3 K

Page 19

Day Two: Upward Top of Atmosphere LWR Flux. Differences near 0–2 W/m², with a few minor maxima of 10–20 W/m²

Orig. - NN

Page 20

Day Two: T-850 Differences near 0 – 0.2 K, max 0.5 – 1.5 K

Page 21

Day Two: U-850. Differences 0–0.1 m/s, max 0.5–1 m/s

Page 22

Day Seven: T-850. Differences near 0–1.0 K, max 2–3 K

Page 23

2-Year Mean OLR (Upward Top of Atmosphere LWR Flux), in W/m². Differences 0–5 W/m², max 10–20 W/m²

Page 24

Recent Journal and Conference Papers

Journal Papers:

• V.M. Krasnopolsky, M.S. Fox-Rabinovitz, and A. Belochitski, 2007, “Compound Parameterization for a Quality Control of Outliers and Larger Errors in NN Emulations of Model Physics”, Neural Networks, submitted

• V.M. Krasnopolsky, 2007, “Neural Network Emulations for Complex Multidimensional Geophysical Mappings: Applications of Neural Network Techniques to Atmospheric and Oceanic Satellite Retrievals and Numerical Modeling”, Reviews of Geophysics, in press

• V.M. Krasnopolsky, 2007: “Reducing Uncertainties in Neural Network Jacobians and Improving Accuracy of Neural Network Emulations with NN Ensemble Approaches”, Neural Networks, 20, pp. 454-46

• V.M. Krasnopolsky and M.S. Fox-Rabinovitz, 2006: "Complex Hybrid Models Combining Deterministic and Machine Learning Components for Numerical Climate Modeling and Weather Prediction", Neural Networks, 19, 122-134

• V.M. Krasnopolsky and M.S. Fox-Rabinovitz, 2006: "A New Synergetic Paradigm in Environmental Numerical Modeling: Hybrid Models Combining Deterministic and Machine Learning Components", Ecological Modelling, v. 191, 5-18

Conference Papers:

• V.M. Krasnopolsky, M. S. Fox-Rabinovitz, Y.-T. Hou, S. J. Lord, and A. A. Belochitski, 2007, “Development of Fast and Accurate Neural Network Emulations of Long Wave Radiation for the NCEP Climate Forecast System Model”, submitted to the NOAA 32nd Annual Climate Diagnostics and Prediction Workshop

• V.M. Krasnopolsky, M. S. Fox-Rabinovitz, Y.-T. Hou, S. J. Lord, and A. A. Belochitski, 2007, “Accurate and Fast Neural Network Emulations of Long Wave Radiation for the NCEP Climate Forecast System Model”, submitted to 20th Conference on Climate Variability and Change, New Orleans, January 2008

• M. S. Fox-Rabinovitz, V. Krasnopolsky, and A. Belochitski, 2006: “Ensemble of Neural Network Emulations for Climate Model Physics: The Impact on Climate Simulations”, Proc., 2006 International Joint Conference on Neural Networks, Vancouver, BC, Canada, July 16-21, 2006, pp. 9321-9326, CD-ROM

Page 25

Conclusions

• The developed NN emulations of LWR for the CFS model show high accuracy and computational efficiency.

• Initial validation of the NN emulations for LWR through CFS model runs using the NN emulations vs. the control CFS model run with the original LWR shows a close similarity of the runs: the differences are mostly within observational errors or the uncertainty of observational data or reanalyses.

• For seasonal predictions: differences do not grow from season 1 to season 4 and are mostly within observational errors or the uncertainty of observational data or reanalyses.

• For climate simulations: differences are mostly within observational errors or the uncertainty of observational data or reanalyses.

• For short- to medium-term forecasts: differences are only a small fraction (at Day 2) to a modest fraction (at Day 7) of observational errors or the uncertainty of observational data or reanalyses.

Potential applications to GFS and/or DAS?

Page 26

Near-term (FY07 and Year 2 of the project) science plans

• Completing work on the LWR NN emulation
  – Generating more representative data sets
  – Continuing training and validation of LWR NN emulations for the CFS model
  – Continuing experimentation and validation of seasonal climate predictions with LWR NN emulations

• Refining the NN methodology for emulating model physics
  – Work on an NN ensemble approach aimed at improving the accuracy of NN emulations
  – Developing a compound parameterization for quality control (QC) and for dynamical adjustment of the NN emulations

• Refining the experimentation and validation framework

• Continuing development of NN emulations for the CFS model radiation block
  – Analysis of CFS SWR, generating initial training data sets, and developing initial SWR NN emulations
  – Initial experiments with the SWR NN emulation for the CFS model

• Initial development of the project web site and linking it to relevant NCEP and/or CTB web sites

Page 27

Future (FY08 and Year 3 of the project) science plans

• Completing work on the SWR NN emulation for the CFS model
  – Training and validation of SWR NN emulations
  – Performing seasonal climate predictions with the SWR NN emulation

• Performing extensive CFS seasonal climate predictions with the developed NN emulations for the full CFS radiation block (LWR & SWR), and validating their overall impact on computational efficiency

• Completing the transition of the developed NN radiation products into the operational NCEP CFS

• Completing development of the project web site and using it for interactions with potential NCEP users and other users in the educational and research communities

• Preparing for future developments: other CFS model physics components, and potential applications to other NCEP systems such as GFS and climate predictions

Page 28

Additional Plots

Page 29

5/31/2007; GFDL V. Krasnopolsky & M. Fox-Rabinovitz, Neural Networks for Model Physics

NN – Continuous Input to Output Mapping
Multilayer Perceptron: Feed Forward, Fully Connected

[Schematic: input layer X = (x1, …, xn) feeds a hidden layer of k nonlinear (tanh) neurons t1, …, tk, which feeds an output layer of m linear neurons Y = (y1, …, ym); Y = FNN(X). Each neuron combines a linear part (a weighted sum) with a nonlinear part (tanh); the analytic Jacobian follows directly from this form.]

  y_q = a_q0 + Σ_{j=1..k} a_qj · t_j = a_q0 + Σ_{j=1..k} a_qj · tanh(b_j0 + Σ_{i=1..n} b_ji · x_i),  q = 1, 2, …, m

Neuron:
  t_j = tanh(b_j0 + Σ_{i=1..n} b_ji · x_i) = φ(s_j),  where s_j = X^T · B_j + b_j0 and φ = tanh

Page 30

RH Zonal Mean Differences

Season 1: 0–1%, max 4–5%
Season 4: 0–2%, max 4–6%

Page 31

T-850 differences
Season 1: 0–1 K, max 2–4 K
Season 2: 0–1 K, max 2–4 K
Season 3: 0–1 K, max 2–4 K
Season 4: 0–1 K, max 2–3 K

Page 32

Day Two: Surface Upward LWR Flux. Differences near 0–2 W/m², with a few minor maxima of 10–30 W/m²

Orig. - NN

