Translating verification experience from meteorology to space weather Suzy Bingham ESWW splinter session, Thurs. 20 th Nov.
Page 1: Translating verification experience from meteorology to space weather Suzy Bingham ESWW splinter session, Thurs. 20 th Nov.

Translating verification experience from meteorology to space weather
Suzy Bingham

ESWW splinter session, Thurs. 20th Nov.

Page 2

Current space weather validation, verification & metrics

• Validation: plans to work with CCMC for TEC validation.

• Verification: plan to apply experience with terrestrial methods to verify models & forecaster warnings.

• Application metrics/KPIs: have extended terrestrial infrastructure to monitor space weather systems. Plan to verify for stakeholders.

Met Office forecaster webpages: Enlil & REFM.

Page 3

Verification of forecasts

• Why verify: to understand quality/accuracy, to provide evidence for stakeholders, to give forecasters feedback, to drive further improvement, and to compare models/methods.

• Space weather forecasters produce guidance twice daily. These forecasts include probability forecasts for geomagnetic storms, X-ray flares, high-energy protons & high-energy electrons.

Example probability forecasts
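One standard way to verify such probability forecasts is the Brier score (the mean squared difference between forecast probability and outcome). A minimal sketch, using made-up storm probabilities and outcomes purely for illustration:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and the 0/1 outcome.
    0 is perfect; the climatological forecast gives a reference for skill."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Illustrative (made-up) daily geomagnetic-storm probabilities and observed outcomes.
forecast_probs = [0.1, 0.4, 0.7, 0.2, 0.9]
observed = [0, 0, 1, 0, 1]
bs = brier_score(forecast_probs, observed)  # lower is better
```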

Page 4

Verification of forecasts

Metrics/performance indicators used in terrestrial weather forecasting verification for stakeholders (with examples):

1. Severe weather warning accuracy (score: impact level, area covered & validity time)

2. Forecast accuracy (e.g. daily minimum temp accuracy should be +/-2°C)

3. Public value (“How useful are forecasts these days?”)

4. Public reach (“Have you seen any weather warnings in the last few days?”)

5. Service quality (Timeliness scores for model delivery.)

6. Emergency responder value (“How satisfied are you with the service?”)

7. Responder reach (Availability of Hazard Manager application to emergency responder community)

8. National capability (95% of lightning location data messages should be available within a certain time.)

9. Milestone achievement (Develop a national electronic weather data archive)
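A threshold-based accuracy measure like item 2 reduces to the fraction of forecasts falling within the tolerance. A sketch, with hypothetical forecast and observation values (the ±2°C tolerance is from the example above):

```python
def fraction_within(forecasts, observations, tol=2.0):
    """Fraction of forecasts within +/- tol of the observed value."""
    hits = sum(1 for f, o in zip(forecasts, observations) if abs(f - o) <= tol)
    return hits / len(forecasts)

# Hypothetical daily-minimum-temperature forecasts vs observations (deg C).
fc = [3.0, -1.0, 5.5, 0.0]
ob = [4.5, -4.0, 5.0, 1.9]
acc = fraction_within(fc, ob)  # 3 of 4 within 2 deg C -> 0.75
```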

Page 5

© Crown copyright Met Office

Verification of warnings

[Diagram: warning timeline from issue time through the warning period to end time plus a late-hit period, with low and standard event thresholds. Outcomes are classified as HIT, EARLY HIT, LATE HIT, LOW HIT, LATE LOW HIT, FALSE ALARM, MISS or NON-EVENT, depending on when the event occurs and whether it exceeds the threshold.]

Introducing the warnings verification system
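The classification in the diagram above can be sketched as a simple rule set. This is an illustrative simplification, not the Met Office's actual matching logic: it handles one warning against one event, omits early hits and spatial matching, and the time units and threshold handling are assumptions:

```python
def classify_warning(start, end, late_window, event_time, event_level, threshold):
    """Classify one warning against one observed event, in the spirit of the
    timeline diagram (simplified). Times are in arbitrary units; event_time
    of None means no event was observed."""
    if event_time is None:
        return "FALSE ALARM"              # warning issued, nothing happened
    in_period = start <= event_time <= end
    in_late = end < event_time <= end + late_window
    if not (in_period or in_late):
        return "MISS"                     # event fell outside the warning window
    if event_level < threshold:
        return "LATE LOW HIT" if in_late else "LOW HIT"
    return "LATE HIT" if in_late else "HIT"
```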

Page 6

Gale warning verification

St Jude's Storm, 27-28 October 2013...

... a hit!

Page 7

Performance: ROC plot

Performance of forecaster-issued Heavy Rainfall Alerts in 2012:

[ROC plot: hit rate against false alarm rate for NO FLEX, TEMPORAL FLEX and TEMPORAL + INTENSITY FLEX verification, with the no-skill diagonal; inset bar chart compares ROC area (0.5-1.0) for each.]
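The hit rates and false alarm rates behind such a ROC plot come from a contingency table at each probability threshold, and the ROC area summarises them in one number. A minimal sketch (illustrative data only, not the rainfall-alert results):

```python
def roc_points(probs, outcomes, thresholds):
    """(false alarm rate, hit rate) at each warning threshold:
    the points plotted on a ROC diagram."""
    points = []
    for t in thresholds:
        hits = sum(1 for p, o in zip(probs, outcomes) if p >= t and o)
        misses = sum(1 for p, o in zip(probs, outcomes) if p < t and o)
        fas = sum(1 for p, o in zip(probs, outcomes) if p >= t and not o)
        crs = sum(1 for p, o in zip(probs, outcomes) if p < t and not o)
        hit_rate = hits / (hits + misses) if (hits + misses) else 0.0
        fa_rate = fas / (fas + crs) if (fas + crs) else 0.0
        points.append((fa_rate, hit_rate))
    return points

def roc_area(points):
    """Area under the ROC curve by the trapezium rule; 0.5 is the no-skill diagonal."""
    pts = sorted(set(points) | {(0.0, 0.0), (1.0, 1.0)})
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```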

Page 8

Application metrics: timely, reliable, robust

Timeliness plot for Global Model.

Server monitoring to verify system is robust/reliable.

Page 9

Data latency

• SOHO: EIT (4 channels), LASCO C2 & C3. LASCO C2 & C3: 12 min cadence, ~1-2 hrs latency (but data gaps, sometimes up to 5 hrs).

• SDO: AIA (12 channels), HMI magnetogram.

• STEREO A & B: COR1, COR2, EUVI. Cadence: COR1 1 hr, COR2 15 min, EUVI 10 min; latency 1-3 hrs (but currently very big data gaps / no data).

• ENLIL: run every 2 hours; run completed and graphics produced around 4 hrs after model analysis time (i.e. T=0).

WMO coronagraph image requirements, OSCAR webpage

Page 10

Met Office Business Performance Measures (BPMs)

The Met Office BPMs are: Forecast accuracy, Growth, Reach, Customer & Service Delivery, Efficiency & Sustainability Excellence.

Verification underpins the Forecast Accuracy BPM, which is set by government. This BPM aims to improve:

1. Global NWP Index

2. UK NWP Index

3. Public Forecasts

4. Customer Forecasts

The Global NWP Index is compiled from:
• Mean sea-level pressure,
• 500 hPa height,
• 850 hPa wind,
• 250 hPa wind.

Plot showing the increase in the Global NWP Index, indicating an improvement in global model accuracy.

Page 11

Summary

• Met Office is currently applying some validation, verification & metrics to space weather models/service (e.g. verifying reliability of service).

• Met Office is planning to take part in CCMC TEC challenge with initial longitudinal section case studies over Europe.

• Met Office is planning to adapt terrestrial verification methods to space weather to allow forecasters to understand warning accuracy (e.g. flexible verification system).

• Met Office would like to provide simple metrics to stakeholders.

Page 12

Useful links

• WMO space weather observation requirements, Observing Systems Capability Analysis & Review Tool: http://www.wmo-sat.info/oscar/applicationareas/view/25

• Forecast verification, Centre for Australian Weather & Climate Research: www.cawcr.gov.au/projects/verification/

• Eumetcal training resources: http://www.eumetcal.org/resources/ukmeteocal/temp/msgcal/www/english/courses/msgcrs/crsindex.htm

• NOAA verification metrics glossary: http://www.nws.noaa.gov/oh/rfcdev/docs/Glossary_Verification_Metrics.pdf

Page 13

Questions and answers

Page 14

Verification of forecasts

Metrics/performance indicators used in terrestrial weather forecasting verification for stakeholders:

1. Severe weather warning accuracy

2. Forecast accuracy

3. Public value

4. Public reach

5. Service quality

6. Emergency responder value

7. Responder reach

8. National capability

9. Milestone achievement

0-2 Very Poor: Warning was missed or gave very poor guidance to the customer, perhaps being classed as a “False Alarm”.

3-4 Poor Guidance: Although a warning was issued, it gave poor guidance to the customer.

5-7 Good Guidance: A warning was issued which gave generally good guidance to the customer.

8-9 Excellent Guidance: The warning issued gave excellent guidance to the customer.

Scoring for the quality of severe weather warnings.

Page 15

Performance: reliability

Performance of forecaster-issued Heavy Rainfall Alerts in 2012:

[Reliability diagram: observed relative frequency against forecast probability for NO FLEX, TEMPORAL FLEX and TEMPORAL + INTENSITY FLEX verification, with perfect-reliability, no-skill and no-resolution reference lines; inset histogram shows thousands of warnings issued at each forecast probability (0.3, 0.5, 0.8).]
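A reliability curve of this kind compares each issued probability with the observed relative frequency of the event. A sketch, assuming (as the inset histogram suggests) that warnings are issued at fixed probability categories; the data are invented for illustration:

```python
def reliability_curve(probs, outcomes, categories=(0.3, 0.5, 0.8)):
    """Observed relative frequency of the event for each issued
    probability category, as plotted on a reliability diagram.
    Perfect reliability means curve[p] == p for every category."""
    curve = {}
    for c in categories:
        matched = [o for p, o in zip(probs, outcomes) if abs(p - c) < 1e-9]
        if matched:  # skip categories never issued
            curve[c] = sum(matched) / len(matched)
    return curve

# Invented warnings: two at 0.3, one at 0.5, two at 0.8, with 0/1 outcomes.
curve = reliability_curve([0.3, 0.3, 0.5, 0.8, 0.8], [0, 1, 1, 1, 1])
```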

Page 16

Met Office Business Performance Measures (BPMs)

The Met Office BPMs are: Forecast accuracy, Growth, Reach, Customer & Service Delivery, Efficiency & Sustainability Excellence.

Verification underpins the Forecast Accuracy BPM, which is set by government. The FY13/14 targets for this BPM were to improve:

1. Global NWP Index - 101.88 with a stretch of 102.43.

2. UK NWP Index - 123.05 with a stretch of 123.9.

3. Public Forecasts- 12/17 forecast targets met with a stretch of 17/17.

4. Customer Forecasts- 2/3 forecast targets met with a stretch of 3/3.

The Global NWP Index is compiled from:
• Mean sea-level pressure,
• 500 hPa height,
• 850 hPa wind,
• 250 hPa wind.

Plot showing the increase in the Global NWP Index, indicating an improvement in global model accuracy.

