GAIA-CLIM Measurement Maturity Matrix Guidance
GAIA-CLIM Report / Deliverable D1.3
Gap Analysis for Integrated Atmospheric ECV Climate Monitoring:
Report on system of systems approach adopted and rationale
A Horizon 2020 project; Grant agreement: 640276
Date: 27th November 2015
Lead Beneficiary: NUIM
Nature: R
Dissemination level: PU
Work-package: WP1 (Geographical capabilities mapping)
Deliverable: D1.3
Nature: R
Dissemination: PU
Lead Beneficiary: National University of Ireland Maynooth
Date: 27/11/15
Status: Final
Authors: Peter Thorne (NUIM), Joerg Schulz (EUMETSAT), David Tan (ECMWF), Bruce Ingleby (ECMWF), Fabio Madonna (CNR), Gelsomina Pappalardo (CNR), Tim Oakley (MO/GCOS)
Editors: Anna Mikalsen (NERSC)
Reviewers: Karin Kreher, Arnoud Apituley, Greg Bodeker, Barry Goodison, Mark Bourassa, Matthias Buschmann and Ge Peng
Contacts: [email protected]
Website: http://www.gaia-clim.eu

This document has been produced in the context of the GAIA-CLIM project. The research leading to these results has received funding from the European Union's Horizon 2020 Programme under grant agreement n° 640276. All information in this document is provided "as is" and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability. For the avoidance of all doubts, the European Commission has no liability in respect of this document, which is merely representing the authors' view.
Document history

Version | Author(s) / Reviewers | Date | Changes
0 | Peter Thorne | 05/04/15 |
0.1 | Peter Thorne with input from Fabio Madonna, David Tan and Joerg Schulz | 21/05/15 | Substantial additions following first teleconference of task participants
0.1.1 | Peter Thorne based upon input from Fabio Madonna | 12/06/15 | Minor changes to reflect Fabio's input
0.2 | Peter Thorne, Karin Kreher, Fabio Madonna, Tim Oakley, Joerg Schulz, Gelsomina Pappalardo, Arnoud Apituley | 15/07/15 | Substantial additions and edits based upon feedback received
0.3 | Peter Thorne | 24/7/15 | Edits made based upon outcomes of group call on July 17th
0.31 | Peter Thorne | 19/8/15 | Minor changes to account for resynchronising with the spreadsheet and edits on further readthrough
0.4 | Peter Thorne, Bruce Ingleby, Fabio Madonna | 2/9/15 | Changes to account for Bruce Ingleby and Fabio Madonna input
1.0 | Peter Thorne, Joerg Schulz, Fabio Madonna | 10/9/15 | Final changes before external review
1.1 | Peter Thorne, Mark Bourassa, Greg Bodeker, Matthias Buschmann, Barry Goodison, Bruce Ingleby, Arnoud Apituley, Stephan Bojinski, Karin Kreher, Ge Peng | 29/10/15 | Changes in response to internal and external review comments received
1.2 | Peter Thorne, Karin Kreher | 12/11/15 | Tidying changes
2 | Peter Thorne | 30/11/15 | Final submitted version
Table of Contents

Executive Summary
1. Document rationale and broader context
2. Tiered approach to assigning measurement capabilities
  2.1 Requirements for a tiered approach
  2.2 Proposed tiers for non-satellite measurement capabilities and possible system of systems benefits
  2.3 Tier defining characteristics
    2.3.1 Global reference observing networks
    2.3.2 Global baseline observing networks
    2.3.3 Comprehensive observing networks
3. Objectively assessing measurement capabilities
  3.1 Maturity assessment concept
    3.1.1 Maturity scores and tiered networks concept
    3.1.2 Interpreting the maturity assessment results
    3.1.3 Practical application considerations
  3.2 How to perform an assessment
  3.3 Metadata
    3.3.1 Standards
    3.3.2 Collection level metadata (including change records)
    3.3.3 File level
  3.4 Documentation
    3.4.1 Formal description of measurement methodology
    3.4.2 Formal validation report
    3.4.3 Formal measurement series user guidance
  3.5 Uncertainty characterisation
    3.5.1 Traceability
    3.5.2 Comparability
    3.5.3 Uncertainty quantification
    3.5.4 Routine quality monitoring
  3.6 Public access, feedback and update
    3.6.1 Access
    3.6.2 User feedback mechanism
    3.6.3 Updates to record
    3.6.4 Version control
    3.6.5 Long-term data preservation
  3.7 Usage
    3.7.1 Research
    3.7.2 Public and commercial exploitation
  3.8 Sustainability
    3.8.1 Siting environment
    3.8.2 Scientific and expert support
    3.8.3 Programmatic support
  3.9 Software readiness (optional)
    3.9.1 Coding standards
    3.9.2 Software documentation
    3.9.3 Portability and numerical reproducibility
    3.9.4 Security
4. Challenges to adoption
  4.1 Naming nomenclature for existing networks across and within domains
  4.2 End-User Adoption
  4.3 Realising technological and scientific benefits of a tiered set of capabilities
  4.4 Potential future applicability to the satellite domain
Acknowledgements
References
Appendix A: GAIA-CLIM measurement description
  1 Intent of the document
  2 Point of contact
  3 Measurements description
  4 Data origin
  5 Validation of an uncertainty estimation
  6 Considerations for scientific applications
  7 References
Appendix B: Measurement maturity assessment spreadsheet
Executive Summary

In the first instance, this guidance is intended specifically for assigning suitability of candidate non-satellite measurements for satellite calibration and validation, under the Horizon 2020 funded GAIA-CLIM project. However, it is envisaged that it may be adopted more broadly. The guidance builds upon a similar effort to assess climate data record maturity under the Framework Programme 7 funded CORE-CLIMAX project.

This guidance exists to support the designation of non-satellite observational capabilities into a structured system of systems architecture consisting of:

• Reference quality networks that have, amongst others: strict traceability and comparability, rich metadata, known data origin and quality, and long-term infrastructure support;
• Baseline networks that are well characterised and have a long-term monitoring commitment;
• Comprehensive networks that consist of a broad range of observational capabilities managed for myriad purposes.

Such a designation has many potential scientific and societal benefits, relating to the appropriate use of the data collected for many applications. The designation is achieved through applying a set of semi-quantitative assessment criteria against the following seven thematic areas, which may reasonably differentiate the observational capability maturity:

1. Metadata
2. Documentation
3. Uncertainty characterisation
4. Public access, feedback, and update
5. Usage
6. Sustainability
7. Software (optional)
1. Document rationale and broader context

The purpose of this document is to provide a framework to semi-objectively classify measurement capabilities, and hence to ensure scientifically rigorous and robust usage. It relates primarily to specific non-satellite observing networks and/or capabilities such as observationally-based research infrastructures. It could also potentially be used on individual instruments/sites, although such a specific assessment would be a substantial undertaking. So, where possible, a consideration of networks/infrastructures that operate to common standards is encouraged.

The aim is to assign observational capabilities into a set of tiers, to ensure optimal usage in subsequent applications, such as satellite calibration and validation or limited area forecasting. It builds upon the concepts of climate dataset maturity developed under the FP7 CORE-CLIMAX project [Schulz et al., 2015]. These in turn were built upon earlier work undertaken at NOAA [Bates and Privette, 2012]. This document assumes that basic metadata on the measurements to be assessed, such as the measurement geo-location(s) and the instrument types, are available. Within this current process, a deeper assessment of the data and metadata properties is undertaken, to allow a more rigorous assessment of suitability. The assessment is based upon a number of thematic areas, such as documentation, uncertainty quantification and sustainability, which can be used to characterise the critical aspects of measurement system maturity.

The assessment of observational measurement capabilities (this guidance) and of derived datasets and products from these measurements (the CORE-CLIMAX based guidance) is somewhat distinct. The taking of measurements is the collection of original data and metadata that constitute, directly or, more commonly, indirectly, an estimate of the target measurand. Measurement series result from continuous or periodically repeating observations, using the same or similar measurement techniques, that are processed from the raw measurement to an estimate of the target measurand(s). Derived datasets and analyses use sets of such measurements and apply substantial post-processing steps to aggregate, analyse and, perhaps, filter and/or interpolate. They do not include the collection of primary data. This distinction in what is done in creating a measurement and a dataset and, therefore, in what is being assessed, matters. Hence it likely requires separate, but similar, sets of guidance. Consideration was given in the first instance to simply reusing the existing CORE-CLIMAX maturity assessment tables, while writing new measurement-system-specific interpretation guidance. However, it was felt, after considerable discussion, that there were sufficiently unique aspects to assessing measurements, rather than datasets, reanalyses and similarly derived products, to warrant a separate set of tables. In doing so, some categories have been removed or made optional, others have been modified, and several entirely new categories and sub-categories have been added. In the longer term it may be possible and desirable to remerge these guidelines, but that would require a new project to be initiated to this end. To enable such a future reconciliation, wherever possible, the CORE-CLIMAX tables have been left unchanged to allow traceability and transferability.
Users wishing to assess the maturity and suitability of datasets, reanalyses or other approaches that aggregate and analyse large sets of measurements, to create climate or environmental data records for given applications, should refer to and use the CORE-CLIMAX User Guide on the System Maturity Matrix [Schulz et al., 2015]. Users wishing to assess the maturity of a given set of measurements should use the guidance and tables provided in this document. The dividing line between a set of measurements and a climate data record is recognised as meaning distinct things to different users. To attempt to clarify which set of guidance should be used, Table 1 lists some salient features and the likely distinctions, to support the use of the most appropriate set of guidance and tables in any given case.

The guidance in this document should be used for a non-satellite measurement series if… | CORE-CLIMAX guidance should be used for a climate data record if…
Data being considered is a (set of) individual time series arising from one or more defined instruments, either at fixed locations or using mobile platforms. | Data being considered have global or at least continental scale coverage arising from satellite data or a substantively aggregated set of non-satellite data.
Available documentation addresses the instrument and/or arises from technical documentation describing the measurement process. | Available documentation addresses the construction, usage and validation aspects of a data product (CDR) in the peer-reviewed literature and/or technical documentation.

Table 1. Decision guidance as to whether the current set of maturity matrices, or those developed under CORE-CLIMAX, are likely most appropriate for a given use case, based upon criteria that should permit easy determination.

Like the CORE-CLIMAX User Guide on the System Maturity Matrix, there is an explicit limit to how far this guidance can take the user. If applied rigorously, the user can gain an appreciation of the relative maturity of key relevant facets of a set of measurements undertaken, for example, by a network or measurement infrastructure. However, there is not, and cannot be, a single threshold that can be used to uniquely decide whether a given set of measurements has reached a given maturity level (Section 2). Rather, the assessment provides the basis for a user to decide upon a defensible level of maturity, and provides a chain of semi-quantitative evidence that can be used to support their assignment. The assessment is intended to define the measurement system maturity, and not the suitability for a given application, which will, by its very definition, be application specific.

In reality there are two principal sets of potential users of this guidance and its outcomes. The first set of users consists of people undertaking the assessment or undertaking the measurements to be assessed. For this group of users, it is key that they understand how to implement the assessment outlined in Section 3, and how they can utilise the results to point to ways to improve the maturity and, hence, scientific utility of existing measurement systems. The second set of users consists of scientists, data analysts, policymakers etc. who may use the outcomes of the assessments to guide either their use of data, or decision-making, or both. Subsequent tasks within GAIA-CLIM shall undertake an initial assessment of the maturity of many existing measurements and develop and provide a range of tools to support the second identified set of users.
Given the heterogeneity of surface, ground-based remote sensing, balloon-borne and aircraft measurements (non-satellite measurements) and of their funding and governance, this guidance concentrates upon such measurements. In theory, a similar assessment could be made for the satellite-based fundamental measurements (Level 0 and potentially up to Level 1A/1B). However, given the GAIA-CLIM remit, this guidance does not at this time extend to the satellite domain. Section 4.4 briefly discusses potential future extension in this direction.

The remainder of this guidance is structured as follows. In Section 2 the concept of a tiered approach to network measurement capabilities is outlined. This includes discussion of the potential scientific, measurement technology and practices benefits that could accrue from an explicit consideration of a tiered network-of-networks design for non-satellite measurement capabilities. Section 3 contains the substantive assessment criteria, along with the guidance necessary to complete the assessment. Therein each assessment area (or strand) is discussed and guidance on its appropriate completion is given. It is complemented by an Excel spreadsheet which can be used to collate the assessment. Finally, Section 4 outlines a number of likely challenges to broader adoption by the scientific community of the concepts detailed herein.
2. Tiered approach to assigning measurement capabilities

Currently, little to no effort has been made to define, and broadly agree amongst global stakeholders, the measurement and network characteristics underlying a proposed system of systems approach to non-satellite Earth Observation capabilities. This is despite the existence of groups such as the Global Earth Observation System of Systems (GEOSS), with the system of systems implicit in its name. Within the peer-reviewed literature, explicit reference to a tiered network-of-networks approach is, to our knowledge, limited to Seidel et al., 2009. Such a system of systems concept is also present in several recent GCOS documents and in NAS, 2009. A tiered set of networks approach is arguably necessary to make sense of the mosaic of observational capabilities at our disposal, and hence to use the right measurements for the correct application.

Specifically, for GAIA-CLIM, it is necessary to have a working model from which to:

• Define tiers of capabilities that may define fitness-for-purpose, for different candidate non-satellite measurement programs, to be used to understand and ultimately constrain satellite measurements;
• Assess and map these non-satellite measurement capabilities; and
• Select those measurements that have the necessary metrological (metrology being the science of measurement) characteristics to be used in those project work packages concerned with co-location uncertainty quantification, data assimilation and the virtual observatory.

It is hoped that the tier designations and underlying assessment criteria proposed herein can gain broader traction within the Earth Observation community as a whole. But, initially, it is solely necessary to define a working model that is acceptable across GAIA-CLIM, to enable subsequent tasks within the project to be undertaken.
2.1 Requirements for a tiered approach

A perfect measurement is not a metrological possibility, because any measurement will always to some extent differ from the true value of the measurand. In an ideal world, all measurements undertaken to monitor the climate system would be sustained, metrologically traceable and comparable, and have a robustly determined and comprehensive total uncertainty budget. These uncertainties would be commensurate with the best practices in the Guide to the Expression of Uncertainty in Measurement [JCGM, 2008]. In the real world, the heterogeneity of different instruments and the complexity of requirements for observations (including process studies, long-term monitoring, real-time applications etc.) require, instead, a tiered system of systems architecture. Such an approach combines the advantages of the high quality achieved by a few selected reference-quality sites with the ability of baseline networks to both provide representative sampling and benefit from reference-network innovations, and then with the denser coverage achieved by comprehensive observing networks.

The very best measurements that we can ever hope to make would have full metrological traceability to SI units or accepted standards, and have the smallest technologically achievable associated total uncertainty budgets. These measurements have exacting requirements. Thus, for both technical and financial reasons, their widespread and sustained deployment across the globe, at the density required to be the sole source of observations, is not feasible. This is particularly so when considering the myriad possible applications for measurements of the atmospheric, oceanic and terrestrial ECVs. There will always be a need for additional measurements, of lower absolute quality, to provide geographical and temporal detail. Such measurements are still useful for a broad range of applications, assuming that they are used appropriately. Some of these measurements will need to be sustained, to enable characterisation of regional variability and change for longer-term climate monitoring.

From the perspective of network operators, there are distinct advantages to a system-of-systems architecture. It provides an aspirational trajectory for sites, such that sites in a given tier can work towards promotion to a higher tier. It also provides a potential mechanism by which innovations in instrumentation and techniques can 'trickle down', aiding all measurements and application areas.

Firstly, however, to maximize the return on investment of the currently available and future non-satellite observational capabilities portfolio, it is necessary to clearly define measurement capability tiers into which individual non-satellite observational programs can be placed. In that way, users can employ the measurements appropriately and with confidence. It is, therefore, necessary to create criteria, as objective as possible, by which to designate a given candidate measurement series or measurement program into the most appropriate tier. Finally, mapping these capabilities in various ways can aid end users in making informed and appropriate decisions and analyses.
2.2 Proposed tiers for non-satellite measurement capabilities and possible system of systems benefits

It is proposed that GAIA-CLIM uses the tier designations defined in Seidel et al., 2009, and discussed further in GCOS, 2014 (Figure 1). The tier designation should be a function of demonstrable measurement qualities such as: traceability, metadata, comparability, data completeness, documentation, record longevity, measurement program stability and sustainability, etc. Following the example of CORE-CLIMAX, it is intended that these aspects be assessed semi-quantitatively, through a combination of self-assessment and external assessment of capabilities, against a consistently defined set of assessment criteria. Solely self-assessment may be possible for certain aspects, whereby only the network or site staff have the knowledge necessary to undertake the assessment. The assessment process has a range of benefits to both the institutions/individuals undertaking the measurements and to end-users, as will become apparent later. Sites or networks may transition both up and (hopefully less frequently) down between tiers and, as such, periodic reassessment is encouraged.
Figure 1. Proposed tiers in a system of systems approach to be adopted within GAIA-CLIM.

The starting point is a schematic view of measurements as an inherently interlinked "system of systems". In general terms, measurements typically involve a trade-off between properties such as fidelity and traceability (i.e. the degree to which the values reproduce the real-world state and have fully-characterised uncertainties), and properties such as representativeness (both in terms of sampling and resolution). The proposed system of systems recognises that resulting datasets and analyses/reanalyses are generated via a combination of measurements and subsequent analysis and computational protocols. Presently, there is a distinct trade-off between spatio-temporal data completeness and data fidelity. In part this arises because the synergies and benefits of a coordinated system of systems approach are not being realised.

GAIA-CLIM envisages a possible future in which fidelity and geographic completeness are improved for all components within the system of systems through robust, sustained and co-ordinated engagement, both between and within the different observing tiers. Many of the non-satellite systems are still considered and managed operationally as entirely independent networks. Take, for example, radiosondes: we have GRUAN, GUAN and the total network, which fits well into the proposed tiers. But very few of the national networks consider their locations and operational schedules as a component of an upper-air network incorporating radiosondes, profiling radars, aircraft, lidars etc. There are exceptions for some subsets of observational capabilities. For example, EUMETNET tries to coordinate observations undertaken by European National Meteorological Services, and this and similar efforts may prove a model going forwards. Such sustained engagement would encompass aspects such as:

• Pro-active network design (including rationalisation of programs) and co-location of existing observing capabilities to maximise scientific return on investment;
• incremental improvements in instrument technology;
• step-change introduction of new measurement techniques;
• continued development, and greater adoption, of "best practice" in all component systems;
• improved metrological characterisation and uncertainty quantification;
• iterative life-cycles of dataset generation, validation/evaluation and reprocessing; and
• better observationally constrained data assimilation systems through use of additional data streams and traceable observational uncertainty estimates.

Three essential elements for realising these improvements are:

• Sustained communication and coordination amongst the various tiers and the networks, both national and international, which contribute to them, with clear procedural protocols to ensure effective integration;
• robust operational frameworks capable of delivering iterative reassessments and reprocessing; and
• targeted research that will identify, and address, key obstacles and limitations.

Such an approach is beyond the remit and charter of GAIA-CLIM funded activities to achieve. Rather, it is more appropriately achieved through relevant global governance activities, such as the WMO Integrated Global Observing System (WIGOS), which was officially endorsed by the World Meteorological Organization at its 2015 Congress. The WIGOS concept explicitly envisages an integrated approach to the use of observing systems. The designation and adoption of the tiered approach and assessment criteria are a pre-requisite to realising this vision, to which GAIA-CLIM can contribute.
2.3 Tier defining characteristics

It is proposed that GAIA-CLIM defines the measurement capabilities in the following way (modified from GCOS, 2014).

2.3.1 Global reference observing networks

These networks provide metrologically traceable observations, with quantified uncertainty, at a limited number of locations, or for a limited number of observing platforms, for which traceability has been attained.

• The measurements are traceable through an unbroken processing chain (in which the uncertainty arising in each step has been rigorously quantified) to SI units, Common Reference Points defined by the BIPM, or community-recognised standards (ideally recognised by National Measurement Institutes), using best practices documented in the accessible literature.
• Uncertainties arising from each step in the processing chain are fully quantified and included in the resulting data. Expanded uncertainties (2 standard deviations of traceable uncertainty estimates, obtained via what the GUM terms expanded coverage factors) are reported for each data point. Individual components of the uncertainty budget are available. Where uncertainties are correlated, these are appropriately handled.
• Full metadata concerning the measurements is captured and retained, along with the original raw data, to allow subsequent reprocessing of entire data streams as necessary.
• The measurement and its uncertainty are verified through complementary, redundant observations of the same measurand on a routine basis.
• The observations program is actively managed and has a commitment to long-term operation, to the extent possible.
• Change management is robust, including a sufficient program of parallel and/or redundant measurements to fully understand any changes that do occur. Unnecessary changes are minimised.
• Measurement technology innovation is pursued. New measurement capabilities through new measurement techniques, or innovations to existing techniques, which demonstrably improve the ability to characterize the measurand, are encouraged. These innovations must be managed in such a way as to understand their impacts on the measurement series before they are deployed.
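The uncertainty reporting described in the bullets above can be illustrated with a minimal sketch. The component names and values below are purely hypothetical; the only elements taken from the GUM [JCGM, 2008] are the root-sum-square combination of independent standard uncertainties and an expanded uncertainty obtained with a coverage factor k = 2 (the "2 standard deviations" cited in the text).

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties
    (GUM law of propagation with unit sensitivity coefficients and no
    correlation between components)."""
    return math.sqrt(sum(u ** 2 for u in components.values()))

def expanded_uncertainty(components, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 corresponds to roughly
    95 % coverage for a normal distribution."""
    return k * combined_standard_uncertainty(components)

# Hypothetical uncertainty budget for a single temperature data point (K)
budget = {"calibration": 0.05, "radiation_correction": 0.10, "sensor_lag": 0.02}
u_c = combined_standard_uncertainty(budget)  # combined standard uncertainty
U = expanded_uncertainty(budget)             # value reported per data point
```

Reporting `budget` alongside `U` mirrors the bullet requiring that individual components of the uncertainty budget remain available; correlated components would instead need the full covariance treatment of the GUM, which this sketch deliberately omits.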
2.3.2 Global baseline observing networks

These networks provide long-term records that are capable of characterising regional, hemispheric and global-scale features.

• The baseline network is a globally and regionally representative set of observations capable of capturing, at a minimum, global, hemispheric and continental scale changes and variability. As such, a baseline network may be considered a minimum and highest-priority subset of the comprehensive networks, which should be actively curated and retained.
• The measurements are periodically assessed, either against other instruments measuring the same geophysical parameters at the same site or, alternatively/in addition, through intercomparison campaigns held under international or national auspices. These activities provide understanding of the relative performance of the different techniques in use. Ideally, such intercomparisons should include reference quality measurements/networks, to realise scientific benefits.
• Representative uncertainties, based upon understanding of instrument performance or peer-reviewed lines of evidence, are available.
• Metadata about changes in observing practices and instrumentation are retained.
• The observations have a long-term commitment.
• Changes to the measurement program are minimized and managed (by overlapping measurements, or measurements with complementary instruments over the change), with efforts made to quantify the effects of changes in an appropriate manner.
• The measurements aim to meet stakeholder-stated requirements.
GAIA-CLIM Measurement Maturity Matrix Guidance
2.3.3 Comprehensive observing networks

These networks provide the high spatio-temporal density data necessary for characterising local and regional features.
• The comprehensive networks provide observations at the detailed space and time scales required to fully describe the nature, variability and change of a specific climate variable, if analysed appropriately. They include regional and national operational observing networks.
• Representative uncertainties based upon e.g. instrument manufacturer specifications and knowledge of operations should be provided. In their absence, gross uncertainties based upon e.g. expert or operator judgement should be provided.
• Metadata should be retained.
• Although encouraged, long-term operation is not required.
3. Objectively assessing measurement capabilities

The measurement system maturity matrix (SMM), like its counterpart for Climate Data Records (CDRs) developed under CORE-CLIMAX, is a tool to assess various facets of the maturity of a measurement. The matrices assess to what extent current (at the time of production of this Guidance) measurement best practices have been met and, hence, the maturity of the candidate measurement system.
The maturity of a measurement is distinct from its applicability to a given problem, where additional concerns such as measurement location, frequency etc. pertain. Such aspects are end-user specific, and cannot be captured within the matrices detailed herein. However, the assessment results herein, in combination with such additional information, can help users decide upon the appropriate measurements for their use case.
The assessment can be performed either on individual instruments / sites, or for entire networks. A network will typically constitute a federated collection of sites, under the umbrella of an organisation that is generally recognised by the community. Examples are the GCOS Reference Upper-Air Network, the Network for the Detection of Atmospheric Composition Change, and the Total Carbon Column Observing Network. For sites, instruments and networks, the assessed measurement program may consider multiple measurement techniques and/or Essential Climate Variables. In some cases, it may be preferable to consider aspects of a network at a disaggregated level, either site-wise or instrument-technique-wise. Such an assessment is encouraged where it adds interpretative value, and should be agreed in the rules-of-the-round phase (Section 3.2).
Finally, a note of caution: measurement best practices may well change in future, necessitating new versions of this Guidance. Please ensure you are using the most up-to-date version of this Guidance, and ensure the specific Guidance version used is retained as metadata alongside the assessment.
3.1 Maturity assessment concept

There are six mandatory major categories and one optional major category where assessments are made; these overlap with, but are not identical to, those used to assess CDRs under the CORE-CLIMAX System Maturity Matrix approach. Where they overlap, in many cases the guidance differs substantially, to reflect the frequently substantial distinction between the measurements and derived CDRs. The strands for assessing measurement maturity herein are as follows:
• Metadata
• Documentation
• Uncertainty characterisation
• Public access, feedback and update
• Usage
• Sustainability
• Software (optional)
The software option should be completed only for those measurements where substantive routine post-processing is undertaken to convert the basic measurement to the finally presented geophysical time series. For example:
• the conversion of digital count data returned from a radiosonde to the ground segment into temperature and humidity profiles; or
• from backscattered photons collected and counted by a lidar to a geophysical profile of an atmospheric parameter, such as aerosol extinction coefficient or water vapour mixing ratio.
Although this requirement to assess software maturity will often apply, there are many instances where it is not the case, such as standard meteorological surface station networks. Where anything more than very basic automated processing of the measurements (such as the conversion of resistance to temperature for a platinum resistance thermometer) is undertaken, from the measured parameter to derived parameters, the software strand should be completed. Otherwise, this strand should be noted as not relevant, with the necessary justification given in the assessment. Where a combination of external and internal assessments is being performed, assessors should agree ahead of time on whether the software strand is to be assessed (Section 3.2).
Within each category are a number of sub-categories. For each of these sub-categories, the assessment will assign a score from 1 to 6 that reflects the maturity of the measurement with respect to that facet of the measurement system. The scores may help to inform a decision upon the maturity of a given candidate measurement system. All aspects of the assessment are important: weakness in any one strand will, inevitably, impact the quality or usability of the measurements. For example, if the metadata and user documentation are assessed as weak, but uncertainty characterisation as strong, there is reduced value in the observations, as the necessary context for end-users to use the measurements appropriately is missing.
3.1.1 Maturity scores and tiered networks concept

The maturity can, alternatively, be considered in three broad categories that give information on the scientific grade and sustainability of the measurements being assessed. This is similar to the CDR assessment in CORE-CLIMAX, which, in turn, builds upon the earlier NOAA assessment process. However, the category definitions are fundamentally distinct from those for a CDR, reflecting the real distinctions between CDR and measurement maturity considerations.
• Maturity scores 1 and 2 establish a Comprehensive Measurement Capability (CMC; Comprehensive-network-type measurements): the instruments are placed in the field and recording data, but may not be well curated or metrologically understood and calibrated.
• Maturity scores 3 and 4 establish a Baseline Measurement Capability (BMC; Baseline-network-type measurements): at this stage the measurements are better characterised and understood, and intended to be run for the long term. These may be considered a substantial, sustained contribution to the system of systems. However, they lack strict traceability and comparability.
• Maturity scores 5 and 6 establish a Reference Measurement Capability (RMC; Reference-network-type measurements): these measurements are extremely well characterised, with strict traceability and comparability, and robustly quantified uncertainties. The measurements are actively managed and curated, and envisaged as a sustained contribution to the observational system.
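The relationship between numeric scores and the three capability categories above can be summarised in a short sketch (illustrative only; as Section 3.1.2 notes, the final assignment always involves expert judgement across all sub-category scores):

```python
def capability_tier(score):
    """Map a single maturity score (1-6) to the broad capability
    category: CMC (scores 1-2), BMC (3-4), RMC (5-6)."""
    if score not in range(1, 7):
        raise ValueError("maturity scores run from 1 to 6")
    return ("CMC", "CMC", "BMC", "BMC", "RMC", "RMC")[score - 1]

print(capability_tier(4))  # BMC
```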
3.1.2 Interpreting the maturity assessment results

The major categories of the SMM are subdivided into several sub-categories, and assessment scores are assigned based on scores in these sub-categories. It should be noted that the numbers require interpretation for each assessed measurement series, because the circumstances under which the measurements were taken may affect what maturity level can reasonably be expected to be attained. A degree of expert judgement will, therefore, always be required to finally assign a measurement system to a given category, in a way that reflects the totality of the assessment, including all relevant sub-category scores. All relevant sub-category scores should be considered, to aid both data providers and users. In particular, data providers should consider low-scoring sub-categories as target areas for further work to improve the overall usefulness, accessibility, usability and utility of their measurement program. Figure 2 provides a visual summary of the typical output that may accrue, and can be used to make a final assessment of measurement system maturity.
[Figure 2: a colour-coded grid of sub-category maturity scores. The major categories (columns) and their sub-categories (rows) shown in the matrix are:
• Metadata: Standards; Collection level; File level
• Documentation: Formal description of measurement methodology; Formal validation report; Formal measurement series user guidance
• Uncertainty characterisation: Traceability; Comparability; Uncertainty quantification; Routine quality management
• Public access, feedback and update: Access; User feedback mechanism; Updates to record; Version control; Long-term data preservation
• Usage: Research; Public and commercial exploitation
• Sustainability: Siting environment; Scientific and expert support; Programmatic support
• Software (optional): Coding standards; Software documentation; Portability and numerical reproducibility; Security
Legend: maturity scores 1 to 6, plus "Not applicable".]

Figure 2. Hypothetical example assessment. For this example assessment it was agreed that the software strand was not applicable, but that the two additional optional sub-categories were. Blacked-out entries arise because not all major strands have the same number of minor categories.
Within Figure 2 it is possible to ascertain areas of both strength and weakness. In the hypothetical example given there is a clear lack of usage for non-research purposes, for example, which highlights a potential avenue to improve return on investment. Similarly, version control is assessed as lacking, and this points to an area that could be improved in future. Conversely, access, updates and preservation are rated highly, as are scientific use and support. From the data provider's perspective, such an assessment may inform strategic developments to the measurement program. From the data user's perspective, the assessment should provide an indication of applicability to their intended use case.

When considering an assessment of a network, rather than an individual site or instrument, in certain categories or sub-categories it shall be appropriate to perform the assessment on a per-asset (instrument or site) basis, rather than a network-wide basis. This is particularly the case for the Sustainability strand, but may also be applicable elsewhere if there are intra-network heterogeneities in protocols pertaining to e.g. metadata, uncertainty quantification or documentation. In such cases, and where practical, the assessment should be performed individually on each unique subset and stored in the assessment report metadata. Both the network-wide mean score and the range of scores should then be reported in the summary. Such a refined assessment helps ensure both appropriate network sub-selection for certain applications and a fair assessment, which may help network operators and coordinators identify and address intra-network issues.
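The per-asset reporting described above (a network-wide mean plus the range of per-asset scores) can be sketched as follows; the site names and scores are hypothetical:

```python
def summarise_network(scores_by_site):
    """Summarise per-asset scores for one sub-category as the
    network-wide mean and the (min, max) range, as recommended
    for the summary report."""
    scores = list(scores_by_site.values())
    mean = sum(scores) / len(scores)
    return mean, (min(scores), max(scores))

mean, spread = summarise_network({"site_A": 5, "site_B": 3, "site_C": 4})
print(mean, spread)  # 4.0 (3, 5)
```

A wide range relative to the mean is itself informative, flagging intra-network heterogeneity for operators and coordinators to investigate.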
In the following subsections we provide instructions on how to assign scores to each of the sub-categories. The sub-categories sometimes include criteria that cannot easily be assessed by an external assessor without asking the provider of the data, a step that could be done in a formal audit-type assessment.

3.1.3 Practical application considerations

The SMM is provided as a multi-level Excel file where the scores are input in the pages associated with the sub-categories. These scores are then automatically used to mark the range of scores for the major category. If a sub-category is not filled, a maturity of 1 will be set. There are two exceptions: one in the category Usage and one in the category Sustainability.

1. In the Usage category, usage of a measurement is considered for applications in research and decision-making. Which columns are taken into account depends on the intention of the measurement system. For instance, if the description points only to intended use in research, then that category alone shall be used to compute the overall usage maturity.

2. Within Sustainability, the siting environment is only applicable to fixed measurement assets that always measure from the same fixed location. This particular sub-category assessment should not be completed for mobile, non-repeating observing assets such as aircraft measurements or field campaigns. However, observational assets that take repeated profiles along a consistent transect may be suitable for assessment in this category.

Where either of these categories is not applicable, the entry in the equivalent plot to Figure 2 should be grey-shaded to indicate its non-relevance, rather than left blank.
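The spreadsheet conventions described in this subsection, where unfilled sub-categories default to a maturity of 1 and non-applicable categories are excluded rather than scored, can be sketched as:

```python
def category_range(sub_scores, applicable=True):
    """Return the (min, max) range of scores for a major category.
    Unfilled sub-categories (None) default to maturity 1, mirroring
    the SMM spreadsheet behaviour; a non-applicable category returns
    None and should be grey-shaded in the summary, not left blank."""
    if not applicable:
        return None
    filled = [1 if score is None else score for score in sub_scores.values()]
    return min(filled), max(filled)

print(category_range({"Standards": 4, "Collection level": None, "File level": 3}))
# (1, 4)
```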
It is very important to use a unique measurement system name and identification number (version) when the SMM is filled in. This shall match the name and identification information on the measurement description form (Appendix A). Documentation of the assessment date is also very important if changes in measurement maturity are to be tracked through time. Sufficient assessment metadata should be appended to enable the tracking of multiple assessments of a candidate measurement system over time. This should include the version of this Guidance document that was used.
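The assessment metadata described above could be captured, for example, in a simple record such as the following; the field names and values are illustrative rather than prescribed by the SMM:

```python
from dataclasses import dataclass

@dataclass
class AssessmentRecord:
    """Minimal metadata for tracking repeated assessments of one
    candidate measurement system over time (illustrative fields)."""
    system_name: str       # must match the measurement description form
    system_version: str    # unique identification number / version
    assessment_date: str   # date of this assessment round
    guidance_version: str  # version of this Guidance document used

record = AssessmentRecord("Example lidar network", "v2",
                          "2015-11-27", "D1.3 final")
print(record.guidance_version)  # D1.3 final
```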
3.2 How to perform an assessment

Assessments should be repeated and refined on a multi-year cycle to capture both improvements and degradations in performance of the observing networks, as well as new insights, thus ensuring that at any time the appropriate data are being employed for the appropriate scientific tasks. An assessment using the maturity concept should be conducted by an assessment leader who organises the assessment, provides needed guidance to the participants, and collects and analyses the results. It is likely to be useful to have a specific meeting to agree on the analysis results before publication.

It is intended that this Guidance be updated relatively infrequently. The over-arching assessment framework in this document should remain stable for a considerable period of time, and not become substantively dated. This has required, in many cases, generic rather than specific guidance where details may reasonably be expected to change with evolving instrumental, metrological and community best-practice developments. For example, in the next section the guidance refers to "appropriate high-quality metadata standards, which permit inter-operability of metadata", rather than referring to a current standard that may reasonably quickly become superseded. This is one of several examples where this Guidance requires additional interpretation in the context of the state of the art at the time of any assessment.

Where a substantive assessment of the state of multiple networks, instruments or sites is being organised, it is therefore recommended to create an additional supplement of specific assessment criteria, or 'rules of the round', which provides additional guidance on such aspects. This guidance should be agreed by all participants, and should be retained alongside the completed assessments in such cases, to permit full interpretation of the assessment-round results.
3.3 Metadata
Metadata are 'data' about data. Metadata should be standardised, as complete as possible, and adequately document how the measurement was attained. This involves aspects such as instrumentation, siting, observing practices etc. The measurement system should use appropriate high-quality metadata standards, which permit inter-operability of metadata. If an ISO standard is defined, then the assessment in future would be against such a standard.
However, at the present time no such universally agreed standard exists that pertains across all aspects of EO science. There are emerging efforts under WIGOS [WIGOS, 2015a,b] to create universal metadata standards1, and there are several de facto working standards such as CF-compliant file headers. Unless and until an ISO standard is developed and applied, the assessors' judgement will be required as to the appropriateness of the standards being adhered to (see the rules-of-the-round sub-section above).

In this category the maturity is assessed using three sub-categories that consider the standards used; the metadata at the collection level, i.e., valid for the complete data record; and the metadata at the file level, i.e., valid for the data at a specific granularity.
3.3.1 Standards
Standards – It is considered good practice to follow recognised metadata standards. These may differ depending upon the instrument or measurement program under consideration, and may be determined on a network / infrastructure-wide basis. As discussed previously, currently no ISO standard for metadata exists.
1 No standard considered
2 No standard considered
3 Metadata standards identified and/or defined and partially, but not yet systematically, applied
4 Score 3 + standards systematically applied at file level and collection level by data provider; meets international standards
5 Score 4 + metadata standard compliance systematically checked by the data provider
6 Score 5 + extended metadata that could be useful but is not considered mandatory is also retained

Table 2: The 6 maturity scores in metadata sub-category Standards.
Note: it is likely that this sub-category can only be fully assessed by the measurement initiator. An external assessment can be made by asking the data provider directly, or if the metadata and data are freely available from a portal (which would tend to indicate a mature measurement system). However, signs of the standards used can be found by looking at the data record documentation and/or at a sample data file.

The assessment can be made as follows:

Scores 1 and 2: No standard is considered. Data are made available solely as-is, with at most the geographical measurement location, time of observation and instrument-type metadata provided; this enables use, but prohibits measurement understanding.

1 https://www.wmo.int/pages/prog/www/wigos/documents/Cg-17/Cg-17-d04-2-2(3)-add1-MANUAL-ON-WIGOS-approved_en.docx
Score 3: Standard identified / defined means that the measurement originator has identified or defined the standard to be used, but has not yet systematically applied it. This information can most often be found in format description documents available from web pages, or from statements on web pages.

Score 4: Systematic application requires that the relevant metadata protocol identifier and details can be found in every file of the measurement product and its descriptions.

Score 5: This means that the measurement provider has implemented procedures to check the metadata contents. This could be ascertained by a check on the consistency of metadata header information in individual data files.

Score 6: This score will be attained if, in addition to mandatory metadata, additional optional metadata are collected, retained and transmitted. This score may not apply to some data streams where all metadata are considered mandatory, but may help differentiate truly well-performing measurement series in other cases where metadata are differentiated into mandatory and optional classes, such as the WIGOS metadata standards [WIGOS, 2015a,b].
3.3.2 Collection-level metadata (including change records)

Collection Level metadata – these are attributes that apply across the whole of a measurement series, such as processing methods (e.g., same algorithm versions), general space and time extents, creator and custodian, references, processing history etc. Discovery metadata, through e.g. the use of digital object identifiers, can form part of this and ensure long-term discoverability. Collection-level metadata allow other people to find out what the measurement series contains, where it was collected, where and how the series is provided, and what usage restrictions apply.
1 None
2 Limited
3 Sufficient to use and understand the data independent of external assistance; sufficient for a data user to extract discovery metadata from metadata repositories
4 Score 3 + enhanced discovery metadata
5 Score 4 + complete discovery metadata meets appropriate (at the time of assessment) international standards
6 Score 5 + regularly updated

Table 3: The 6 maturity scores in sub-category Collection Level.
The assessment can be made as follows:

Score 1: Data files have no global attributes.
Score 2: Only attributes like location, space and time coverage, and custodian of the data are provided; no information on measurement / processing methods or history is available.

Score 3: All relevant information on processing (for example: software used, recording platform, raw data type) and for general understanding of the data (such as references and comments) is provided. Also contains information on how to extract discovery metadata from repositories.

Score 4: Score 3 + more discovery metadata (for example, how to obtain raw data and the necessary information to enable a user to reprocess those data). This may include relevant information such as instrument batch, set-up, time-averaging period etc., and the availability of a data DOI.

Score 5: Score 4 + all the available information on the data is provided with the data, using an internationally recognised and agreed standard that is appropriate to the measurement system in question at the time of the assessment. There may exist several such standards, and an appropriately agreed standard should be used if defined in the 'rules of the round'.

Score 6: Score 5 + updates are provided whenever new metadata become available. For example: information on events impacting the quality of the measurement series, or the addition of commentary metadata such as publications written about the data record.
3.3.3 File level

File-level attributes are those specific to the granularity of the data (on a per-measurement basis) and vary with each measurement entity. File-level metadata include such elements as time of observation, location, measurement units, and measurement-specific metadata such as ground-check data, measurement batch number, ambient conditions at the time of observation etc. Such metadata are necessary to understand and properly use the individual measurements.
1 None
2 Limited
3 Sufficient to use and understand the data independent of external assistance
4 Score 3 + limited location (station, grid-point, etc.) level metadata along with unique measurement-set metadata (e.g. batch, set-up, time, averaging period)
5 Score 4 + complete location (station, grid-point, etc.) level and measurement-specific metadata
6 Not used

Table 4: The 6 maturity scores in sub-category File Level.
The assessment can be made as follows:

Score 1: Data files contain no variable attributes.
Score 2: Data geographical coordinates are described and data units are provided.

Score 3: The data files are provided with measurement geographical coordinates, units, valid range, and missing and/or fill values.

Score 4: Score 3 + measurement footprint details are provided. There is some location-level (i.e., station-level for an in situ data set, pixel-level for a swath-level satellite data set) information available in the data files; an example of location-level metadata is surface type. In addition there is information on the instrument batch, the instrumental set-up, measurement time and averaging period.

Score 5: Score 4 + additional location-level metadata, such as the level of confidence in the retrieval for each data location of a balloon ascent, are provided. Includes vicarious metadata, where necessary to interpret the measurement, such as precipitation or cloud fraction for those measurement techniques potentially impacted.

Score 6: Not used. There is no innovation possible beyond Score 5.
3.4 Documentation

Documentation is essential for the effective use and understanding of a measurement record. There are three sub-categories to assess the completeness of user documentation. Note that the description-of-operations category used in the CORE-CLIMAX CDR maturity assessment model was not deemed applicable to measurements, and so is not utilised herein. Although the category has three sub-categories, it is possible that two or more of these may be covered by a single document for a given candidate measurement. For example, the formal description of measurement methodology may be written in such a way as to also constitute / contain a user guide.
3.4.1 Formal description of measurement methodology

Formal description of measurement methodology refers to a description of the physical and methodological basis of the measurements, network status (if applicable), processing of the raw data and dissemination. It shall often be used as a manual by the site technicians for how to take the measurements. For non-satellite measurement capabilities this can cover such aspects as descriptions of measurement principles, methods of observation, calibration procedures, data filtering, data processing, corrections, aggregation procedures, data distribution etc. As such documents are most often grey literature, a peer-reviewed publication (or publications) on the methodology is also required to increase the maturity. Where software is involved in the processing of the data, its availability should be assured. For measurements that involve substantial post-processing to get from the raw measurement to the processed measurement series, the optional software elements strand (Section 3.7) should be completed.
1 Limited scientific description of methodology available from data collector or instrument manufacturer
2 Comprehensive scientific description available from data collector or instrument manufacturer
3 Score 2 + journal paper on measurement methodology published
4 Score 3 + comprehensive scientific description available from data provider
5 Score 4 + comprehensive scientific description maintained by data provider
6 Score 5 + journal papers on measurement system updates published

Table 5: The 6 maturity scores in sub-category Formal description of measurement methodology.
The assessment can be made as follows:

Score 1: Documentation of the measurement technique principles and processing chain is available and discoverable, e.g. on the Internet using a recognised search term.

Score 2: Complete documentation of the measurement technique and processing steps is available, which includes all the steps used to process from the raw measurement basis, such as digital counts, to the final product, such as a temperature profile.

Score 3: In addition to Score 2, a journal paper in a recognised, appropriate scholarly journal, outlining the measurement principles and processing, is available. This can be checked using tools such as Web of Science.

Score 4: Measurement technique information sufficient for a third party to reproduce the measurement at another location is available from the measurement provider, e.g., an instrument manual describing how to take the measurements, and any necessary processing software package is available.

Score 5: This score relates to updates of the documentation, following updates of the measurement techniques or metadata (see Public Access, Feedback and Update). A sign of maintenance is if the instrument manual has proper document version numbering and refers to a specific version of the measurement series record.

Score 6: Each substantive update to the measurement technique is published in the peer-reviewed literature.
3.4.2 Formal validation report

A formal validation report contains details of the validation activities that have been undertaken to assess the fidelity / reliability of the measurement record. It describes the uncertainty characteristics of the measurement record found through the application of uncertainty analysis (see the section on Uncertainty Characterisation), and provides all relevant references.
1 None
2 Informal validation work undertaken
3 Instrument has participated in a certified intercomparison campaign and results available in grey literature
4 Report on intercomparison to other instruments etc.; journal paper on product validation published
5 Score 4 + sustained validation undertaken via redundant periodic measurements
6 Score 5 + journal papers describing more comprehensive validation (e.g., error covariance, validation of qualitative uncertainty estimates) published

Table 6: The 6 maturity scores in sub-category Formal validation report.
Score 1: No validation is done, and hence there is no report.

Score 2: A report on limited validation activities, undertaken using other measurement techniques, or by comparison to vicarious measurements or relevant model-based analyses / reanalyses, is available, but no formal published validation / characterisation of the measurement series exists.

Score 3: The measurement technique has been evaluated in a formally recognised national or international intercomparison or validation campaign. For example, for a radiosonde, the model has participated either in a CIMO (Commission for Instruments and Methods of Observation) intercomparison, or in a regional comparison that includes instruments that participated in one or more such CIMO campaigns. The results of the comparison or validation are available in a suitable report, but are not peer-reviewed, and the comparison data are available for analysis.

Score 4: The measurement technique has been evaluated and validated using appropriate techniques, and compared to other independent techniques that measure the same measurand and have similar maturity. Analyses verifying the performance of the measurement technique are available in the peer-reviewed literature.

Score 5: The measurement technique is regularly validated using appropriate techniques, and regularly contributes to internationally recognised intercomparison activities. These validation reports are publicly available, although they may not be peer-reviewed.

Score 6: Further papers on instrument characterisation are published, and the measurement developer / provider maintains up-to-date information on the validation activities and resulting uncertainty estimates in their data series.
3.4.3 Formal measurement series user guidance

Formal measurement series user guidance – this document contains details necessary for measurement users to discover and use the data in an appropriate manner. It includes aspects such as the technical definition of the measurement series, an overview of instrumentation and methods, general quality remarks, validation methods and estimated uncertainty in the data, strengths and weaknesses of the data, format and content description, references, and processing details. It may be that this same documentation also constitutes the formal description of measurement technique.
1 None
2 Sufficient information on the measurements available to allow a user to ascertain the minimum set of information required for appropriate use
3 Comprehensive documentation on how the measurement is made available from data collector or instrument manufacturer, including a basic description of data characteristics
4 Score 3 + including documentation of manufacturer-independent characterisation and validation
5 Score 4 + regularly updated by data provider with instrument / method-of-measurement updates and/or new validation results
6 Score 5 + measurement description and examples of usage available in peer-reviewed literature

Table 7: The 6 maturity scores in sub-category Formal Measurement Series User Guidance.
The assessment can be made as follows:

Score 1: The data collector / instrument manufacturer has not provided any documentation on the measurements and how they were taken.

Score 2: There is sufficient information regarding the measurements and how they were taken to enable informed use of the data, for at least some applications. However, the information is not complete.

Score 3: A reviewed (for example, by the data provider) set of documentation is available from the data collector's, network's or instrument manufacturer's web pages. The documentation is complete.

Score 4: Score 3 + the documentation includes steps that have been undertaken to independently characterise the instrument performance; for example, the use of an ice bath to calibrate a thermometer, or a well-characterised lamp check for a lidar.

Score 5: Score 4 + updated guidance is available from the data provider's web page. A sign of updating is increasing version numbering and date. This relates both to updates in the measurement technique itself and to its understanding, and may include new validation techniques or results, or new methods of observation and their impact.

Score 6: Score 5 + the measurement technique description is published in the peer-reviewed literature, and there are one or more example usage applications documented, either in the description paper or in subsequent application papers.
3.5 Uncertainty characterisation

The category Uncertainty Characterisation assesses the practices used to characterise and represent uncertainty in a measurement series. Four sub-categories are considered, with the aim of encompassing traceability, the validation process, how uncertainty is quantified, and whether an automated quality-monitoring process is implemented that increases the efficiency of production and validation. Note that uncertainty nomenclature and practices must follow established definitions [JGCM, 2008, or any subsequent updates] to attain a score of 5 or 6 in any of the sub-categories.
3.5.1 Traceability

Traceability is the property of the result of a measurement whereby it can be related to stated references, usually national or international standards such as SI units, through an unbroken chain of comparisons, where these processing procedures all have stated / quantified uncertainties. To support a claim of traceability, the provider of a measurement must document the measurement process or system used to establish the claim, and provide a description of the chain of comparisons that were used to establish a connection to a particular stated reference. A measurement claiming SI traceability means that any unit used shall be traceable back to the seven well-defined base units of the SI system: the metre, the kilogram, the second, the ampere, the kelvin, the mole and the candela. Alternatively, traceability can be attained to recognised community standards, where SI traceability is not possible. Full traceability on a sustained basis requires in-depth instrument understanding and regular comparisons to standards, and will typically involve, and be certified by, National Measurement Institutes. A fully traceable measurement shall always have an associated total uncertainty budget that accounts for the uncertainty arising in all of the processing steps.
1 None
2 Comparison to independent stable measurement or local secondary standard undertaken irregularly
3 Score 2 + independent measurement / local secondary standard is itself regularly calibrated against a recognized primary standard
4 Score 3 + processing steps in the chain of traceability are documented but not yet fully quantified.
5 Score 4 + traceability in the processing chain partly established
6 Score 5 + traceability in the processing chain fully established
Table 8: The 6 maturity scores in sub-category Traceability

The assessment can be made as follows:
Score 1: No attempt has been made to ascertain the absolute or relative performance of the measurements.

Score 2: Periodic comparisons are made against secondary standards to ascertain drift or gross biases. For example, a temperature sensor is compared to the reading from a thermometer shelter, or a lidar is calibrated against a stable lamp or radiosonde profile. This permits traceability to a secondary standard, which is stable but of unknown absolute quality.

Score 3: Score 2 + the independent comparison measurement is itself periodically calibrated against a primary standard from a National Measurement Institute, or other holder of certified primary measurement standards. Continuing the first example under Score 2, the shelter thermometer is periodically calibrated against an NMI-certified calibration thermometer.
Score 4: Score 3 + the processing steps in the traceability chain from the fundamental measurement to SI or community-recognized standards have been identified, and at least gross estimates of the uncertainties in some of these steps have been made.

Score 5: Score 4 + many of the processing steps in the measurement are understood and quantified in a rigorous manner.

Score 6: Score 5 + the traceability is fully established and verified, and a peer-reviewed paper describing the measurement series and its uncertainty is published.
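The total uncertainty budget associated with a traceability chain can be illustrated with a minimal sketch: for uncorrelated processing steps, the combined standard uncertainty is the root-sum-of-squares of the per-step standard uncertainties, as in the GUM. The step names and values below are hypothetical, chosen only to illustrate the arithmetic, not taken from this guidance.

```python
import math

def combined_standard_uncertainty(step_uncertainties):
    """Combine independent per-step standard uncertainties in quadrature
    (root-sum-of-squares), the GUM rule for uncorrelated terms."""
    return math.sqrt(sum(u ** 2 for u in step_uncertainties))

# Hypothetical traceability chain for a temperature measurement (values in K):
budget = {
    "calibration": 0.10,        # lab calibration against NMI standard
    "digitisation": 0.05,       # data-logger quantisation
    "shield_correction": 0.20,  # radiation-shield correction
}
u_total = combined_standard_uncertainty(budget.values())
```

A provider documenting each link of the chain in this way makes the Score 4-6 requirements auditable: every named step carries a stated uncertainty, and the total follows from them.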
3.5.2 Comparability

This sub-category evaluates the extent to which the product has been validated to provide realistic uncertainty estimates and stable operations through in-the-field comparisons. Such validation is substantively distinct from traceability in that it relates to a sustained program of comparison, both in the measured environment and using lab-based experiments, to ascertain potential biases, drifts and artefacts between two measurements. Unlike for traceability, the comparison need not be to a measure that is itself traceable, directly or indirectly, to SI or community standards. However, for the highest quality measurements such comparisons should be against measurements that are themselves traceable. This could be through intercomparison campaigns, with fixed or mobile standards available in the network, or through complementary traceable measurements using distinct techniques on a sustained basis.
1 None
2 Validation using external comparator measurements done only periodically and these comparator measurements lack traceability
3 Score 2 + Validation is done sufficiently regularly to ascertain gross systematic drift effects
4 Score 3 + (Inter)comparison against corresponding measurements in large-scale instrument intercomparison campaigns
5 Score 4 + compared regularly to at least one measurement that has a traceability score >=5
6 Score 5 + compared periodically to additional measurements including some with traceability assessment >5
Table 9: The 6 maturity scores in sub-category Comparability

The assessment can be made as follows:
Score 1: No validation activity has been performed on the measurements.

Score 2: The measurement is validated only periodically. For example, there are annual comparisons to a similar instrument that does not have SI traceability as part of routine maintenance.
Score 3: Regular comparisons to a similar measurement, or appropriate characterisation technique, to ascertain measurement relative performance in a sustained manner. For example, ground-checks for radiosondes using manufacturer standard ground-check recalibrations, or regular comparisons of a lidar system to radiosondes launched contemporaneously.
Score 4: Score 3 + the instrument is characterised against other similar instruments, or instruments measuring the same measurand, in intercomparison campaigns such as the CIMO intercomparison for radiosondes, the screen temperature / humidity comparisons carried out in Algeria, or the radiometer intercomparisons at Davos. Ideally such comparisons shall be carried out in a range of environments (tropical, sub-tropical, temperate, polar) to ascertain environmental effects.
Score 5: Score 4 + compared to well characterised measurements from an independent technique or instrument on a regular basis.

Score 6: Score 5 + compared to fully traceable measurements on a periodic basis to provide robust quantification of absolute biases and drifts.
3.5.3 Uncertainty quantification

This sub-category evaluates the extent to which uncertainties have been fully quantified, and their ease of use.
1 None
2 Limited information on uncertainty arising from systematic and random effects in the measurement
3 Comprehensive information on uncertainty arising from systematic and random effects in the measurement
4 Score 3 + quantitative estimates of uncertainty provided within the measurement products characterising more or less uncertain data points
5 Score 4 + systematic effects removed and uncertainty estimates are partially traceable
6 Score 5 + comprehensive validation of the quantitative uncertainty estimates
Table 10: The 6 maturity scores in sub-category Uncertainty quantification

The assessment can be made as follows:
Score 1: No validation, and therefore no uncertainty quantification.

Score 2: Only limited information on uncertainty is available because of limited validation, but it is possible to partition random and systematic effects.

Score 3: Comprehensive information is available, so that the nature of the uncertainty is well understood: for example, whether the uncertainty varies depending upon geographic region, atmospheric state, and instrument geometry. Uncertainties are estimated for each step of the measurement production.

Score 4: Score 3 + the quantitative comprehensive information described in Score 3 is available for each data point of the measurement profile or series.
Score 5: Score 4 + the systematic effects are removed and uncertainty estimates are partially traceable to SI or community-accepted standards. In addition, where applicable, the correlated and uncorrelated uncertainty terms in the measurement series or profile are quantified. For example, the calibration of an instrument may contribute a fully correlated uncertainty, whereas the effects of fluctuating cloud cover may be uncorrelated or partially correlated in the series.

Score 6: Score 5 + the uncertainty estimates are fully traceable and validated, using other high quality traceable data, on a sustained basis.
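The distinction between correlated and uncorrelated terms matters whenever measurements are combined. As a minimal sketch (the function name and values are illustrative assumptions, not from this guidance): for the mean of n values, the uncorrelated random component averages down as 1/sqrt(n), while a fully correlated component, such as a shared calibration offset, does not reduce at all.

```python
import math

def uncertainty_of_mean(u_random, u_systematic, n):
    """Standard uncertainty of the mean of n measurements: the uncorrelated
    (random) part averages down as 1/sqrt(n); the fully correlated
    (systematic, e.g. calibration) part is common to all points and does not."""
    return math.sqrt((u_random / math.sqrt(n)) ** 2 + u_systematic ** 2)

# With 16 points, a random uncertainty of 0.4 contributes only 0.1 to the
# mean, while a correlated calibration uncertainty of 0.3 remains in full.
u_mean = uncertainty_of_mean(0.4, 0.3, 16)
```

This is why Score 5 asks for the partition explicitly: without it, a user cannot tell which part of the quoted uncertainty will average out in their application.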
3.5.4 Routine quality monitoring

Routine quality monitoring is the monitoring of data quality while processing the data. Quality monitoring is a robust and quantitative measure of how closely an individual measurement conforms to an expectation against which the observations can be compared and assessed. Such quality monitoring helps to assess, in near real time, major issues with the measurements, and permits proactive management. It may lead to a stop and restart of processing activities or measurement series if any type of error is detected. In that sense it can save significant resources whilst minimizing bad data volumes, and is a clear sign of a mature observing system with active management.
Routine data quality monitoring may require an integrated approach that includes several steps, depending on the level of complexity of the quality assurance procedures. This is directly linked to the complexity of the calibration procedures required for each measurement technique, and to the level of complexity of the processing chain (see optional assessment area Software). Moreover, robust data quality monitoring also depends on the availability of co-located redundant measurements, or high quality estimates based upon e.g. data-assimilation-based short-term forecasts. Such data facilitate the assessment of the data quality through inter-comparison of different time series, and through the development of higher-level synergistic products.

Monitoring of data quality can be applied manually by site operators and scientists, performed automatically, or both. Quality checks are typically realized through a flagging system applied to the data. Such a system shall typically include several or all of the following steps.
1. Data file format checks: catch files with missing metadata or data, incorrect data formatting, or any other type of gross error.

2. Consistency checks: identify unreliable values based upon our understanding of the physics of the considered ECV. For example, negative relative humidity values, or values that substantially exceed 100% for a sustained period, cannot be correct.

3. Calibration: verify that calibration procedures have been applied and recorded for each measurement technique following traceable procedures and, when possible, performed using different calibration approaches and reference tools. This step may also include the provision of maintenance information, and reports on the expected and the actual instrument performance by the site operators and scientists.
4. Uncertainty: identify those data whose uncertainty is beyond thresholds considered useful for most intended applications. Such thresholds may be application specific, and depend upon the extent to which the uncertainties can be segmented into systematic and random components.

5. Retrieval chain: ensure that all the processing steps from the basic data to processed products have successfully completed; this also includes the corrections typically applied to the data, as required by each measurement technique (e.g. multiple scattering, gas absorption, multipath corrections, radiation bias corrections, etc.). If automatic data processing is used, checks are implemented in the calculation chain.

6. Redundancy checks: measurement intercomparisons and cross-checking with other techniques measuring the same ECV, if physically co-located. In addition, the calculation of site atmospheric state best estimates, which combine information from several synergistic measurement platforms, can help to learn more about measurement health status. Such activities can augment the routine checking by providing an estimate of the utility of data streams. These higher-level checks can also point out deficiencies that are not necessarily detectable within individual data stream checks.

7. Time series analysis: routine near-real-time analysis of the collected time series may help identify inconsistencies and mistakes in the applied procedure, or non-physical anomalies in the measurement series. Intercomparisons of co-located redundant measurements may also help in investigating time series.

8. Collection of feedback, through the implementation of a website with a combination of: a diagnostic plots browser with thumbnail views, an interactive plotting capability, data quality documentation, a problem reporting system, and instrument and maintenance logs.

Data quality flags should be applied without rejecting data, as subsequent innovations in instrument understanding may permit reprocessing and recovery of good values.
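A non-destructive flagging routine covering steps 1 and 2 above can be sketched as follows. The record layout, flag names and the relative-humidity tolerance are hypothetical choices for illustration; the point is that data are flagged, never deleted.

```python
def quality_flags(record):
    """Assign quality flags without rejecting any data.
    `record` layout is a hypothetical example: {"station": str, "rh": [float]},
    with relative humidity (RH) in percent."""
    flags = []
    # Step 1: file format / metadata completeness check
    if not record.get("station"):
        flags.append("MISSING_METADATA")
    # Step 2: physical consistency check for RH
    for value in record.get("rh", []):
        if value is None:
            flags.append("MISSING_VALUE")
        elif value < 0 or value > 105:  # small supersaturation tolerated
            flags.append("RH_OUT_OF_RANGE")
    return flags

# A file with no station identifier and two impossible RH values:
flags = quality_flags({"station": "", "rh": [-5.0, 50.0, 120.0]})
```

Because the flags travel with the data rather than replacing them, a later reprocessing with improved instrument understanding can revisit every flagged value.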
1 None
2 None
3 Methods for routine quality monitoring defined
4 Score 3 + routine monitoring partially implemented
5 Score 4 + monitoring fully implemented (all production levels)
6 Score 5 + routine monitoring in place with results fed back to other accessible information, e.g. meta data or documentation
Table 11: The 6 maturity scores in sub-category Routine quality monitoring

The assessment can be made as follows:
Score 1: No automated quality monitoring in place.

Score 2: No automated quality monitoring in place.
Score 3: A metric for routine quality monitoring has been defined (e.g., radiometric noise of one or more channels of the instrument is significantly above specification, the number of good measurements is below a threshold value, agreement between duplicate measurements, measurements fail to attain stated height requirements, procedures, data used in comparisons, setting of thresholds for deviations, etc.).

Score 4: Score 3 + the proposed monitoring is partially implemented, e.g., for a subset of the measurements that contribute to a global collection but not for the remainder.

Score 5: Score 4 + quality monitoring is implemented for all the measurements. Deviations in performance are reported to the technicians undertaking the measurements and resolved in a timely manner.

Score 6: Score 5 + results of routine quality monitoring are reflected in metadata and documentation. For example, the quality monitoring procedures and results are described in the peer-reviewed or grey literature.
3.6 Public access, feedback and update

This category contains five sub-categories related to the archiving and accessibility of the measurement record, how feedback from user communities is gathered, and whether that feedback is used to update the measurement record. It also concerns version control, and the archival and retrieval of present and previous versions. A mature measurement system would be available routinely to allow operational use, with formal version control and mature archival procedures. Furthermore, a mature measurement data stream would have an established mechanism to collect, and act upon, user feedback.
3.6.1 Access

Access evaluates the ease of distributing to users the raw and processed data, documentation, and any necessary source code used to process the data from the raw measurement to geophysical or radiance parameter space. Public access means that the data are available without restrictions for at least academic use, but such access may still be subject to a reasonable fee. The raw data may only be provided upon request, but a mechanism for requesting should be readily apparent in such cases. The highest scores in this category can only be attained for data provided free of charge, without restrictions on use and re-use.

Data provider here means either the data collector or organisations such as space agencies, national meteorological centres or research institutes. An institutionalised data provision is considered to be more robust (and hence mature) than provision by an individual investigator or group.
1 Data may be available through request to trusted users
2 Data available for use through originator
3 Data and documentation available through originator
4 Score 3 + available through recognized data portal
5 Score 4 + source data, code and metadata available upon request
6 Score 5 + no access restrictions apply
Table 12: 6 maturity scores in sub-category Access

The assessment can be made as follows:
Score 1: The measurement record is not yet ready to be given to users; it may be available to beta-users for testing. The data originator is still conducting initial validation of the observed product.

Score 2: The measurement record is now ready to be given to users without any restrictions on academic usage. Users can get the measurement data either by requesting it from the data originator, or from a publicly accessible site.

Score 3: The measurement series and appropriate documentation to understand the measurements are publicly available for academic use through either the data provider or a publicly accessible site. Academic re-use is permitted.

Score 4: As Score 3 + measurement series are available through a recognised and measurement-appropriate data portal such as the Copernicus Climate Change Service Data Portal, the NDACC portal, or NOAA's National Centers for Environmental Information.

Score 5: As Score 4 + the source data, metadata and any processing code are also archived by the data provider, allowing subsequent reprocessing of the full measurement series if required by a third party.

Score 6: As Score 5, but there are no restrictions on use or re-use of the data, metadata, or code, and all aspects are made available free of charge.
3.6.2 User feedback mechanism

User feedback is important for developers and providers of measurement records to improve the quality, accessibility, etc. of a given measurement series. This sub-category evaluates whether mechanisms are established to receive, analyse, and use user feedback. Feedback can reach a measurement provider in many ways, but needs to be organised in such a way that it can be used to improve a measurement record and/or the service around it. In the scientific community, measurement records are presented and discussed at workshops and conferences. A scientist may take messages back to his/her lab and begin to implement improvements, if resources are available. A higher maturity for gathering feedback is reached when a measurement record has been institutionalised and the responsible institute has established regular feedback processes.
1 None
2 Ad hoc feedback
3 Programmatic feedback collated
4 Score 3+ consideration of published analyses
5 Established feedback mechanism and international data quality assessment results are considered
6 Score 5 + Established feedback mechanism and international data quality assessment results are considered in continuous data provisions
Table 13: 6 maturity scores in sub-category User feedback mechanism

The assessment can be made as follows:
Score 1: The measurement record is intended as what-you-see-is-what-you-get, and so no feedback mechanism is constituted.

Score 2: Ad hoc feedback is received and may be acted upon.

Score 3: A programmatic collection of user feedback is instigated, which may relate to a broad network of measurements, and lessons learnt are disseminated periodically, either formally or informally.

Score 4: Score 3 + the measurement program takes into account findings documented in the peer-reviewed literature.

Score 5: The measurement program has a well-established and recognized system for the collection of feedback, which allows users to provide and track that feedback. The results of international comparisons and campaigns are considered.

Score 6: An international review panel (such as a network task team or management group) that meets regularly would indicate a mature system that takes account of innovations and feedback. A further sign of this is whether interim data records are provided (operational continuation of a measurement record employing the same procedures), and whether feedback is also considered for these.
3.6.3 Updates to record

Updates to record evaluates whether data records are systematically updated when new observations or insights become available, or whether this is done in an ad hoc fashion, if at all. A more ad hoc update cycle indicates that updating depends very much on irregular funding, and is not done by a larger institution that provides the update as part of an operationally oriented service. More mature measurement series will tend to be updated in an operational manner that assures both their sustainability and their suitability for applications requiring reliable data updates. The most mature measurement systems distribute data in near real time so that they can be used in forecasting applications.
1 None
2 None
3 Irregularly, following accrual of a number of new measurements or new insights
4 Regularly updated with new observations and utilising input from established feedback mechanism
5 Regularly and operationally by stable data provider, as dictated by availability of new input data or new innovations
6 Score 5 + initial version of measurement series shared in near real time
Table 14: 6 maturity scores in sub-category Updates to record

The assessment can be made as follows:
Score 1: No update is made to the measurement series after initial release.

Score 2: No update is made to the measurement series after initial release.

Score 3: There are irregular updates to the measurement series record available to the public. Such updates may result from user feedback or innovations in understanding, or simply constitute a string of new measurements. Such updates are made in an ad hoc (un-timetabled) manner.

Score 4: This can be seen by regular updates to the measurement records, accompanied by documentation of updates at reasonable frequency. For example, a regular daily, monthly or annual update occurs to append new observations. Updates periodically include innovations to account for user feedback. In cases where no feedback has been received, despite a facility for feedback being made available, this should be stated.

Score 5: The updates to append data are made at a stated regularity, allowing the operational usage of the measurement series in applications. Updates periodically take into account methodological innovations that improve the utility of the measurement series. Such updates are clearly differentiated from straight data updates.

Score 6: Score 5 + a version (which may not be the final processed version) is made available in near real time (typically defined as within 2-3 hours) for applications that can make use of this information for forecasting purposes.
3.6.4 Version control

Version control allows a user to trace back the different versions of algorithms, software, format, input and ancillary data, and documentation used to generate the measurement record under consideration. It allows clear statements about when and why changes have been introduced, and allows users to document the precise version of the data they used, thus enabling replication of users' analyses. Typically, mature version control will follow a version control protocol that is openly documented, and may include, in addition to the version number, a date stamp on each version. The most mature version control should allow users to retrieve previous versions if required.
1 None
2 None
3 Versioning by data collector
4 Version control institutionalised and procedure documented
5 Fully established version control considering all aspects
6 Score 5 + all versions retained and accessible upon request
Table 15: Six maturity scores in sub-category Version control

The assessment can be made as follows:
Score 1: No versioning system is in apparent use for the measurement series.

Score 2: No versioning system is in apparent use for the measurement series.

Score 3: The measurement series has informal version control, undertaken and documented by the data collector, that is used internally to document versions.

Score 4: Data version control is transferred from the data collector to an institutionally maintained archive, and formalised. The version control protocol shall be documented. For example, a versioning scheme N.x.y.z might be instituted, and the reasons for incrementing any of N, x, y, or z will be clearly articulated.

Score 5: The data provider has established full version control for the measurement record, including versions of algorithms, software, format, input and ancillary data, and documentation.

Score 6: Score 5 + all historical versions, since the instigation of version control, can be made available to interested users upon request.
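The N.x.y.z scheme mentioned under Score 4 can be sketched in a few lines; the helper name and the convention that bumping one component resets those to its right are illustrative assumptions, not a prescription of this guidance:

```python
def bump_version(version, level):
    """Increment one component of an N.x.y.z version string and reset the
    components to its right. Which level to bump (e.g. N for a full
    reprocessing, z for documentation-only changes) is the policy the
    data provider's protocol must document."""
    parts = [int(p) for p in version.split(".")]
    idx = "Nxyz".index(level)
    parts[idx] += 1
    for i in range(idx + 1, len(parts)):
        parts[i] = 0
    return ".".join(str(p) for p in parts)

# A methodological change bumps x and resets y and z:
new_version = bump_version("2.1.4.7", "x")
```

Documenting the meaning of each level in this way lets users judge from the version string alone whether a new release could affect their analyses.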
3.6.5 Long-term data preservation

Long-term data preservation relates to the preservation of measurement series records. According to Long Term Data Preservation guidelines (http://earth.esa.int/gscb/ltdp/), an archive should keep more than one copy, use different media / technologies, and use different locations. Most important is to retain the raw data (e.g. the solar spectral measurements of an FTIR) and the necessary metadata, which may allow subsequent reprocessing.
1 None
2 None
3 Local archive retained by measurement collector
4 Each version archived at an institutional level on at least two media
5 Data, raw data and metadata is archived at a recognised data repository such as a National Meteorological Service, national archive or international repository.
6 Score 5 + all versions of measurement series, metadata, software etc. retained, indexed and accessible upon request
Table 16: Six maturity scores in sub-category Long-term data preservation

The assessment can be made as follows:
Score 1: No archiving system is in apparent use for the measurement series.

Score 2: No archiving system is in apparent use for the measurement series.

Score 3: The measurement series has a local archive, maintained by the instrument data collector, which may be used to retrieve data on an ad hoc request basis, but is dependent upon the data collector or a single small group.

Score 4: Data archival is transferred from the data collector to an institutionally maintained archive and formalised. The data are preserved on at least two media, in two distinct locations.

Score 5: Data archival is undertaken by a recognised institution with expertise in data preservation. The preservation extends to raw data, metadata, software, and data versions.

Score 6: Score 5 + all historical versions since the instigation of archival can be uniquely identified, and made available to interested users upon request.
3.7 UsageThis category contains two sub-categories related to the usage ofmeasurement series inresearchapplicationsandfordecisionsupportsystems.Publicandcommercialexploitationmeans the use in applications that directly support economic or public decisions, e.g., aradiosonde measurement may be used in an NWP model or forecast assessment, or anozonemeasurementmaybeusedtomonitorstratosphericozoneconditions,andhencetheeffectiveness of the Montreal Protocol and its amendments. In addition all usages increatingclimatedatarecords,andcitationsinreports,suchastheIntergovernmentalPanelforClimateChange (IPCC) reports, that supportdecisionsandpolicymakingonmitigationandadaptationarecountableforthepublicandcommercialexploitationsub-category.
The two sub-categories allow for a separate assessment of the usage of measurementrecords, i.e., the assessment result can state a highmaturity for usage in research, and alower or no maturity for public and commercial exploitation. For the overall score, it isimportant to know for which application area(s) the measurement was intended. ThisinformationshallcomefromSection1oftheGAIA-CLIMMeasurementRecordDescriptionForm(seeAppendixA).Ifthisdescriptionisonlypointingtouseinacademicresearch,thenonlythatcategoryshallbeusedtodisplaytheoverallmaturityforthiscategory.
3.7.1 Research

Research applications of a measurement series can be evaluated by its appearance in publications and by citations of such publications.
1 None
2 Benefits for research applications identified
3 Benefits for research applications demonstrated by publication
4 Score 3 + Citations on product usage occurring
5 Score 4 + product becomes reference for certain applications
6 Score 5 + Product and its applications become references in multiple research fields
Table 17: 6 maturity scores in sub-category Research

The assessment can be made as follows:
Score 1: The measurement series is not used yet.

Score 2: An available research plan, or similar document, outlines actual or intended usage of the measurement series in research applications.

Score 3: A peer-reviewed publication exists that describes the usage of the measurement series in a research application.

Score 4: The peer-reviewed publication under Score 3 is cited by peer-reviewed publications of other applications.

Score 5: The measurement series is used as a reference / contributing series in almost all peer-reviewed publications for a specific application.

Score 6: The measurement series is used as a reference in almost all peer-reviewed publications for applications in different research fields, e.g., climate modelling and climate system analysis.
3.7.2 Public and commercial exploitation

As described above under Usage, public and commercial exploitation covers any direct use in real-time monitoring, forecasts, infrastructure planning, support to agencies or other business areas such as insurance, and indirect support, e.g., through citations in IPCC reports, to decision and policy making in socio-political contexts.
1 None
2 Potential benefits identified
3 Use occurring and benefits emerging
4 Score 3 + societal and economic benefits discussed, data being distributed via appropriate data portals
5 Score 4 + societal and economic benefits demonstrated
6 Score 5 + influence on decision (including policy) making demonstrated
Table 18: 6 maturity scores in sub-category Public and Commercial Exploitation

The assessment can be made in the following manner:
Score 1: The product is not yet used for any public or commercial application.

Score 2: An available report exists suggesting that the measurement series can be used for certain public or commercial applications, and can be found online or in a recognised repository.

Score 3: The product has been used in public and/or commercial applications, and a report (or reports) is available through appropriate data portals for use. For example, the data are available via the Climate Data Store of the Copernicus Climate Change Service, or are used in NWP or reanalyses.

Score 4: The results of studies in Score 3 are used for a relevant public or commercial system. For example, a state or national government planning report is available that cites the study using the measurements under consideration, or the forecast resulting from their use enables decisions by public and commercial actors.

Score 5: The results of studies in Score 4 are used in an application area, and have resulted in demonstrable societal and economic benefits.

Score 6: Substantive contribution to national and international public decision making and applications, such as climate policy discussions, or to economic applications. One can also point to the use of a measurement series in other applications that have economic benefits, such as use by an insurance company for decision making, or use in a climate service, e.g., the major application areas mentioned in the WMO Global Framework for Climate Services (agriculture and food security, disaster risk reduction, health and water).
3.8 Sustainability

This category pertains to aspects of sustainability, and hence suitability, of any given measurement program for scientific, operational, and societal applications. For a measurement program to be used in critical applications, its long-term sustainability must be assured. There are three primary strands to the sustainability of a measurement program, relating to: siting environment, scientific and expert support, and programmatic (funding) support.
Where an international measurement network is being assessed, the network shall typically consist of individual measurement sites operated by distinct legal entities, with distinct funding mechanisms, and in a variety of siting environments. In such cases, there are two options. One is to provide a typical score that is representative of the network as a whole, but this is then not indicative of the maturity of individual contributing sites. The alternative, preferred option is that this assessment be performed site-wise, with the site-by-site scores retained as metadata associated with the assessment, and the range of scores highlighted appropriately in the assessment summary by providing both a mean value and the range. The latter approach is preferred because it enables, for example, applications that require a representative sampling environment to use the site-by-site scores metadata to retain only the appropriate subset of the network that is sited in regionally representative locales. A site-by-site assessment also avoids conflating contributing entities with long-term commitment with other contributors which may be less secure. This then helps network coordinators to highlight potential areas for within-network improvement / remediation. The range of individual site scores across the network may also provide a useful indicator of the overall maturity of the network.
3.8.1 Siting environment

Siting environment only applies to fixed measurement assets, for which observations are taken repeatedly from a single location (including weather balloons, which originate from a constant location but may drift), or to mobile observations using repeating transects. Non-repeating measurements made from aircraft and other mobile platforms should leave this entry blank, and use solely the remaining strands to assign a score under sustainability. Within this category, consideration is limited to the representativeness of the site / transect of its immediate surrounding environment / landscape. Questions of network design are outside the scope of this maturity assessment, although they are clearly important in network design and expansion considerations.
1 None
2 Site environment is stable in the short term
3 Score 2 + site ownership is sustainable
4 Score 3 + site is representative of a broader region around the immediate location
5 Score 4 + site ownership, immediate environment is likely to be unchanged for decades
6 Score 5 + long-term ownership and rights are guaranteed
Table 19: Six maturity scores in category Siting environment

The assessment can be made in the following manner:

Score 1: No information is available about the siting of the instrument used to make the measurement, or its representativity of the local surroundings and their environmental conditions.
Score 2: The instrument location is known and characterised by photography, satellite imagery or other means, and the environment is unlikely to be modified by direct human influence in the short term, beyond maintaining environmental stability through, e.g., grass mowing, tree and bush management, etc.
Score 3: As Score 2, plus the ownership of the site is sustainable, such that the measurement program is viable at the specific location for the foreseeable future.

Score 4: As Score 3, plus the site is representative of a broader region surrounding its immediate location. Here, the broader region may be application and ECV dependent. For use in satellite characterisation (the purpose of GAIA-CLIM) this may extend to a typical satellite pixel field of view, for example, where the thermal, albedo and other surface characteristics are sufficiently homogeneous for the measurement to be deemed representative.

Score 5: As Score 4, plus the site ownership and the immediate surrounding environment are likely to be unchanged for decades. Evidence for this may arise from planning documents, government ownership, or other relevant national land designations.

Score 6: As Score 5, but the long-term site ownership and management is assured. For example, the measurement is undertaken on managed government property that is protected by statute.
3.8.2 Scientific and expert support

Scientific and expert support evaluates the degree of scientific, technical and measurement science expertise that underpins the measurement programme. Higher quality networks will benefit from sustained curation, development and exploitation, which typically arise from a strong infrastructure support basis and a continuous recruitment policy that is able to fill the personnel and skills gaps that might occur.
1 None
2 Minimal scientific support required to sustain the program is available
3 Relevant instrument expertise is available to support the measurements
4 Score 3 + at least two experts available to support the measurement program operation
5 Active instrumentation research and development being undertaken
6 Not used
Table 20. Six maturity scores in category Scientific and expert support
The assessment can be made in the following manner:

Score 1: No scientific or expert support is available to the measurement program.

Score 2: A minimal level of scientific or technical support is available, sufficient to maintain the measurement program in a sustained manner in the absence of major failures or events.
Score 3: There are effectively sufficient resources available to ensure continuation and upkeep of the measurement system on a sustained basis, which may include calibration / replacement of sensors, effecting repairs, and monitoring of instrument performance to identify and correct obvious faults.

Score 4: As Score 3, but the maintenance and upkeep is not dependent upon a single engineer or scientist, such that the support for the measurement series can be sustained.

Score 5: In addition to sustained upkeep, there is active scientific assessment of the measurements and investigation of potential improvements in either the instrument or its performance characterisation, including traceability and uncertainty quantification.

Score 6: Not used, as no further support beyond Score 5 is envisaged.
3.8.3 Programmatic support

This category assesses the long-term programmatic support that underpins the measurement program. Typically, higher quality measurements will be supported by sustained national or international programs, and infrastructure support that can assure longer-term operation and sustainability.
1 None
2 Project-based funding support available
3 Score 2 + with expectation of follow-on funding
4 Score 3 + not dependent upon a single investigator or funding line
5 Sustained infrastructure support available to finance continued operations for as far as can be envisaged given national and international funding vagaries
6 Score 5 + support for active research and development of instrumentation or applied analysis of the observations
Table 21. Six maturity scores in category Programmatic support
The assessment can be made in the following manner:

Score 1: No dedicated programmatic support is evident for the measurement program.

Score 2: There is dedicated funding support, but it is tied to a project and, therefore, the support is not envisaged to be continuous.

Score 3: As Score 2, but there is a reasonable expectation that funding will be renewed.

Score 4: As Score 3, but the measurement program is supported by multiple investigators and/or funding streams, to ensure long-term sustainability.
Score 5: The measurement program funding arises from a sustainable funding stream, such as national or international infrastructure funds, which are stable and unlikely to be removed in the foreseeable future.

Score 6: As Score 5, but the site is also funded to actively analyse and develop the measurement program, ensuring that the highest possible quality observations are always undertaken. This may be ascertained by evidence of peer-reviewed papers, book chapters, or membership of committees / working groups / task teams of high-quality observational networks such as GRUAN, NDACC, AERONET, EARLINET, and TCCON.
3.9 Software readiness (optional)

As noted at the start of Section 3, this major strand is optional, and shall apply only to those measurements where routine, automated and substantive processing occurs from the raw measured data to the provided geophysical parameters of the measurement series.
Cases where this would be appropriate include measurement series where the directly measured parameter is a digital count, a radiance, a photon count or some other indirect proxy for the reported measurand, and processing exists to convert from the measured quantity to the reported quantity. Conversely, where the measurement constitutes a direct proxy for the measurand, such as a platinum resistance thermometer or an anemometer, and the conversion is straightforward, the software readiness category is not appropriate.
It should be agreed, and documented in the assessment, whether this strand is applicable ahead of time, when deciding the rules of the round. Where it is not applicable, the column should be greyed out in the summary (see Figure 2). Note that the software readiness strand relates solely to the software that is used in the production of the primary measurement products. It does not consider software, often created and curated by third parties, used in subsequent applications of the data, including post-processing and dataset construction.
In this major category there are four sub-categories. These are mainly meant for self-assessment, because the information is rarely publicly available. The software readiness category provides information on the availability and maintainability of software used to generate the measurement record. All software used to manipulate the measurement into its distributed products should be assessed. High maturity is indicative of a system that is institutionally well understood, and does not depend on specific individuals who have known the software since its origin. Software becomes more easily understandable if the programming follows standards and the installation and usage are documented. Software is also maintainable if it can be ported to other locations and across operating systems. More mature software may also tend to be open source, and open-source code should be encouraged where it can be attained. However, for cases where the data are used operationally, it may not be possible or practical to share the full processing code.
3.9.1 Coding standards

Coding standards are a set of conventions / rules specific to a coding language, which describe style, practices and methods that greatly reduce the probability of introducing bugs. This is especially important in a team environment or group collaboration, so that uniform coding standards are used, which helps to reduce oversight errors and saves time in code reviews. It is key to assuring the maintainability of the code at reasonable cost. There are ISO standards available for software coding which may be applicable. Whether such ISO standards are to be used should be agreed in the 'rules of the round'.
1 No coding standard or guidance identified or defined
2 Coding standard or guidance is identified or defined, but not applied
3 Score 2 + standards are partially applied and some compliance results are available
4 Score 3 + compliance is systematically checked in all code, but code is not yet compliant with the standards
5 Score 4 + measurement provider has identified departures from the standards and actions are planned to achieve full compliance
6 Code is fully compliant with standards
Table 22: The six maturity scores in sub-category Coding standards
Coding standards can be evaluated as follows:

Score 1: There is no evidence available that coding standards have been considered.

Score 2: 'Standard identified / defined' means that the measurement record producer has identified or defined the standards to be used, but has not applied them. This information can most often be found in software description documents or programming guidelines available from web pages, or by asking the measurement provider.

Score 3: The measurement provider has started to apply the standards, and has implemented procedures to check compliance. This information may be available by asking the measurement provider.

Score 4: Score 3 + procedures are systematically applied to check compliance, and the results are often available as internal reports.

Score 5: Standards are systematically applied in all code and compliance is systematically checked in all code. The code is not yet fully compliant with the standards, but improvement actions to achieve full compliance are defined.

Score 6: At this stage the software shall be fully compliant with its description and the documented standard. This includes procedures to check the compliance and the results of the unit tests conducted.
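The cumulative logic of Table 22 can be sketched in code. The Python fragment below is a hypothetical illustration (not part of the Guidance; the flag names are our own shorthand for the Table 22 criteria) of how an assessor might map evidence about coding standards onto the six scores:

```python
def coding_standards_score(identified: bool, applied: bool,
                           checked_everywhere: bool,
                           remediation_planned: bool,
                           fully_compliant: bool) -> int:
    """Map evidence about coding standards onto the 1-6 maturity scale.

    Each flag corresponds to one criterion in Table 22; because the
    scores are cumulative, a higher score is only awarded when all
    lower criteria are also met.
    """
    criteria = [identified, applied, checked_everywhere,
                remediation_planned, fully_compliant]
    score = 1
    for met in criteria:
        if not met:
            break
        score += 1
    return score

# A provider that has identified a standard and partially applied it,
# but does not yet check compliance systematically, sits at Score 3.
print(coding_standards_score(True, True, False, False, False))  # -> 3
```

Because the scale is cumulative, skipping a lower criterion caps the score regardless of the higher flags, mirroring the "Score N +" construction of the table.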
3.9.2 Software documentation

Software documentation is key to ensuring usability, portability and operator understanding. This sub-category is concerned primarily with whether the code is documented with proper headers, change history, and sufficiently complete and understandable comments describing the processes. Further steps are whether the README file is up to date, whether documentation is available which describes the design rationale and architectural overview of the software, and whether a software installation and user manual is available.
1 No documentation
2 Minimal documentation
3 Header and process description (comments) in the code
4 Score 3 + a draft software installation / user manual available
5 Score 4 + enhanced process descriptions throughout; installation / user manual complete
6 Score 5 + code and documentation is publicly available from a web page
Table 23: The six maturity scores in sub-category Software documentation
The assessment can be made, for example, as below:

Score 1: No software documentation exists.

Score 2: There are a header and limited comments in the code, and installation instructions are available, but no other documentation.

Score 3: The README file should at least contain information on configuration instructions, installation instructions, operating instructions, copyright and licensing, and contact information.

Score 4: Score 3 + a software user manual should at least contain information on the software concept and design, and provide instructions for installing and using the software.

Score 5: The code is very well documented and the installation / user manual is complete and available on the data provider's web page.

Score 6: The code and documentation are openly available through a website to allow users full understanding of the processing suite.
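As a concrete illustration of the Score 3 README criterion, the short Python sketch below (hypothetical, not part of the Guidance) checks a README text for the sections listed above:

```python
# Sections a Score-3 README should at least cover (see Score 3 above).
REQUIRED_SECTIONS = [
    "Configuration instructions",
    "Installation instructions",
    "Operating instructions",
    "Copyright and licensing",
    "Contact information",
]

def missing_readme_sections(readme_text: str) -> list:
    """Return the required sections absent from the README text,
    using a case-insensitive substring match."""
    lower = readme_text.lower()
    return [s for s in REQUIRED_SECTIONS if s.lower() not in lower]

readme = "Installation instructions\n...\nContact information: pi@example.org"
print(missing_readme_sections(readme))
# -> ['Configuration instructions', 'Operating instructions', 'Copyright and licensing']
```

A substring match is deliberately crude; a real assessment would judge whether each section is actually informative, not merely present.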
3.9.3 Portability and numerical reproducibility

Portability and numerical reproducibility concerns the usability of the software in different environments (different computing platforms such as Linux, Solaris, Mac OS, Windows, etc., and different compilers such as Intel, IBM, GNU, Portland, etc.), and whether the results are numerically reproducible. It is important for migrating software from old to new computer systems and from one place to another.
1 Not evaluated
2 Reproducible under identical conditions
3 Reproducible and portable
4 Third party affirms reproducibility and portability
5 Score 4 + third party can install the code operationally
6 Score 5 + turnkey system
Table 24: The six maturity scores in sub-category Portability and numerical reproducibility
The assessment can be made, for example, as below:

Score 1: 'Not evaluated' means this has not been considered at all.

Score 2: The measurement series investigator affirms that the software reproduces results when rerun on the same platform with the same input and the same compiler. This information can be obtained by asking the investigator.

Score 3: The software produces numerically reproducible results, to specified precision, on different computing platforms (such as Linux, Solaris, Mac OS, Windows, etc.) and/or with different compilers (such as Intel, IBM, GNU, Portland, etc.).

Score 4: Score 3 + a third party can install the code operationally with minimal manual effort. Runs reveal that the output is numerically reproducible (within machine rounding errors). This information shall typically be found in software description documents available from the measurement series investigator's web pages.

Score 5: Score 4 + the code is already used by a third party in an operational environment under configuration control. This shall be described in the software installation / user manual.

Score 6: Turnkey software is designed, supplied, built or completely installed ready to operate. The term implies that the end user just has to 'turn a key' and start using the software, e.g., Linux OS. This shall be described in the software user manual.
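A minimal check of the numerical reproducibility criterion in Scores 2-4 can be sketched as follows. This is a hypothetical Python fragment, not part of the Guidance; the tolerance values are placeholders that would be agreed in the 'rules of the round':

```python
import math

def runs_agree(run_a, run_b, rel_tol=1e-9, abs_tol=0.0):
    """Return True when two runs of the processing chain agree
    element-wise to the specified precision."""
    if len(run_a) != len(run_b):
        return False
    return all(math.isclose(x, y, rel_tol=rel_tol, abs_tol=abs_tol)
               for x, y in zip(run_a, run_b))

# Same platform, same input, same compiler (Score 2): expect agreement
# to near machine precision.
print(runs_agree([273.15, 101325.0], [273.15, 101325.0]))  # -> True

# Cross-platform or cross-compiler runs (Score 3) would typically be
# compared with a looser, agreed abs_tol for machine rounding errors.
print(runs_agree([273.15], [273.15 + 1e-4], abs_tol=1e-3, rel_tol=0.0))  # -> True
```

The distinction between the two calls mirrors the distinction between Scores 2 and 3: identical conditions should reproduce essentially bit-for-bit, while portability only requires agreement within a stated precision.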
3.9.4 Security

Security is associated with software content that either has the potential to destroy files and complete environments, or relates to file transfer between compute environments; neither should be present in the software. The security category also checks whether the file system can be accessed from outside, as this may hamper the integrity of the measurement series generation environment.
1 Not evaluated
2 Data provider affirms no security problems
3 Submitted for data provider's security review
4 Passes data provider's security review
5 Continuously passes periodic data provider's review
6 Not used
Table 25: The six maturity scores in sub-category Security
The assessment can be made, for example, as below:

Score 1: 'Not evaluated' at this stage means that software security issues have not been considered to date.

Score 2: The data provider has tested for security issues in the code and found none. This information can be obtained by asking the data provider.

Score 3: The software has been submitted for the data provider's security review; this information can be obtained by asking the data provider. This is a necessary step before porting the software from a research environment to an operational environment.

Score 4: The software has passed the data provider's quality assurance and security tests. Information on this shall be obtained from the software installation / user manual.

Score 5: The data provider undertakes a security assessment periodically, and also whenever there is a software update; the results shall be available from the updated software installation / user manual.

Score 6: Not used.
4. Challenges to adoption
The approach introduced herein will be used in the first instance solely for the internal purposes of GAIA-CLIM. During the development of this guidance, a number of internal project partners have used it to classify a number of non-satellite measurement systems, and that feedback has been used to modify the criteria to ensure that the approach is fit for purpose, at least for the specific needs of GAIA-CLIM. We can therefore be reasonably confident that it should be applicable more broadly, to aid the consideration of the maturity of non-satellite measurement characteristics for various possible purposes.
However, there is also a broader need to articulate and adopt a system of systems approach, which this documentation may help to nurture [GCOS, 2014, 2015]. There are significant challenges to its likely broad adoption, which were highlighted in the recent GCOS meeting in Ispra [GCOS, 2014], and which are expanded upon here. The most appropriate mechanism to take this forwards, at least in the atmospheric domain, would be through the WIGOS program, recently adopted by WMO at its 2015 Congress.
4.1 Naming nomenclature for existing networks across and within domains

Perhaps the largest challenge is that currently a broad range of non-satellite measurement networks and infrastructures have been called 'reference', 'baseline' or 'comprehensive' that, when assessed against the criteria detailed in Section 3, would instead fall within a different category. The historical lack of clarity regarding a system of systems architecture, taken together with fractured governance and support structures, has led to varied use and adoption of network nomenclature and practices both across, and within, Earth Observation science disciplines. This means that what different sub-communities concerned with environmental measurements refer to as 'baseline', 'reference' or indeed 'comprehensive' network measurements is not always the same. Often it is not even remotely similar. If a system of systems approach is to be broadly adopted, significant further work is required to reconcile the disparate approaches to network designations, and to manage the transition to a more trans-disciplinary approach to network assignations. There are several risks / challenges in any such transition:
1. National or international funding support for a measurement program may be tied to its present designation. There is a risk, in enforcing any change, that the funding support for the program is endangered. Take, for example, the ocean reference network. This network is not a reference network in the sense advocated here, but rather closer to baseline capability. But it is still the best set of observations available, and risking its loss would be a significant mistake.

2. Users may use a measurement program because of its current designation, and may get confused if measurement programs are reassigned or renamed without adequate consultation or justification.

3. The observers undertaking the measurement program may not fully understand the implications if updates to protocols and/or practices are required.
4. Program support sustainability and harmonization of practices across national boundaries must be ensured.

On the flip side of these concerns, allowing the status quo to continue means that users referring to, e.g., a 'reference' network in the marine, atmospheric and composition communities (just by way of an example) may be comparing measurement programs that differ widely in their fundamental measurement characteristics and qualities and, therefore, in their suitability for a given application. The status quo places the responsibility of understanding the measurement systems and networks, on a system-by-system and even ECV-by-ECV basis, firmly on the end user. Experience shows that end users are, understandably, unlikely to have either the time or the necessary knowledge / expertise to fully understand the distinctions that may exist between similarly named programs, and assume, incorrectly, that they are equivalent. This is a barrier to the effective usage of existing EO capabilities by scientists, policymakers and other end users, and will continue to be so unless and until a more holistic approach, such as suggested in this guidance, is adopted.
4.2 End-user adoption

It is clear that, alongside adoption and designation of a tiered network capabilities framework, it is necessary to provide material to aid users in understanding what the tiers mean, and to show real case examples of how they can be used. GAIA-CLIM will, through its work packages, provide case study examples in the domain area of satellite measurement characterisation. But further examples in other domain areas and application areas are necessary, which will be beyond the remit of GAIA-CLIM.
4.3 Realising technological and scientific benefits of a tiered set of capabilities

Even if the tier designations and criteria documented herein were adopted, there would remain the challenge of ensuring linkages between the different components to realise benefits. This includes aspects such as infrastructure co-location, intercomparison campaigns, information sharing, and training and development. Such inter-linkages will become both more obvious and more realisable if a system-of-systems architecture approach and assessment is adopted. Some subset of these aspects that touch upon satellite calibration / validation are covered within the living Gap Assessments and Impacts Document of GAIA-CLIM, to which the interested reader is referred (see www.gaia-clim.eu).
4.4 Potential future applicability to the satellite domain

The tiers and their designations for GAIA-CLIM detailed herein pertain explicitly only to non-satellite measurement capabilities. Their extension to satellite measurements is non-trivial. Thus, the guidance in Section 3 is explicitly solely for application to non-satellite measurements.

In particular, the relation of fidelity and spatio-temporal completeness, which is clearly applicable to the non-satellite measurements domain, does not readily apply to satellite measurements. For satellites, the fidelity instead depends on instrument design and its characterisation both prior to the launch, and using onboard calibration. For channels where clouds have an impact, it also depends upon the efficacy of cloud detection techniques. Also, metadata content is less of a concern, as the historic evolution of the metadata has led to standards which are both comprehensive and broadly applied, with very little difference among satellite data suppliers. Finally, satellite systems do not form networks in a geographical sense, with the exception of geostationary sensors that always observe the same area.

However, some of the characterisations given for the observational tiers for non-satellite capabilities are broadly applicable to satellite-based measurement systems, with some additional interpretation.
• Reference quality measurements in space (often called benchmark measurements within that community) would fulfil criteria on very high accuracy and traceability to the SI standard. Currently, no such system exists, but several, such as the Climate Absolute Radiance and Refractivity Observatory (CLARREO) and Traceable Radiometry Underpinning Terrestrial- and Helio- Studies (TRUTHS) missions, have been proposed. Such missions potentially represent a calibration laboratory in orbit for the purpose of accurately measuring climate change. A specific value of the posited CLARREO / TRUTHS style measurements lies in their high value as a reference for remaining space-based instruments not built specifically to measure climate change. Important for satellite reference systems, in particular in the infrared range, is that measurements are taken with high spectral resolution, which enables analysis and characterisation of the performance of instruments in space, for instance with respect to the changing spectral response of filter radiometers that measure an integral over a broader spectral range. In addition, higher spectral resolution may also be calibrated more accurately. Comparison of such posited measurements to other satellite measurements would establish an unbroken chain for SI-traceable accuracy on orbit. Close to such a system is the GNSS-RO technique, where the base unit is a time delay that is traceable; this may be able to constitute a reference measurement, assuming all steps in the processing chain can be understood and their uncertainty quantified.
• The category baseline, as described for ground-based observations, has little in common with satellite systems, as satellites either are in orbit and measuring everywhere or do not exist. There is no effective minimal set of measurements that a satellite takes: it is either operational or it is not. The closest analogy in the satellite domain to the non-satellite baseline network concept, therefore, is the provision of long-term (multi-decadal) measurements in some subset of the emission spectra that can be used to characterise change and variability in a range of ECVs on climate timescales. Many satellite data records start in the late 1970s. The measurements are mostly in the visible, infrared and microwave spectral ranges, but were built for the purpose of observing weather, not climate. A baseline concept would ensure their continuation into the future, to enable multi-decadal continuous monitoring.

As in the non-satellite domain, new measurements may provide enhanced monitoring in spectral domains with a long measurement heritage. For instance, hyper-spectral infrared measurements, as delivered by the IASI instrument aboard the Metop satellite, have high spectral resolution and are approximately an order of magnitude more accurate than historic infrared measurements. Such instruments can serve as a comparator for historic instruments, establishing an unbroken chain of inter-satellite calibrations that enable relative calibration to more modern, better characterised, measures, even if absolute calibration remains elusive. Baseline implies the need for sustained missions, which is best achieved for operational weather observations. Such measurements may also be achieved by more operational ocean and land surface oriented missions, such as the launched and planned Sentinel missions.
• Comprehensive capability for satellite missions needs to be interpreted very differently from in situ networks, as little of the characterisation provided for non-satellite measurements fits. However, it might be interpreted such that this class is established as a catch-all for all other Earth Observing missions not captured above. These additional missions expand the ability to measure more components of the Earth system, with higher accuracy over shorter periods, fostering process understanding. Or they contribute by proving measurement concepts for future missions. In many cases they eventually transition to a baseline capability.
Acknowledgements

This work has been based upon the substantial work undertaken by CORE-CLIMAX and a number of precursor studies assessing dataset maturity. Without these preceding efforts this work would not have been possible. Karin Kreher, Arnoud Apituley, Greg Bodeker, Barry Goodison, Mark Bourassa, Matthias Buschmann and Ge Peng provided feedback based upon early drafts that served to improve the Guidance.
References
Bates, J. J. and J. L. Privette (2012), A maturity model for assessing the completeness of climate data records, Eos Trans. AGU, 93(44), 441.

GCOS, 2014, Workshop on the review of the GCOS Surface Network (GSN), GCOS Upper-Air Network (GUAN), and related atmospheric networks, Ispra, Italy, April 2014, GCOS-182 [http://www.wmo.int/pages/prog/gcos/Publications/gcos-182.pdf]

GCOS, 2015, Status of the Global Observing System for Climate, GCOS-195 [http://www.wmo.int/pages/prog/gcos/Publications/GCOS-195_en.pdf]

JCGM, 2008, Evaluation of measurement data – Guide to the expression of uncertainty in measurement [www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf]

National Academy of Sciences, 2009, 'Observing weather and climate from the ground up: A nationwide network of networks.' http://dels.edu/Report/Observing-Weather-Climate-from/12540

Schulz, J., V. John, A. Kaiser-Weiss, C. Merchant, D. Tan, E. Swinnen and R. Roebeling, 2015: European Climate Data Record Capacity Assessment, in prep. for Geoscience Data Journal.

Seidel, D. J., F. H. Berger, et al. (2009). 'Reference upper-air observations for climate: Rationale, progress, and plans.' Bulletin of the American Meteorological Society, 90(3), 361-369.

WIGOS, 2015a, WMO Technical Regulations (WMO-No. 49) – Manual on WIGOS.

WIGOS, 2015b, WMO Technical Regulations (WMO-No. 49) – Manual on WIGOS, Attachment: WIGOS Metadata Standard.
Glossary
AERONET  Aerosol Robotic Network
BIPM  International Bureau of Weights and Measures
CDR  Climate Data Record
CF-compliant  Climate and Forecast convention compliant data
CIMO  Commission for Instruments and Methods of Observation
CLARREO  Climate Absolute Radiance and Refractivity Observatory
CORE-CLIMAX  Coordinating earth observation validation for RE-analysis for CLIMAte ServiceS
EARLINET  European Aerosol Research Lidar Network to establish an aerosol climatology
ECV  Essential Climate Variable
EO  Earth Observation
EUMETNET  EU Meteorological Network
GAIA-CLIM  Gap Analysis for Integrated Atmospheric ECV CLImate Monitoring
GCOS  Global Climate Observing System
GEOSS  Global Earth Observation System of Systems
GNSS-RO  Global Navigation Satellite System Radio Occultation
GRUAN  GCOS Reference Upper-Air Network
GUAN  GCOS Upper-Air Network
GUM  Guide to the expression of Uncertainty in Measurement
IASI  Infrared Atmospheric Sounding Interferometer
IPCC  Intergovernmental Panel on Climate Change
ISO  International Organization for Standardization
NAS  National Academy of Sciences
NDACC  Network for the Detection of Atmospheric Composition Change
NOAA  National Oceanic and Atmospheric Administration
NWP  Numerical Weather Prediction
SI  Système International of fundamental measurement units
SMM  System Maturity Matrix
TCCON  Total Carbon Column Observing Network
TRUTHS  Traceable Radiometry Underpinning Terrestrial- and Helio- Studies
WIGOS  WMO Integrated Global Observing System
WMO  World Meteorological Organisation
Appendix A  GAIA-CLIM measurement description

(General note: This measurement description shall not become longer than 5 pages per measurement system described. Please state only the most important facts, and use tables and bullet lists to provide information where appropriate.)

(Type the measurement system name and, if available, digital identifier here. The name must be unique and should include instrument type and location and/or network identifier):

Version | Author | Reviewers (if any) | Date

(Please use the above table to note version control on this record.)

1 Intent of the document
(Provide information on what measurement system is being described and for what application(s) it was created. Keep in mind that the information is targeted at users of any level who wish to use the measurements for scientific applications. Users may not be expected to be experts in in situ, remote sensing or reanalysis techniques.)

2 Point of contact
(Please provide a point of contact: organisation and contact details (at least a contact name, organisation and e-mail address).)

3 Measurements description
(Provide a link to an existing technical product specification, or provide the information in the form of a table in this document. The specification shall at least include measured variable names (identifying any that are Essential Climate Variables) and units (including uncertainty estimates indicators if provided), length of record, spatial coverage, and spatial and temporal sampling.)

4 Data origin
(Provide a basic description of the methodology used to derive the measurements, including a description of data processing methods, such as the processing used to convert from a digital count transmission from a radiosonde to a geophysical profile estimate.)
5 Validation of an uncertainty estimation
(Provide a summary of any validation activities performed for the measurement product, and provide a summary of uncertainty quantification of the product, including whether the measurement is metrologically traceable to SI units or accepted standards (tabulated form appreciated).)
6 Considerations for scientific applications
(Provide information on the applicability of the product for possible scientific applications, including limitations. This includes aspects such as the ability to measure the full diurnal cycle, geographical representativity, sampling frequency, etc.)

7 References
(Provide a complete list of references used in this document and, if applicable, provide additional reading references on measurement principles, retrievals, modelling, validation, uncertainty characterisation, product, and applications.)
Appendix B  Measurement maturity assessment spreadsheet

The Guidance is most easily completed using the associated Excel spreadsheet to record the maturity of candidate measurement systems. These spreadsheets are based upon the guidance outlined in Section 3. There is a spreadsheet for each major assessment strand. The spreadsheets are given below in the order that they arise.
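For assessors who prefer scripting to the spreadsheet, the same record-keeping can be mimicked in a few lines of code. The Python sketch below is a hypothetical illustration (not part of the Guidance; the strand names follow the matrix, and 'N/A' stands in for the greyed-out optional column):

```python
STRANDS = [
    "Metadata",
    "Documentation",
    "Uncertainty characterisation",
    "Public access, feedback, and update",
    "Usage",
    "Sustainability",
    "Software readiness",  # optional strand; may be excluded by the rules of the round
]

def summarise(scores, software_readiness_applicable=True):
    """Validate per-strand maturity scores (1-6) and build a summary
    row; the optional software readiness strand is recorded as 'N/A'
    when the rules of the round exclude it (cf. the greyed-out column)."""
    summary = {}
    for strand in STRANDS:
        if strand == "Software readiness" and not software_readiness_applicable:
            summary[strand] = "N/A"
            continue
        score = scores[strand]
        if not 1 <= score <= 6:
            raise ValueError(f"{strand}: score must be 1-6, got {score}")
        summary[strand] = score
    return summary

row = summarise({s: 3 for s in STRANDS[:-1]}, software_readiness_applicable=False)
print(row["Software readiness"])  # -> N/A
```

Keeping the validation in one place enforces the agreed 1-6 scale across all strands before any summary matrix is produced.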
GAIA-CLIM Measurement System Maturity Matrix
(Measurement system / program name here; maturity level as of mm/dd/yyyy)

Strands: Metadata | Documentation | Uncertainty characterisation | Public access, feedback, and update | Usage | Sustainability | Software readiness

Maturity 1:
- Metadata: No metadata available
- Documentation: Limited scientific description of the measurement methodology available
- Uncertainty characterisation: None
- Public access, feedback, and update: Restricted availability through request
- Usage: None
- Sustainability: None
- Software readiness: Conceptual development

Maturity 2:
- Metadata: Very basic metadata available
- Documentation: Comprehensive scientific description of the measurement methodology, report on limited validation, and limited measurement series user guide
- Uncertainty characterisation: Limited steps taken towards assuring traceability and comparability; limited information exists on systematic and random measurement uncertainties
- Public access, feedback, and update: Data available from originator
- Usage: Benefit for research applications identified; potential public and commercial opportunities identified
- Sustainability: Measurement program is sustainable in the short term
- Software readiness: Research grade code

Maturity 3:
- Metadata: Standards defined or identified; sufficient to use and understand the data and extract basic discovery metadata
- Documentation: Score 2 + paper on methodology published; validation report available from data collector or in grey literature; comprehensive user guidance is available
- Uncertainty characterisation: Score 2 + limited traceability and comparability assured; comprehensive documentation on measurement uncertainties present and methods for routine quality monitoring defined
- Public access, feedback, and update: Data and documentation publicly available from originator, feedback collated, irregular updates, initial versioning and local archival
- Usage: Benefits for research applications demonstrated; public and commercial use occurring and benefits emerging
- Sustainability: Measurement program is sustainable and has minimum level of necessary support to assure minimal quality standards are maintained
- Software readiness: Research code with partially applied standards; code contains header and comments, and a README file; PI affirms portability, numerical reproducibility and no security problems

Maturity 4:
- Metadata: Score 3 + standards systematically applied; meets international standards for the measurement metadata collection; enhanced discovery metadata; limited location-level metadata
- Documentation: Score 3 + comprehensive scientific description available from data provider; report on intercomparison available; paper on validation published; user guide available from data provider includes details of validation and characterisation
- Uncertainty characterisation: Score 3 + steps required to establish traceability are defined; (inter)comparison against corresponding measurements in organised campaigns available; quantitative estimates of uncertainty available and routine monitoring partially implemented
- Public access, feedback, and update: Data and documentation available through a recognised data portal; feedback mechanism considers published analyses; version control formalized; robust archival on multiple media
- Usage: Score 3 + research citations on product usage occurring; societal and economic benefits discussed
- Sustainability: Measurement program has medium-term sustainability and is not liable to a single point of failure
- Software readiness: Score 3 + draft software installation / user manual available; third party affirms portability and numerical reproducibility; passes data provider's security review

Maturity 5:
- Metadata: Score 4 + fully compliant with standards; complete discovery metadata; complete location-level metadata
- Documentation: Score 4 + comprehensive scientific description maintained by data provider; report on data assessment results exists; user guide is regularly updated with updates on product and validation; description of practical implementation is available from data provider
- Uncertainty characterisation: Score 4 + traceability partly established; measurements regularly compared to a measurement of similar or greater traceability; systematic uncertainties removed and uncertainty estimates are partially traceable; routine quality monitoring fully implemented
- Public access, feedback, and update: Source data, code and metadata archived and available upon request; established feedback mechanism; regular update cycle; fully established version control; data archival at recognized national or international long-term repository
- Usage: Score 4 + product becomes reference for certain research applications; societal and economic benefits are demonstrated
- Sustainability: Measurement program is long-term sustainable and robust to possible sources of failure
- Software readiness: Score 4 + operational code following standards, actions to achieve full compliance are defined; software installation / user manual complete; third party installs the code operationally

Maturity 6:
- Metadata: Score 5 + regularly updated and using extended metadata where defined
- Documentation: Score 5 + journal papers on product updates, and more comprehensive validation and validation of quantitative uncertainty estimates, are published; operations concept regularly updated
- Uncertainty characterisation: Score 5 + traceability established; measurements are regularly compared to other traceable measurements to verify; comprehensive validation of the quantitative uncertainty estimates, which are fully traceable; routine monitoring in place with results noted in metadata or documentation
- Public access, feedback, and update: Score 5 + no data access restrictions; active consideration of user feedback; data available in initial version for near-real-time applications; all versions retained, indexed and
availablethrougharecognisedrepository
Score5+productanditsapplicationsbecomereferencesinmultiple
researchfields;Influenceondecisionandpolicymakingdemonstrated
Measurementprogramissustainableandstrivingforconstantimprovement
Score5+fullycompliantwithstandards;TurnkeySystem
Maturity 1 & 2: Operational measurement capability; 3 & 4: Baseline measurement capability; 5 & 6: Reference measurement capability
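The mapping from maturity scores to capability classes can be sketched in code. This is a minimal, hypothetical illustration: the criterion names and the aggregation rule (taking the lowest score as the limiting factor) are assumptions for the sketch, not something the guidance prescribes.

```python
# Illustrative sketch of the score-to-capability mapping above.
# Assumption: a measurement system's capability class is limited by its
# weakest criterion (the guidance does not mandate an aggregation rule).

CAPABILITY_CLASSES = {
    (1, 2): "Operational measurement capability",
    (3, 4): "Baseline measurement capability",
    (5, 6): "Reference measurement capability",
}

def capability_class(scores: dict) -> str:
    """Map per-criterion maturity scores (1-6) to a capability class."""
    limiting = min(scores.values())  # weakest criterion limits the class
    for levels, label in CAPABILITY_CLASSES.items():
        if limiting in levels:
            return label
    raise ValueError(f"maturity score out of range: {limiting}")

# Hypothetical assessment: the lowest score (3) sets the class.
scores = {"metadata": 4, "documentation": 3, "uncertainty": 5, "usage": 3}
print(capability_class(scores))  # -> Baseline measurement capability
```

Under this assumed rule, raising the weakest-scoring criterion is the only way to move a system into a higher capability class.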
MEASUREMENT SYSTEM MATURITY EVALUATION GUIDELINES

Maturity | METADATA (summary) | Standards | Collection level metadata (including change records) | File level
1 | No metadata available | No standard considered | None | None
2 | Very basic metadata available | No standard considered | Limited | Limited
3 | Standards defined or identified; sufficient to use and understand the data and extract basic discovery metadata | Metadata standards identified and/or defined and partially, but not yet systematically, applied | Sufficient to use and understand the data independent of external assistance; sufficient for data provider to extract discovery metadata from metadata repositories | Sufficient to use and understand the data independent of external assistance
4 | Score 3 + standards systematically applied; meets international standards for the measurement metadata collection; enhanced discovery metadata; limited location level metadata | Score 3 + standards systematically applied at file level and collection level by data provider; meets international standards | Score 3 + enhanced discovery metadata | Score 3 + limited location (station, grid-point, etc.) level metadata along with unique measurement set metadata (e.g. batch, set-up, time, averaging period)
5 | Score 4 + fully compliant with standards; complete discovery metadata; complete location level metadata | Score 4 + metadata standard compliance systematically checked by the data provider | Score 4 + complete discovery metadata meets appropriate (at the time of assessment) international standards | Score 4 + complete location (station, grid-point, etc.) level and measurement specific metadata
6 | Score 5 + regularly updated and using extended metadata where defined | Score 5 + extended metadata that could be useful but is not considered mandatory is also retained | Score 5 + regularly updated |
Maturity | DOCUMENTATION (summary) | Formal description of measurement methodology | Formal Validation Report | Formal Measurement series User Guidance
1 | Limited scientific description of the measurement methodology available | Limited scientific description of methodology available from data collector or instrument manufacturer | None | None
2 | Comprehensive scientific description of the measurement methodology, report on limited validation, and limited measurement series user guide | Comprehensive scientific description available from data collector or instrument manufacturer | Informal validation work undertaken | Sufficient information on the measurements available to allow the user to ascertain the minimum set of information required for appropriate use
3 | Score 2 + paper on methodology published; validation report available from data collector or in grey literature; comprehensive user guidance is available | Score 2 + journal paper on measurement methodology published | Instrument has participated in a certified intercomparison campaign and results are available in grey literature | Comprehensive documentation on how the measurement is made available from data collector or instrument manufacturer, including a basic data characteristics description
4 | Score 3 + comprehensive scientific description available from data provider; report on intercomparison available; paper on validation published; user guide available from data provider includes details of validation and characterisation | Score 3 + comprehensive scientific description available from data provider | Report on intercomparison to other instruments, etc.; journal paper on product validation published | Score 3 + including documentation of manufacturer-independent characterisation and validation
5 | Score 4 + comprehensive scientific description maintained by data provider; report on data assessment results exists; user guide is regularly updated with updates on product and validation; description of practical implementation is available from data provider | Score 4 + comprehensive scientific description maintained by data provider | Score 4 + sustained validation undertaken via redundant periodic measurements | Score 4 + regularly updated by data provider with instrument / method-of-measurement updates and/or new validation results
6 | Score 5 + journal papers on product updates and on more comprehensive validation, including validation of quantitative uncertainty estimates, are published; operations concept regularly updated | Score 5 + journal papers on measurement system updates published | Score 5 + journal papers describing more comprehensive validation, e.g. error covariance and validation of quantitative uncertainty estimates, published | Score 5 + measurement description and examples of usage available in peer-reviewed literature
Maturity | UNCERTAINTY CHARACTERISATION (summary) | Traceability | Comparability | Uncertainty quantification | Routine Quality Monitoring
1 | None | None | None | None | None
2 | Limited steps taken towards assuring traceability and comparability; limited information exists on systematic and random measurement uncertainties | Comparison to independent stable measurement or local secondary standard undertaken irregularly | Validation using external comparator measurements done only periodically, and these comparator measurements lack traceability | Limited information on uncertainty arising from systematic and random effects in the measurement | None
3 | Score 2 + limited traceability and comparability assured; comprehensive documentation on measurement uncertainties present and methods for routine quality monitoring defined | Score 2 + independent measurement / local secondary standard is itself periodically calibrated against a recognized primary standard | Score 2 + validation is done sufficiently regularly to ascertain gross systematic drift effects | Comprehensive information on uncertainty arising from systematic and random effects in the measurement | Methods for routine quality monitoring defined
4 | Score 3 + steps required to establish traceability are defined; (inter)comparison against corresponding measurements in organised campaigns available; quantitative estimates of uncertainty available and routine monitoring partially implemented | Score 3 + processing steps in the chain of traceability are documented but not yet fully quantified | Score 3 + (inter)comparison against corresponding measurements in large-scale instrument intercomparison campaigns | Score 3 + quantitative estimates of uncertainty provided within the measurement products, characterising more or less uncertain data points | Score 3 + routine monitoring partially implemented
5 | Score 4 + traceability partly established; measurements regularly compared to a measurement of similar or greater traceability; systematic uncertainties removed and uncertainty estimates are partially traceable; routine quality monitoring fully implemented | Score 4 + traceability in the processing chain partly established | Score 4 + compared regularly to at least one measurement that has a traceability score >= 5 | Score 4 + systematic effects removed and uncertainty estimates are partially traceable | Score 4 + monitoring fully implemented (all production levels)
6 | Score 5 + traceability established; measurements are regularly compared to other traceable measurements for verification; comprehensive validation of the quantitative uncertainty estimates, which are fully traceable; routine monitoring in place with results noted in metadata or documentation | Score 5 + SI traceability in the processing chain fully established | Score 5 + compared periodically to additional measurements, including some with traceability assessment > 5 | Score 5 + comprehensive validation of the quantitative uncertainty estimates | Score 5 + routine monitoring in place with results fed back to other accessible information, e.g. metadata or documentation
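The uncertainty-quantification column above requires the systematic and random contributions to be separately quantified. A minimal sketch of how such contributions are conventionally combined, assuming the two contributions are independent standard uncertainties (the matrix itself does not prescribe a formula):

```python
import math

def combined_standard_uncertainty(u_random: float, u_systematic: float) -> float:
    """Combine independent random and systematic standard uncertainties
    in quadrature (root-sum-of-squares), as is conventional when the
    contributions are uncorrelated."""
    return math.sqrt(u_random**2 + u_systematic**2)

# Hypothetical values for illustration only.
print(combined_standard_uncertainty(0.3, 0.4))  # -> 0.5
```

Reporting the two contributions separately, as the maturity levels above require, lets users recombine them appropriately when correlations between effects are known.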
Maturity | PUBLIC ACCESS, FEEDBACK, AND UPDATE (summary) | Public Access/Archive | User Feedback Mechanism | Updates to Record | Version control | Long-term data preservation
1 | Restricted availability through request | Data may be available through request to trusted users | None | None | None | None
2 | Data available from originator | Data available for use through originator | Ad hoc feedback | None | None | None
3 | Data and documentation publicly available from originator; feedback collated; irregular updates; initial versioning and local archival | Data and documentation available through originator | Programmatic feedback collated | Irregularly, following accrual of a number of new measurements or new insights | Versioning by data collector | Local archive retained by measurement collector
4 | Data and documentation available through a recognised data portal; feedback mechanism considers published analyses; version control formalized; robust archival on multiple media | Score 3 + available through recognized data portal | Score 3 + consideration of published analyses | Regularly updated with new observations and utilising input from established feedback mechanism | Version control institutionalised and procedure documented | Each version archived at an institutional level on at least two media
5 | Source data, code and metadata archived and available upon request; established feedback mechanism; regular update cycle; fully established version control; data archival at recognized national or international long-term repository | Score 4 + source data, code and metadata available upon request | Established feedback mechanism, and international data quality assessment results are considered | Regularly updated operationally by a stable data provider as dictated by availability of new input data or new innovations | Fully established version control considering all aspects | Data, raw data and metadata are archived at a recognised data repository such as a National Meteorological Service, national archive or international repository
6 | Score 5 + no data access restrictions; active consideration of user feedback; data available in initial version for near-real-time applications; all versions retained, indexed and available through a recognised repository | Score 5 + no access restrictions apply | Score 5 + established feedback mechanism, and international data quality assessment results are considered in continuous data provisions | Score 5 + initial version of measurement series shared in near real time | Score 5 + all versions retained and accessible upon request | Score 5 + all versions of measurement series, metadata, software etc. retained, indexed and accessible upon request
Maturity | USAGE (summary) | Research | Public and commercial exploitation
1 | None | None | None
2 | Benefit for research applications identified; potential public and commercial opportunities identified | Benefits for research applications identified | Potential benefits identified
3 | Benefits for research applications demonstrated; public and commercial use occurring and benefits emerging | Benefits for research applications demonstrated by publication | Use occurring and benefits emerging
4 | Score 3 + research citations on product usage occurring; societal and economic benefits discussed | Score 3 + citations on product usage occurring | Score 3 + societal and economic benefits discussed; data being distributed via appropriate data portals
5 | Score 4 + product becomes reference for certain research applications; societal and economic benefits are demonstrated | Score 4 + product becomes reference for certain applications | Score 4 + societal and economic benefits demonstrated
6 | Score 5 + product and its applications become references in multiple research fields; influence on decision and policy making demonstrated | Score 5 + product and its applications become references in multiple research fields | Score 5 + influence on decision (including policy) making demonstrated
Maturity | SUSTAINABILITY (summary) | Siting environment | Scientific/expert support | Programmatic support
1 | None | None | None | None
2 | Measurement program is sustainable in the short term | Site environment is stable in the short term | Minimal scientific support required to sustain the program is available | Project-based funding support available
3 | Measurement program is sustainable and has the minimum level of necessary support to assure minimal quality standards are maintained | Score 2 + site ownership is sustainable | Relevant instrument expertise is available to support the measurements | Score 2 + with expectation of follow-on funding
4 | Measurement program has medium-term sustainability and is not liable to a single point of failure | Score 3 + site is representative of a broader region around the immediate location | Score 3 + at least two experts available to support the measurement program operation | Score 3 + not dependent upon a single investigator or funding line
5 | Measurement program is long-term sustainable and robust to possible sources of failure | Score 4 + site ownership and immediate environment are likely to be unchanged for decades | Active instrumentation research and development being undertaken | Sustained infrastructure support available to finance continued operations for as far as can be envisaged given national and international funding vagaries
6 | Measurement program is sustainable and striving for constant improvement | Score 5 + long-term ownership and rights are guaranteed | Score 5 + support for active research and development of instrumentation or applied analysis of the observations |
Note that this set of criteria is optional and should only be applied to relevant measurement systems that make substantive use of software to take and/or process the measurement series.
Maturity | SOFTWARE READINESS (summary) | Coding standards | Software Documentation | Portability and Numerical Reproducibility | Security
1 | Conceptual development | No coding standard or guidance identified or defined | No documentation | Not evaluated | Not evaluated
2 | Research grade code | Coding standard or guidance is identified or defined, but not applied | Minimal documentation | Reproducible under identical conditions | Data provider affirms no security problems
3 | Research code with partially applied standards; code contains header and comments, and a README file; PI affirms portability, numerical reproducibility and no security problems | Score 2 + standards are partially applied and some compliance results are available | Header and process description (comments) in the code | Reproducible and portable | Submitted for data provider's security review
4 | Score 3 + draft software installation/user manual available; 3rd party affirms portability and numerical reproducibility; passes data provider's security review | Score 3 + compliance is systematically checked in all code, but not yet compliant with the standards | Score 3 + a draft software installation / user manual available | 3rd party affirms reproducibility and portability | Passes data provider's security review
5 | Score 4 + operational code following standards; actions to achieve full compliance are defined; software installation/user manual complete; 3rd party installs the code operationally | Score 4 + measurement provider has identified departures from the standards and actions are planned to achieve full compliance | Score 4 + enhanced process descriptions throughout; installation / user manual complete | Score 4 + 3rd party can install the code operationally | Continues to pass the data provider's review
6 | Score 5 + fully compliant with standards; turnkey system | Code is fully compliant with standards | Score 5 + code and documentation are publicly available from a webpage | Score 5 + turnkey system |