Discussion on an HDF-GEO concept
HDF Workshop X, 30 November 2006
Abstract: At the past several HDF & HDF-EOS Workshops, there has been informal discussion of building on the success of HDF-EOS to design a new profile, tentatively called HDF-GEO.
This profile would incorporate lessons learned from Earth science, Earth applications, and Earth model data systems.
It would encompass all types of data, data descriptions, and other metadata. It might support 1-, 2-, and 3-D spatial data as well as time series, and it would include raw, calibrated, and analyzed data sets.
It would support data exchange by building its needed complexity on top of minimal specialized features, and by providing clear mechanisms and requirements for all types of appropriate metadata.
The organizers propose to host a discussion among the workshop participants on the need, scope, and direction for HDF-GEO.
Which buzzwords would fit?
[Word-cloud slide centered on HDF-GEO: profile, best practices, data model, metadata content, atomic & compound types, self-documentation, platform support, markup & schema, tools & test suites, naming rules, data levels, georef, …ilities, …]
Questions for discussion by Earth science practitioners - Bottom-up analysis:
What are the successful features of existing community data formats and conventions? (HDF5, HDF-EOS, netCDF, CDF, GRIB, BUFR, COARDS, CF-1, NITF, FITS, FGDC RS extensions, ISO, GeoTIFF, ...)
Progress being made -- John Caron's Common Data Model
Specific needs for geo- and time-referenced data conventions
Specific needs to support observed (raw), calibrated, and analyzed data sets
Questions for discussion by Earth science practitioners - Top-down analysis: (1 of 2)
What is a profile? Consider specifics of how a standard or group of standards is implemented for a related set of uses and applications.
How does a profile relate to a format or other elements of a standard?
What constitutes overkill? How much profile would be beneficial, and how much would be difficult to implement and of limited utility?
Questions for discussion by Earth science practitioners - Top-down analysis: (2 of 2)
Why is it useful?
– Establishes specific meanings for complicated terms or relationships
– Establishes common preferred terms for attributes which can be described multiple ways
– Establishes practices which are consistent with portability across operating systems, hardware, or archives
– Establishes common expectations and obligations for data stewardship
– Clarifies community (and sponsor) long-term expectations, beyond short-term necessity
– other ...
Discussion
Wrap-up
Send your list of provisional HDF-GEO requirements, goals to be achieved, and how HDF-GEO would help to me at [email protected]
Backup
Motivation: In many instances, application-specific 'profiles', 'conventions', or best practices have shown their utility for users. In particular, profiles have encouraged data exchange within communities of interest. HDF provides minimal guidance for applications. HDF-EOS was a mission-specific profile that resulted in successes and lessons learned. HDF5 for NPOESS is another approach. Is it time for another attempt, benefiting from all the lessons and targeted at a broader audience?
© 2005 The MITRE Corporation. All rights reserved
NOTICE
This technical data was produced for the U.S. Government under Contract No. 50-SPNA-9-00010, and is subject to the Rights in Technical Data - General clause at FAR 52.227-14 (JUN 1987)
Approved for public release, distribution unlimited
HDF Lessons from NPOESS & Future Opportunities (excerpt)
Alan M. Goldberg <[email protected]>
HDF Workshop IX, December 2005
Requirements for data products
Deal with complexity
– Large data granules, on the order of gigabytes
– Intrinsic data complexity: advanced sensors produce new challenges
– Multi-platform, multi-sensor, long-duration data production
– Many data processing levels and product types
Satisfy operational, archival, and field terminal users
– Multiple users with heritage traditions
NPOESS products delivered at multiple levels
[Block diagram: environmental-source data passes through sensor components (detection, A/D conversion, flux manipulation, filtration, auxiliary sensor data, calibration source, data store), then packetization, compression, CCSDS processing (mux, code, frame) & encryption, and the comm transmitter in the space segment; sensors and other subsystems feed the same chain. On the ground, the comm receiver and comm processing (C3S) deliver raw data to the IDPS, where RDR, SDR, and EDR production yields products at the RDR, TDR, SDR, and EDR levels.]
Sensor product types
– Swath-oriented multispectral imagery: VIIRS (cross-track whiskbroom), CMIS (conical scan), Imagery EDRs (resampled on a uniform grid)
– Slit spectra: OMPS SDRs (cross-track spectra, limb spectra)
– Image-array Fourier spectra: CrIS SDR
– Directional spectra: SESS energetic particle sensor SDR
– Point lists: active fires
– 3-D swath-oriented grid: vertical profile EDRs
– 2-D map grid: seasonal land products
– Abstract byte structures: RDRs
– Abstract bit structures: encapsulated ancillary data
– Bit planes: quality flags
– Associated arrays (with stride?): geolocation
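A point-list product like active fires maps naturally onto an HDF5 compound datatype. A minimal sketch in Python with h5py; the field names, units, and dataset name are illustrative assumptions, not from any NPOESS specification:

```python
import numpy as np
import h5py

# Hypothetical "active fires" point list: one compound record per detection.
# Field names and units are illustrative, not from any NPOESS specification.
fire_dtype = np.dtype([
    ("lat", "f4"),         # degrees north
    ("lon", "f4"),         # degrees east
    ("power", "f4"),       # fire radiative power (illustrative units)
    ("confidence", "u1"),  # 0-100 percent
])

fires = np.array(
    [(34.1, -118.2, 55.0, 92), (36.7, -119.8, 12.5, 60)],
    dtype=fire_dtype,
)

# In-memory HDF5 file (core driver, nothing written to disk)
with h5py.File("fires.h5", "w", driver="core", backing_store=False) as f:
    dset = f.create_dataset("ActiveFires", data=fires)
    dset.attrs["long_name"] = "Detected fire point list"
    print(dset.shape, dset.dtype.names)
```

A compound record keeps each detection's fields together on disk; the alternative, one dataset per field, is the structs-of-arrays layout raised as a question at the end of the deck.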
NPOESS product design development
Requirements
– Multi-platform, multi-sensor, long-duration data production
– Many data processing levels and product types
– Satisfy operational, archival, and field terminal users
Constraints
– Processing architecture and optimization
– Heritage designs
– Contractor style and practices
– Budget and schedule
Intentions
– Use simple, robust standards
– Use best practices and experience from previous operational and EOS missions
– Provide robust metadata
– Maximize commonality among products
– Forward-looking, not backward-looking standardization
Resources
– HDF5
– FGDC
– C&F conventions
– Expectation of tools by others
Result: a design process driven by experience, trades & analyses
Lessons & Way Forward
Observations from development to date
– Avoid the temptation to use heritage approaches without reconsideration, but novel concepts need to be tested
– Data concepts, profiles, templates, or best practices should be defined before coding begins
– Use broad, basic standards to the greatest possible extent; FGDC has flexible definitions, if carefully thought through
– Define terms in context; clarity and precision as appropriate
– Past attempts to predefine data organizations (e.g., HDF-EOS 'swath' or HDF4 'palette') have offered limited flexibility; keep to simple standards which can be built upon and described well
– Lesson: be humble
It is a great service to future programs if we capture lessons and evolve the standards.
How do we get true estimates of the life-cycle savings for good design?
Thoughts on future features for Earth remote sensing products
Need to more fully integrate product components with HDF features
– Formalize the organization of metadata items which establish the data structure: need a mechanism to associate arrays by their independent variables
– Formalize the organization of metadata items which establish the data meaning: XML is a potential mechanism (can it be well integrated?); work is needed to understand the advantages and disadvantages; Climate and Forecast (CF) sets a benchmark
– Need a mechanism to encapsulate files in native format, for the case in which HDF is only used to provide consistent access
– Need more investment in testing before committing to a design
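The encapsulation idea can be sketched by storing a native-format file's bytes verbatim as a uint8 dataset, with attributes recording provenance. A minimal illustration in h5py; the format label, dataset path, and filenames are hypothetical:

```python
import numpy as np
import h5py

# Hypothetical encapsulation: store a native-format file's bytes untouched,
# so HDF5 only provides consistent access plus provenance metadata.
native_bytes = b"GRIB...payload of some heritage-format granule..."  # placeholder

with h5py.File("container.h5", "w", driver="core", backing_store=False) as f:
    dset = f.create_dataset(
        "Encapsulated/ancillary_granule",
        data=np.frombuffer(native_bytes, dtype=np.uint8),
    )
    # Enough metadata to interpret the bytes later (labels are assumptions)
    dset.attrs["native_format"] = "GRIB"
    dset.attrs["original_filename"] = "granule.grb"
    # Round trip: the byte stream comes back unchanged
    recovered = dset[...].tobytes()

assert recovered == native_bytes
```

Because the payload is opaque to HDF5, a reader needs the provenance attributes to know which native-format library should decode the bytes.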
Primary and Associated Arrays
n-dimensional dependent variable (entity) arrays, organized by Index Attributes:
– Primary array, e.g., flux, brightness, counts, NDVI
– Associated array(s), e.g., QC, error bars (dimension n)
1-dimensional attribute variables (Index Attributes):
– Primary, e.g., UTC time or angle
– Additional, e.g., IET time, angle, or pressure height
Associated independent variable(s): multi-dimensional attribute variables
– 2-dimensional independent variable array(s), e.g., lat/lon, XYZ, sun alt/az, sat alt/az, or land mask
Key concept: Index Attributes organize the primary dependent variables, or entities. The same Index Attributes may be used to organize associated independent variables. Associated independent variables may be used singly (almost always), in pairs (frequently), or in larger combinations.
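The Index Attribute concept resembles HDF5 dimension scales, which attach 1-D independent variables to the dimensions of primary and associated arrays. A sketch of that mapping in h5py; all names and shapes are hypothetical, and 2-D independent variables such as lat/lon cannot themselves be scales, so they are simply stored alongside with matching shape:

```python
import numpy as np
import h5py

rng = np.random.default_rng(0)
nscan, npix = 4, 6  # tiny illustrative swath

with h5py.File("swath.h5", "w", driver="core", backing_store=False) as f:
    # Primary (dependent) array, e.g. brightness temperature
    primary = f.create_dataset("BrightnessTemp", data=rng.random((nscan, npix)))
    # Associated array organized by the same index attributes, e.g. quality flags
    quality = f.create_dataset("QualityFlag", data=np.zeros((nscan, npix), "u1"))
    # 1-D independent variable ("index attribute") made into a dimension scale
    time = f.create_dataset("ScanTime", data=np.arange(nscan, dtype="f8"))
    time.make_scale("scan_time")
    # 2-D independent variables used as a pair; not scales, just shape-matched
    lat = f.create_dataset("Latitude", data=rng.random((nscan, npix)))
    lon = f.create_dataset("Longitude", data=rng.random((nscan, npix)))
    # Attach the 1-D scale to dimension 0 of both dependent arrays
    for dset in (primary, quality):
        dset.dims[0].attach_scale(time)
        dset.dims[0].label = "scan"
        dset.dims[1].label = "pixel"
    dim_label = primary.dims[0].label  # "scan"
    n_scales = len(primary.dims[0])    # one scale attached
```

Attaching the same scale to both the primary and the associated array records the shared organization explicitly, rather than leaving it implicit in matching shapes.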
Issues going forward - style
Issues with assuring access and understanding
– How will applications know which metadata is present?
– Need to define a core set with a default approach
Issues with users
– How to make providers and users comfortable with this or any standard
– How to communicate the value of best practices, careful & flexible design, consistency, and the beauty of simplicity
– Ease of use as well as ease of creation
Issues with policy
– Helping to meet the letter and intent of the Information Quality Act
– Capturing data product design best practices: flexibility vs. consistency vs. ease-of-use for a purpose
Issues going forward - features
Issues with tools
– Tools are needed to create, validate, and exploit the data sets
– Tools must understand structure and semantics
Issues with collections
– How to implement file and collection metadata, with appropriate pointers forward and backward
– How to implement quasi-static collection metadata
Issues with HDF
– Processing efficiency (I/O) of compression and of compaction
– Repeated (fixed, not predetermined) metadata items with the same <tag> not handled
– Archival format
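On the repeated-metadata limitation: HDF5 forbids two attributes with the same name on one object. A common workaround (an assumption here, not a recommendation from the deck) is a single array-valued attribute holding all the repeats; the group and attribute names below are hypothetical:

```python
import h5py

with h5py.File("meta.h5", "w", driver="core", backing_store=False) as f:
    grp = f.create_group("Granule")
    # HDF5 does not allow two attributes named "InputPointer" on one object,
    # so the repeats become one array-valued variable-length-string attribute.
    pointers = ["inputA.h5", "inputB.h5", "inputC.h5"]  # hypothetical values
    grp.attrs.create("InputPointer", pointers, dtype=h5py.string_dtype())
    # h5py versions differ on whether strings read back as str or bytes
    stored = [s.decode() if isinstance(s, bytes) else s
              for s in grp.attrs["InputPointer"]]
```

This preserves the order of the repeated items, at the cost of diverging from the one-attribute-per-item layout a heritage metadata model might expect.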
Possible routes: Should there be an HDF-GEO? Specify a profile for the use of HDF in Earth science applications:
– Generalized point (list), swath (sensor coordinates), grid (georeferenced), abstract (raw), and encapsulated (native) profiles
– Generalized approach to associating georeferencing information with observed information
– Generalized approach to incorporating associated variables with the mission data
– Generalized approach to 'stride'
– Preferred core metadata to assure human and machine readability
– Identification metadata in the UserBlock
– Map appropriate metadata items from HDF native features (e.g., array rank and axis sizes)
– Preferred approach to data object associations: arrays-of-structs or structs-of-arrays?
– Design guidelines or strict standardization?
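The arrays-of-structs versus structs-of-arrays question can be made concrete with NumPy alone. A small sketch with hypothetical fields:

```python
import numpy as np

n = 5
# Array-of-structs: one compound record per observation (interleaved on disk)
aos = np.zeros(n, dtype=[("radiance", "f4"), ("qc", "u1")])
aos["radiance"] = np.linspace(0.0, 1.0, n)

# Struct-of-arrays: one contiguous array per field
radiance = np.linspace(0.0, 1.0, n, dtype="f4")
qc = np.zeros(n, dtype="u1")

# Same information, different access patterns: reading one field from the
# AoS layout touches interleaved records; SoA reads are contiguous.
assert np.allclose(aos["radiance"], radiance)
assert aos.dtype.itemsize == 5  # 4-byte float + 1-byte flag, packed
```

Fields that are always read together favor the compound (AoS) layout; fields read independently, or compressed separately, favor parallel datasets (SoA), which is one way the profile question could be framed.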