This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 687289
Coastal Waters Research Synergy Framework
Validation Plan
Research Applications
Document Code: CORESYF-ACR-VVR-VVP03-E-R
Date of delivery: CDR-1 meeting (T0+10) Deliverable identifier: D4.03 Version of document: 1.1 – last updated 16/12/2016 Dissemination level for document: PU
Table of Signatures
Name Function Signature
Prepared by
Rory Scarrott Eirini Politi
Francisco Sancho Alberto Azevedo
Ana Moura Antoine Mangin Paolo Cipollini
UCC UCC LNEC LNEC
IH ACRI-HE
NOC
Reviewed by
Chloé Vincent
ACRI-HE, Application Developer
Approved by Miguel Terra-Homem Executive Board Chair
Signatures and approvals appear on original
Project start date: 01/01/2016
Project duration: 36 months
Validation Plan | Research Applications
PUBLIC Page 2 of 33
Revision Records
Version Date Changes Authors
1.0 18/11/2016 First issue of document before CDR
Chloé Vincent, ACRI-HE Rory Scarrott, UCC Eirini Politi, UCC Francisco Sancho, LNEC Alberto Azevedo, LNEC Ana Moura, IH Antoine Mangin, ACRI-HE Paolo Cipollini, NOC
1.1 16/12/2016
Complements on maturity of the applications and inputs homogenisation after CDR review.
Chloé Vincent, ACRI-HE Rory Scarrott, UCC Eirini Politi, UCC Francisco Sancho, LNEC Alberto Azevedo, LNEC Ana Moura, IH Antoine Mangin, ACRI-HE Paolo Cipollini, NOC
Table of Contents
1 Introduction ............................................................................................................................. 6
1.1 Purpose and Scope .......................................................................................................... 6
1.2 Document Structure ........................................................................................................ 6
2 RESEARCH APPLICATIONS VALIDATION PLAN ......................................................................... 7
2.1 Research Application Validation Process Overview ........................................................ 7
2.1.1 Organization ............................................................................................................ 7
2.1.2 Master Schedule ...................................................................................................... 7
2.1.3 Techniques and Methods ........................................................................................ 7
2.1.4 Standards, Practices and Conventions .................................................................... 8
2.1.5 Resource Summary .................................................................................................. 8
2.2 Research Application Validation Test Reporting ............................................................. 8
3 ACCEPTANCE TESTS ................................................................................................................. 9
3.1 Acceptance Test Designs ................................................................................................. 9
3.2 Task Iteration Policy ......................................................................................................... 9
3.3 Bathymetry determination from SAR imagery .............................................................. 10
3.3.1 Overview ................................................................................................................ 10
3.3.2 Acceptance Test Specification ............................................................................... 10
3.4 Bathymetry, benthic classification and water quality determination from optical
imagery ...................................................................................................................................... 13
3.4.1 Overview ................................................................................................................ 13
3.4.2 Acceptance Test Specification ............................................................................... 13
3.5 Vessel detection & Oil Spill Detection with SAR & Optical ............................................ 18
3.5.1 Overview ................................................................................................................ 18
3.5.2 Acceptance Test Specification ............................................................................... 18
3.6 Time series processing for hyper temporal optical data analysis ................................. 24
3.6.1 Overview ................................................................................................................ 24
3.6.2 Acceptance Test Specification ............................................................................... 25
3.7 Ocean and Coastal altimetry ......................................................................................... 27
3.7.1 Overview ................................................................................................................ 27
3.7.2 Acceptance Test Specification ............................................................................... 27
3.8 Acceptance Test Procedures ......................................................................................... 30
4 References ............................................................................................................................. 31
5 Annex I: Template for the Test Procedures ........................................................................... 32
List of Tables
Table 3-1: V1 Acceptance Tests Specifications for Bathy SAR ....................................................... 10
Table 3-2: V1 Acceptance Tests Specifications for Bathy Optical .................................................. 13
Table 3-3: V1 Acceptance Tests Specifications for Vessel Detection ............................................ 19
Table 3-4: V1 Acceptance Tests Specifications for Oil Spill Detection .......................................... 20
Table 3-5: V1 Acceptance Tests Specifications for hypertemporal TS .......................................... 25
Table 3-6 : V1 Acceptance Tests Specifications for Altimetry ....................................................... 28
Acronyms and Abbreviations AOI Area Of Interest
BBP Backscattering coefficient of particulates
BOA Bottom of Atmosphere
CDOM Coloured Dissolved Organic Matters
CDR Critical Design Review
Co-ReSyF Coastal Waters Research Synergy Framework
DEM Digital Elevation Model
ECMWF European Centre for Medium-Range Weather Forecasts
EMSA European Maritime Safety Agency
EO Earth Observation
FFT Fast Fourier Transform
GUI Graphical User Interface
IH Instituto Hidrografico, Portugal
LNEC Laboratório Nacional de Engenharia Civil, Portugal
NCEP National Centers for Environmental Prediction
NDWI Normalised Difference Water Index
NOAA National Oceanic and Atmospheric Administration, USA
NOC National Oceanography Centre, UK
OS Operating System
PC Personal Computer
PDR Preliminary Design Review
QGIS Geographic Information Systems Software
RMSE Root Mean Square Error
SAR Synthetic aperture radar
SAR-1, SAR-2
System Acceptance Reviews
SGDR Sensor Geophysical Data Records
SHOM Service Hydrographique et Océanographique de la Marine, France
SNAP SentiNel Application Platform
SSHA Sea Surface Height Anomaly
SST Sea Surface Temperature
SWH Significant Wave Height
TS Time Series
UCC University College Cork, Ireland
VTP Validation Test Procedures
VTS Validation Test Specifications
WW3 Wave Watch 3 (wave model)
1 Introduction
The Co-ReSyF project is implementing a dedicated data access and processing infrastructure, with automated tools, methods and standards to support research applications using Earth Observation (EO) data for the monitoring of Coastal Waters, leveraging the components deployed in SenSyF (www.sensyf.eu). The main objective is to facilitate access to Earth Observation data and pre-processing tools for the research community, towards the future provision of Coastal Waters services based on EO data.
Through Co-ReSyF's collaborative front end, even researchers inexperienced in EO will be able to load their applications onto the system and to compose and configure processing chains for easy deployment on the cloud infrastructure. They will be able to accelerate the development of
high-performing applications taking full advantage of the scalability of resources available in the
cloud framework. The system’s facilities and tools, optimized for distributed processing, include
EO data access catalogues, discovery and retrieval tools, as well as a number of pre-processing
tools and toolboxes for manipulating EO data. Advanced users will also be able to go further and
take full control of the processing chains and algorithms by having access to the cloud back-end,
and to further optimize their applications for fast deployment for big data access and
processing.
The Co-ReSyF capabilities will be supported and initially demonstrated by a series of early
adopters who will develop new research applications on the coastal domain, guide the
definition of requirements and serve as system beta testers. A competitive call will be issued
within the project to further demonstrate and promote the usage of the Co-ReSyF release.
These pioneering researchers will be given access not only to the platform itself, but also to
extensive training material on the system and on Coastal Waters research themes, as well as to
the project's events, including the Summer School and Final Workshop.
1.1 Purpose and Scope
This document defines the validation strategy for the first version of the Research Applications implemented, explaining how the applications will be tested by their developers, and the acceptance criteria to be fulfilled before an application is considered of good quality.
This document focuses solely on V1 of the Research Applications part of the Co-ReSyF platform; the acceptance criteria can therefore be considered as evaluation criteria for this first version of the applications and their validation.
1.2 Document Structure
The structure of the document is as follows:
Chapter 2 describes the Validation approach;
Chapter 3 describes the Acceptance tests to be executed;
Chapter 4 details the Reference Documents.
2 RESEARCH APPLICATIONS VALIDATION PLAN
2.1 Research Application Validation Process Overview
2.1.1 Organization
The project will follow a simplified validation process, based on carrying out only the acceptance
testing activities defined in (Deimos, 2016a).
The validation activities will be carried out by the Application Development leaders with
assistance of the Deimos development team. The Application Development leaders are
responsible for defining, preparing and executing the acceptance tests to be carried out during
the Acceptance campaign. The results of the Acceptance Campaign will be reported to the
Executive Board in the Validation Report document.
2.1.2 Master Schedule
The Validation activities will be carried out according to the schedule below, where the
milestone reviews in (Co-ReSyF, 2016c) are used as reference dates:
1. After PDR-1 (August 2016), the first version of the Acceptance Tests specification is
prepared with the requirements baseline for V1;
2. At the CDR-1 (November 2016), the Acceptance Tests specification is reviewed;
3. During the implementation period of all V1 applications, the test procedures for the Acceptance Tests are defined;
4. One month before SAR-1 (June 2017), the Acceptance Tests are executed and the results
are recorded in the Validation Report;
5. At the SAR-1 (July 2017), the Validation Report is reviewed;
6. After PDR-2 (Jan 2018), the Acceptance Tests specification is updated to include the updates
to the requirements baseline for V2;
7. During the implementation period of all V2 applications, the test procedures for the new Acceptance Tests are defined;
8. One month before SAR-2 (Sept. 2018), the new Acceptance Tests are executed and the
results are recorded in the Validation Report;
9. At the SAR-2 (Oct 2018), the Validation Report is reviewed.
2.1.3 Techniques and Methods
The method used for the Validation activities is to execute a set of Acceptance tests on the research applications, in order to verify that the modules and steps defined in (Co-ReSyF, 2016b) are implemented for the V1 applications. Only one testing method is foreseen: executing a set of procedural steps on the running applications to verify their functionality. The validation criteria are defined in the acceptance test specifications of Chapter 3, and the resulting information will be recorded in the Validation Report document.
2.1.4 Standards, Practices and Conventions
The validation activities follow the standard defined in the Deimos Corporate Management System (Deimos, 2016a), tailored for the simplified approach of this project, which focuses only on the Acceptance activities.
2.1.5 Resource Summary
The validation activities will require the execution of the defined acceptance test procedures.
These procedures will be executed by the Application Development Leaders and will use a
standard PC (or Laptop) to connect to the online platform. Additional open source tools may be
used in the local PC/Laptop for analysis of the results, if needed, although an effort will be made
when defining the test procedures to use only the functionalities provided by the online
platform.
2.2 Research Application Validation Test Reporting
The results of the Acceptance tests carried out for the validation activities will be reported in the Validation Report document. For each acceptance test procedure, the result of each step shall be recorded, as well as the date on which the test procedure was executed.
3 ACCEPTANCE TESTS
3.1 Acceptance Test Designs
The Acceptance tests are designed to test a specific module of an application. Whether or not to group modules/processing steps into one test case is decided on a case-by-case basis; best practice is to ensure that the Pass/Fail Criteria can be written in a simple way. If the Pass/Fail Criteria become complex to describe, this is an indicator that the test case covers too many processing steps. When defining the test procedures, several test cases can be further grouped into one test procedure if they share common steps, in order to reduce the execution time of the tests. Clear candidates for grouping are test cases that have the same scenario and inputs.
A test case is specified using the following fields:
- A unique identifier for each Validation Test Specification (VTS).
- A brief description of the module/result that the test aims to validate.
- A brief description of the overall scenario for the test (including the state that the application/processing has to be in before the test begins), i.e. how the module will be tested.
- The inputs to the test, described in more detail than in the test scenario. Note that the table does not give the complete, detailed description of all the inputs, because that is done in the test procedure.
- The test pass criteria, e.g. in terms of the expected values of certain outputs.
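For machine-readable tracking, the test-case fields listed above could be captured in a simple record. This is a hypothetical sketch with illustrative names, not a structure mandated by the project:

```python
from dataclasses import dataclass, field

@dataclass
class TestCaseSpec:
    """One Validation Test Specification (VTS) entry, mirroring the
    fields listed above. Field names are illustrative only."""
    vts_id: str            # unique identifier, e.g. "APPLI1-VTS-01"
    module: str            # module/result the test aims to validate
    scenario: str          # overall scenario, incl. required initial state
    inputs: list = field(default_factory=list)         # test inputs (summary)
    pass_criteria: list = field(default_factory=list)  # expected outcomes

# Example record, paraphrasing the first Bathy SAR test case:
spec = TestCaseSpec(
    vts_id="APPLI1-VTS-01",
    module="All - output format",
    scenario="Run all processing steps and inspect the outputs",
    inputs=["Sentinel 1 imagery (SLC and GRD) from the Co-ReSyF catalogue"],
    pass_criteria=["Outputs are georeferenced GeoTIFF maps loadable in QGIS/SNAP"],
)
```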
3.2 Task Iteration Policy
An acceptance test procedure should be repeated whenever a failed step in the procedure prevents the validation of a requirement and, as a result, a modification of the application is needed. In those cases, the full test procedure shall be repeated, in order to ensure that all the modules in that same test procedure are validated after the change in the application.
3.3 Bathymetry determination from SAR imagery
3.3.1 Overview
The SAR-imagery bathymetry application provides the coastal bathymetry of a given coastal area sensed remotely by a Synthetic Aperture Radar (SAR) satellite sensor, at a given time of acquisition. The application is based on the detection of the ocean swell wavelength from the SAR image intensity, which is then related to the local water depth. The output of the algorithm consists of a Digital Elevation Model (DEM) for the target area.
The method put in place is semi-automatic (at least for version 1 of this application): successive steps have to be applied, each of them requiring a user confirmation. If the user does not confirm the validity of one step, then the process is stopped.
The SAR bathymetry application development is rather well advanced, so the validation plan can cover functionality as well as a first level of scientific evaluation. This validation plan will be adapted and further detailed later in the project, when version 2 of the algorithm is ready.
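The relation between the observed swell wavelength and the local water depth can be made concrete through the linear dispersion relation for surface gravity waves, which such depth-inversion methods commonly rely on. The sketch below inverts it in closed form; it is an illustrative sketch of the standard relation, not necessarily the project's exact implementation:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_wavelength(local_wavelength, swell_period):
    """Invert the linear dispersion relation omega^2 = g*k*tanh(k*h)
    for the water depth h (m), given the local swell wavelength (m)
    measured in an image sub-region and the swell period (s), which is
    conserved as the swell shoals."""
    k = 2.0 * np.pi / local_wavelength   # local wavenumber (rad/m)
    omega = 2.0 * np.pi / swell_period   # angular frequency (rad/s)
    ratio = omega ** 2 / (G * k)         # equals tanh(k*h)
    if ratio >= 1.0:
        # Wavelength at (or beyond) its deep-water value: the depth is
        # too large to be resolved from the swell signature.
        return None
    return float(np.arctanh(ratio) / k)
```

For a 10 s swell, the deep-water wavelength is about 156 m; a shorter local wavelength indicates shoaling, and the inversion returns the corresponding depth.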
3.3.2 Acceptance Test Specification
The following table details the acceptance tests to be performed to assess the validity of the Bathymetry determination from SAR imagery application.
These will be numbered APPLI1-VTS-xx.
Table 3-1: V1 Acceptance Tests Specifications for Bathy SAR
VTS No. Module/Result Tested
Test Scenario Required Inputs Pass/Fail Criteria
APPLI1-VTS-01
All Output format Sentinel 1 imagery (SLC and GRD products) from Co-ReSyF catalogue.
1. The outputs of all processing steps shall be georeferenced maps, loadable in QGIS or SNAP software (GeoTIFF format)
APPLI1-VTS-02
Wind velocity check External wind velocity check for the centre point of each SAR image.
AOI and time period. Wind data from ECMWF or NCEP re-analysis.
1. Wind velocity should be between 3 and 10 m/s for an image to be a good candidate for the application of the proposed methodology and for the observation of swell features in the SAR images.
APPLI1-VTS-03
Waves/swell check External wave data for each SAR image
AOI and time period. Data from buoys or WW3 model, from NOAA or other local institution.
1. Existence of a well-defined ocean swell with one predominant direction and wavelengths adequate for the application of the method.
APPLI1-VTS-04
Image selection protocol check
SAR image visual check AOI and time period. Sentinel 1 images from Co-ReSyF catalogue.
1. The SAR image is visualised to verify the existence of wind shadow areas in the bathymetry AOI.
2. The presence of swell is verified visually.
3. The presence of strong wave diffraction or wave refraction zones near the shore should be verified, as those may influence the applicability of the algorithm. For example, regions where the patterns show the presence of crossed waves are inadequate.
4. Other features are checked, such as the presence of different signatures like oil spills, vessels, internal waves, river plumes or other coastal processes that could affect the algorithm application.
APPLI1-VTS-05
Radiometric calibration, speckle filtering, sigma0 computation, cropping and creation of the Bathymetry AOI – visual check
The application produces a GEOTIFF file of the corrected SAR image.
In terms of SAR imagery, Sentinel 1 images are needed from the Co-ReSyF catalogue. The user must define a polygon of the region for the application of the depth inversion methodology and the final resolution of the output grid.
1. The polygon of the AOI must be closed and properly defined.
2. The grid size must be within a default range, depending on image resolution.
3. The image within the AOI shall show clear wave crest patterns.
APPLI1-VTS-06
Determination of the FFT boxes for each point of the final grid
The wavelength, wave direction and wave number of the swell are retrieved from each FFT box.
No special input needed. 1. The image is cropped into N FFT boxes, which depend on the resolution of the grid.
2. Check that the wavelength lies within a pre-defined interval.
3. Check that the direction of the swell lies within the expected interval, from the visual inspection (APPLI1-VTS-04).
APPLI1-VTS-07
Depth inversion algorithm
The bathymetry map is created for the final grid.
No special input needed. 1. Creation of the bathymetry map. The estimated depths must be coherent, particularly at adjacent grid points. If there are abrupt bed changes within the AOI, these shall be verified by other means, as the algorithms may fail above such features (e.g. rocky beds, reefs, submerged structures).
2. Validation of the final outputs with in situ data (error and statistical estimation).
3. The outputs of all processing steps shall be georeferenced maps, loadable in QGIS or SNAP software (GeoTIFF format)
APPLI1-VTS-08
Comparison of output with EMODNET and local soundings.
Validation of the final bathymetry map with in situ data.
EMODNET and/or in situ bathymetry
1. Validation of the final bathymetry map with in situ data.
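The FFT-box step (APPLI1-VTS-06) retrieves the dominant swell wavelength and direction from the spectrum of each sub-image. A minimal sketch, assuming a square box with uniform pixel spacing (the peak-picking details here are illustrative, not the project's exact algorithm):

```python
import numpy as np

def swell_from_fft_box(box, pixel_size):
    """Return the dominant swell wavelength (m) and direction (deg, with
    the inherent 180-degree ambiguity) from a square SAR intensity
    sub-image, via the peak of its 2-D power spectrum."""
    n = box.shape[0]
    spec = np.abs(np.fft.fftshift(np.fft.fft2(box - box.mean()))) ** 2
    # Zero the DC neighbourhood so the peak is a genuine wave component.
    c = n // 2
    spec[c - 1:c + 2, c - 1:c + 2] = 0.0
    iy, ix = np.unravel_index(np.argmax(spec), spec.shape)
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size))  # cycles/m
    kx, ky = freqs[ix], freqs[iy]
    wavelength = 1.0 / np.hypot(kx, ky)
    direction = np.degrees(np.arctan2(ky, kx)) % 180.0
    return wavelength, direction
```

Applied to a synthetic 160 m swell propagating along the x axis, the function recovers the imposed wavelength and a 0-degree direction.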
3.4 Bathymetry, benthic classification and water quality determination from optical imagery
3.4.1 Overview
The retrieval of bathymetry, benthic classification and water quality in shallow waters (i.e. typically between 0 and 8-10 m) from optical sensors has to take into account several parameters that modify the light backscattered upwards through the sea surface. These parameters are:
1. the water content, which affects the light travelling in the water column;
2. the light path length (hence the water depth);
3. the reflection (albedo) of the seafloor (hence the benthic classification).
The method put in place is semi-automatic (at least for version 1 of this application): successive steps have to be applied, each of them requiring a user confirmation. If the user does not confirm the validity of one step, then the process is stopped and may be repeated, either with a new image or with a new parametrization of the processing chain.
The optical bathymetry application development is rather well advanced, so the validation plan can cover functionality as well as a first level of scientific evaluation. This validation plan will be adapted and further detailed later in the project, when version 2 of the algorithm is ready.
3.4.2 Acceptance Test Specification
The following table details the acceptance tests to be performed to assess the validity of the Bathymetry, benthic classification and water quality
determination from optical imagery application. These will be numbered APPLI2-VTS-xx. Each VTS shall be tested in the order of specification in this table
(i.e. all the previous VTS are required for the acceptance of the current acceptance test).
Table 3-2: V1 Acceptance Tests Specifications for Bathy Optical
VTS No. Module/Result
Tested Test Scenario Required Inputs Pass/Fail Criteria
APPLI2-VTS-01
All Output format Intermediate outputs from the application steps 1. The outputs of all processing steps shall be georeferenced maps, loadable in QGIS software (GeoTIFF format)
APPLI2-VTS-02
Atmospheric correction (before visual check)
Images are corrected from Rayleigh scattering and gaseous absorption.
optical HR images (Sentinel2 in particular), additional VHR optical images if available
1. Source images contain land and water pixels
2. Atmospheric correction provides Bottom Of Atmosphere data in the Area of Interest
3. The lowest-quality pixels are flagged (clouds, shadows) – this is complemented by a visual check step
APPLI2-VTS-03
Pair optical image and bathymetry
External bathymetry data is available and associated with each satellite image considered
EMODNET or IH or Litto3D or SHOM bathymetry data
1. The Co-ReSyF application allows access to bathymetry data available over the area of interest and, when possible, at a date close to that of the satellite image considered.
APPLI2-VTS-04
Atmospheric correction - visual check
Flagged Bottom Of Atmosphere (BOA) images are available on the AOI
EMODNET or IH or Litto3D or SHOM bathymetry data paired with the image considered (same area)
1. The clouds detected are checked, and flagged if not already done (as well as their shadows)
2. In case of excessive coverage of clouds over the water, the image shall be removed from the list of processed images.
3. The optical image is compared with the gradients of bathymetry data (e.g. from IH, Litto3D, SHOM or EMODNET) to better understand where the changes in the optical image are due to water depth (that could be seen on the bathymetry) or rather to changes in sea bottom type.
APPLI2-VTS-05
Land Masking (NDWI)
The application produces a GeoTIFF file of the land/sea mask. The land mask is customisable by the user.
No special input needed, apart from output from previous step.
1. A land mask covers all the land areas. Land is masked with 0, water with 1.
2. The user has the possibility to adapt the threshold used in this module.
APPLI2-VTS-06
K-means marine pixel classification
The module provides a set of clusters defined from the atmospherically corrected image.
No special input needed. OPTIONAL: the user can specify a number of clusters (integer).
1. The number of clusters is set by default, but is also customisable by the user.
2. The brightest sea bottom class is numbered 1.
3. The image is “cut” into N clusters (N being the default cluster number or the value specified by the user).
4. The N clusters cover the full image, i.e. no data is lost in the process.
5. The clusters defined seem in line with the optical image perception and its comparison with the external bathymetry dataset: the colour changes between areas of different clusters are not linked to bathymetry changes, but probably to changes in sea bed types.
APPLI2-VTS-07
Bathymetry and water quality inversion (before or after visual inspection)
The process shall provide maps of water depth, bbp, CDOM and chlorophyll-a, plus cluster location, per cluster and per source image, as well as the corresponding combined maps (water depth, bbp, CDOM, chlorophyll-a and sea bottom classification).
Scalar values for the size of the pixels to consider in the processing (default values or user-defined)
1. The combined inversion maps (water depths, chlorophyll-a, bbp and CDOM) have continuous values at the limits of two clusters. If the analysis shows spatial heterogeneity over the map, the inversion process shall be adapted (e.g. change number of cluster).
2. The areas with water depths deeper than 15 meters are not considered (flagged).
APPLI2-VTS-08
Water level correction
The process provides bathymetry maps for each image treated (removing surge and tide components from the water level maps)
Tidal database depending on availability and provider, e.g. from FES2004 (1/8°x1/8°) & NCEP atmospheric pressures
1. In low-tide areas, the bathymetry is close to the water depth level.
2. The bathymetry retrieved is independent of the time of the day (referenced to the hydrographic reference level).
3. The comparison between the satellite-derived bathymetry and the external bathymetry data is reasonable. To perform this analysis, the data on the finer grid shall first be degraded onto a coarser grid corresponding either to the satellite or to the external dataset. Then, the correlation between the pixels shall be assessed, as well as a comparison of the patterns and the Root Mean Square Error (RMSE) between the two datasets. In addition, the standard deviation of the finer bathymetry shall be assessed and found reasonable over the area of interest.
APPLI2-VTS-09
Averaged bathymetry and estimate of variability
This step provides the median value of all the bathymetry results (and associated standard deviation) available over the same area of interest during a period shorter than six months.
In situ measurements of bathymetry
1. All the optical images used in the initial satellite dataset are chosen during a period shorter than three to six months, to avoid changes due to sediment dynamics. (The period can be further shortened considering the sedimentary characteristics of the chosen area. This can be decided depending on the final results produced, e.g. if the dispersion in bathymetry results from one image to another is too high.)
2. For each pixel of the area of interest, all the bathymetry values are stored and compared. The dispersion of the data is analysed, and outliers are removed. The median of the full dataset and the median of the dataset without outliers are compared; they shall be close for this module to be validated. Another method can be chosen to compute the final estimate of the area’s bathymetry if it shows better results. If no method gives good results, each of the images considered for the bathymetry computation shall be analysed and removed where needed, to improve the dispersion of the bathymetry data.
3. A confidence level is made available on the basis of the standard deviation results. The higher the standard deviation, the lower the confidence level.
4. The standard deviation is reasonable compared to the inherent variability of bathymetry in the area of interest.
5. The final bathymetry values are very close to the in situ measurements provided by the user, if any (correlation close to 1, small RMSE). At this first stage of the application validation, thresholds cannot be chosen; this phase is rather an evaluation that will be used to set the validation thresholds for the version 2 evaluation.
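The NDWI-based land-masking step (APPLI2-VTS-05), with its user-adjustable threshold, can be sketched as follows. The green/NIR band formulation and the default threshold of 0 are common conventions assumed here, not values taken from the project specification:

```python
import numpy as np

def ndwi_land_mask(green, nir, threshold=0.0):
    """Land/sea mask from NDWI = (green - nir) / (green + nir).
    Water pixels -> 1, land pixels -> 0, matching the APPLI2-VTS-05
    convention; `threshold` is the user-adjustable cut-off."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    denom = np.where(green + nir == 0.0, 1.0, green + nir)  # avoid /0
    ndwi = (green - nir) / denom
    return (ndwi > threshold).astype(np.uint8)
```

Water reflects more in the green than the near-infrared, so water pixels yield positive NDWI and are kept (value 1), while land pixels yield negative NDWI and are masked (value 0).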
3.5 Vessel detection & Oil Spill Detection with SAR & Optical
3.5.1 Overview
The vessel detection algorithm is based on the principle that vessels generally appear as bright objects over the relatively darker sea surface in SAR imagery.
The accuracy of detection can be limited by speckle noise in SAR data, weather conditions (e.g. increased sea surface roughness due to high winds will
decrease contrast between vessel objects and their background), and artefacts in the image that are often misclassified as vessels. In addition, the size of
the area of interest may limit the detection capability. For example, wide swath SAR imagery has the advantage of covering large areas but has relatively
poor spatial resolution, and thus limiting the minimum size of detectable vessels.
The oil spill detection algorithm aims to distinguish between dark, smoother areas (oil spill) and rougher areas (sea surface) in SAR imagery. VV polarised
SAR acquisitions should be preferred for this application as they generally provide higher contrast, when oil is floating on the sea surface. Certain ocean
features formed due to meteorological or oceanographic conditions can be misinterpreted as oil spills in SAR images. Common look-alikes (i.e. dark
structures resembling oil spills) include natural films, low wind surfaces, rain cells, shear zones, internal waves. The spectral signature of pixels in optical
imagery is also used to distinguish between water and oil spills, and the results are compared to the SAR outputs.
Additionally, both algorithms are mature enough that both technical verification and scientific validation (using external validation datasets) can be
performed.
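The dark-feature principle behind the SAR oil spill detector can be sketched in the same way as the vessel case, with the threshold inverted. The rule below is illustrative only, and it deliberately ignores the look-alike discrimination discussed above, which needs additional shape, contrast and wind-context tests on each candidate:

```python
import statistics

def detect_dark_regions(image, k=2.0):
    """Flag pixels much darker than the surrounding sea surface.

    A minimal sketch of dark-feature detection; it does not separate
    true slicks from look-alikes. Returns candidate pixel coordinates
    and their total area in pixels. `k` is an illustrative threshold.
    """
    pixels = [v for row in image for v in row]
    mean = statistics.fmean(pixels)
    std = statistics.pstdev(pixels)
    thr = mean - k * std
    dark = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v < thr]
    return dark, len(dark)

# 4x4 synthetic tile: bright rough sea (~50) with a small dark slick (~5).
tile = [[50, 52, 49, 51],
        [50,  5,  5, 50],
        [51,  5, 50, 49],
        [50, 51, 50, 52]]
coords, area = detect_dark_regions(tile)
print(coords, area)
```

The area (and, in the real processor, perimeter and extreme coordinates) of each detected object is what the metadata files in APPLI3-VTS-08/09 record.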
3.5.2 Acceptance Test Specification
The following table details the acceptance tests to be performed to assess the validity of the Vessel Detection & Oil Spill Detection with SAR & Optical application. These will be numbered APPLI3-VTS-xx. Each VTS shall be tested in the order of specification in these tables (i.e. all preceding VTS must have passed before the current acceptance test is run). Please note that certain validation tests are identical for both the vessel detection and oil spill detection applications.
Table 3-3: V1 Acceptance Tests Specifications for Vessel Detection
VTS No. Module/Result Tested Test Scenario Required Inputs Pass/Fail Criteria
APPLI3-VTS-01
Pre-processing analysis
Images are calibrated and despeckled, geometrically corrected and the land is masked out
Data: SAR image Tools: Co-ReSyF visualisation tools, calibration and despeckling tool, georeferencing tool, land masking tool
1. The images are calibrated
2. The images appear despeckled compared to original after application of despeckling filter of choice
3. The images are georeferenced
4. Land has been successfully masked out
5. The pre-processing outputs are in GeoTIFF format
APPLI3-VTS-02
Select Area of Interest (AOI)
Original image is cropped to select only AOI (volume reduction)
Data: SAR image in GeoTIFF format (output of APPLI3-VTS-01 module) Tools: Co-ReSyF visualisation tools, AOI selection and crop tool
1. Cropped image contains AOI
APPLI3-VTS-03
Application of ship detection algorithm
Detection of brightest pixels and objects in each image tile. Generation of metadata file.
Data: SAR image in GeoTIFF format (output of APPLI3-VTS-02 module) Tools: Co-ReSyF visualisation tools, ship detection algorithm
1. The image (tile) is a georeferenced binary image, where white objects represent detected vessels (including false alarms)
2. Metadata file (.txt format) includes central pixel coordinates, size and direction for each detected bright object
APPLI3-VTS-04
Merging image tiles
Generation of original AOI image from all image tiles used for parallel processing
Data: Georeferenced binary images (outputs of APPLI3-VTS-03 module) Tools: Co-ReSyF visualisation tools, merging tool (with option to save as *.shp)
1. The binary result tile images are merged back into the original AOI image
2. The merged image is in shapefile (.shp) format
3. The shapefile contains an attribute table that lists all metadata files (.txt) that were generated per each tile in the previous step
APPLI3-VTS-05
AIS shapefile generation
Query based on satellite image date/time acquisition (allowing user-defined temporal window) generates list of AIS data that are suitable for match-up pairs
Data: Merged *.shp images (output of APPLI3-VTS-04 module), external AIS database (e.g. EMSA) Tools: Query tool
1. Generation of shapefile (.shp) with AIS data that fall on, or around (temporal window is defined by user), the date and time of the SAR image acquisition
APPLI3-VTS-06
All Output data format Data: AOI image in GeoTIFF format, binary (merged) result over the AOI in shapefile format, AIS match-up dates/times shapefile Tools: Co-ReSyF catalogue is sufficient for verification of outputs and scientific validation using these outputs
1. The outputs of the vessel detection processor are three: (i) AOI image in GeoTIFF format, (ii) binary (merged) result over the AOI in shapefile format, (iii) AIS match-up dates/times shapefile
Table 3-4: V1 Acceptance Tests Specifications for Oil Spill Detection
VTS No. Module/Result Tested Test Scenario Required Inputs Pass/Fail Criteria
APPLI3-VTS-01
Pre-processing analysis (SAR)
SAR images are calibrated and despeckled, geometrically corrected and the land is masked out
Data: SAR image Tools: Co-ReSyF visualisation tools, calibration and despeckling tool, georeferencing tool, land masking tool
1. The images are calibrated
2. The images appear despeckled compared to original after application of despeckling filter of choice
3. The images are georeferenced
4. Land has been successfully masked out
5. The pre-processing outputs for the SAR images are in GeoTIFF format
APPLI3-VTS-07
Pre-processing analysis (optical)
Optical images are atmospherically corrected and the land/clouds are masked out
Data: Medium spatial resolution optical imagery, e.g. Sentinel-2, or Landsat that temporally and spatially coincides with SAR image used in APPLI3-VTS-01 module Tools: Co-ReSyF visualisation tools, atmospheric correction, land mask, cloud mask tools
1. The images are atmospherically corrected into bottom of atmosphere reflectance
2. Land has been successfully masked out
3. Clouds have been successfully masked out (user needs to be aware of cloud shadows that will still remain in the image)
4. The pre-processing outputs for the optical images are in GeoTIFF format
APPLI3-VTS-02
Select Area of Interest, AOI (SAR and optical)
Original image is cropped to select only AOI (volume reduction)
Data: SAR image in GeoTIFF format (output of APPLI3-VTS-01 module), Optical image in GeoTIFF format (output of APPLI3-VTS-07 module) Tools: Co-ReSyF visualisation tools, AOI selection and crop tool
1. Cropped image (SAR/optical) contains AOI
APPLI3-VTS-08
Application of oil spill detection algorithm (SAR)
Detection of dark features on the sea surface and classify as oil spills or look-alikes
Data: AOI SAR image (output of APPLI3-VTS-02 module) Tools: Co-ReSyF visualisation tools, oil spill detection algorithm for SAR
1. The image (tile) is a georeferenced binary image, where white objects represent detected oil spills (including false alarms)
2. Metadata file (.txt format) includes central and extreme pixel coordinates, area and perimeter for each detected object
APPLI3-VTS-09
Application of oil spill detection algorithm (optical)
Separation of oil spills from water or other ocean features is performed on pixel-based spectral signatures
Data: AOI optical image (output of APPLI3-VTS-02 module) Tools: Co-ReSyF visualisation tools, oil spill detection algorithm for optical
1. The image (tile) is a georeferenced binary image, where white objects represent detected oil spills (including false alarms)
2. Metadata file (.txt format) includes central and extreme pixel coordinates, area and perimeter for each detected object
APPLI3-VTS-04
Merging image tiles (SAR and optical)
Generation of original AOI image from all image tiles used for parallel processing
Data: Georeferenced binary SAR and optical images (outputs of APPLI3-VTS-08 and APPLI3-VTS-09 modules) Tools: Co-ReSyF visualisation tools, merging tool (with option to save as *.shp)
1. The binary result image tiles (SAR/optical) are merged back into the original AOI image
2. The merged image is in shapefile (.shp) format
3. The shapefile contains an attribute table that lists all metadata files (.txt) that were generated per each image tile in the previous steps
APPLI3-VTS-10
Auxiliary data shapefile generation
Query based on satellite image (SAR and optical) date/time acquisition (allowing user-defined temporal window) generates list of auxiliary data that are suitable for match-up pairs
Data: Merged *.shp images for SAR and optical (output of APPLI3-VTS-04 module), external oil slick event database, external meteorological data (wind speed and direction, precipitation events) Tools: Query tool
1. Generation of shapefile (.shp) with auxiliary data that fall on, or around (temporal window is defined by user), the date and time of the SAR and optical image acquisition
APPLI3-VTS-11
All
Output data format
Data: AOI SAR and optical images in GeoTIFF format, binary (merged) result over the AOI in shapefile format (SAR), binary (merged) result over the AOI in shapefile format (optical), auxiliary match-up dates/times shapefile Tools: Co-ReSyF catalogue is sufficient for verification of outputs and scientific validation using these outputs
1. The outputs of the oil spill detection processor are four: (i) AOI SAR and optical images in GeoTIFF format, (ii) binary (merged) result over the AOI in shapefile format (SAR), (iii) binary (merged) result over the AOI in shapefile format (optical), (iv) auxiliary match-up dates/times shapefile
3.6 Time series processing for hyper temporal optical data analysis
3.6.1 Overview
This study will explore a data-driven approach to extracting oceanic boundary region information. The methodology has roots in the LaHMa methodology described by De Bie et al. (2012), adapted for use on different parameters and in oceanic (rather than terrestrial) regions. It uses a series of image-processing-based clustering steps, and GIS-based boundary extraction steps, to derive information on regions of the Earth's oceans where one is more likely to encounter a boundary whilst transiting across them. For the purposes of this validation exercise, the oceanic application of the LaHMa methodology is being deployed on Sea Surface Temperature (SST) data.
Data processing for the algorithm development phase using SST data consists of the following:
1. Extraction of image data from archive format (NetCDF, Zip etc.).
2. Multiple image processing to clean sequential raw L4 images of poor-quality data, using error quality flags (temporal cleaning).
3. Generation of random value images within the statistical range of the real data.
4. Consolidation of the extracted real temporal data, and randomised temporal data, into hyper-temporal data cubes.
5. Implementation of multiple unsupervised ISODATA clustering algorithm runs on each data cube.
6. Extraction of cluster boundaries.
7. Subsequent processing to a single layered raster spatial dataset containing data on the relative regularity of boundary occurrence within pixels.
8. Statistical analysis and verification of patterning observed.
These are fully described in the Co-ReSyF Algorithm Theoretical Basis Document. These will be distilled into modules and uploaded onto the Co-ReSyF
framework for future users to integrate into their processing sequences.
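Steps 5 to 7 of the pipeline above (repeated clustering, boundary extraction, and summation into a boundary-regularity raster) can be sketched as follows. The 4-neighbour boundary rule and the toy cluster maps are illustrative stand-ins for the 91 ISODATA outputs described in the ATBD:

```python
def boundary_map(classified):
    """Mark pixels whose class differs from the right or lower neighbour."""
    rows, cols = len(classified), len(classified[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if (c + 1 < cols and classified[r][c] != classified[r][c + 1]) or \
               (r + 1 < rows and classified[r][c] != classified[r + 1][c]):
                out[r][c] = 1
    return out

def boundary_frequency(cluster_runs):
    """Sum boundary maps over many clustering runs (steps 6-7 above).

    `cluster_runs` stands in for the 91 ISODATA outputs; pixels with
    high sums are the most regularly recurring cluster boundaries.
    """
    rows, cols = len(cluster_runs[0]), len(cluster_runs[0][0])
    freq = [[0] * cols for _ in range(rows)]
    for run in cluster_runs:
        b = boundary_map(run)
        for r in range(rows):
            for c in range(cols):
                freq[r][c] += b[r][c]
    return freq

# Two toy runs agreeing on a vertical boundary between columns 1 and 2.
run_a = [[0, 0, 1, 1] for _ in range(3)]
run_b = [[2, 2, 3, 3] for _ in range(3)]
print(boundary_frequency([run_a, run_b]))
```

Pixels where many independent runs place a boundary accumulate high sums, which is exactly the "relative regularity of boundary occurrence" that step 7 records.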
The Hyper Temporal application development is in a prototype testing stage, so this validation plan can only cover functionality verification, not scientific evaluation. The plan will be adapted and further detailed later in the project, when version 2 of the algorithm is ready.
3.6.2 Acceptance Test Specification
The following table details the acceptance tests to be performed to assess the validity of the Time series processing for hyper temporal optical data analysis application. These will be numbered APPLI4-VTS-xx. Note that for these tests to be conducted, a sample set of data shall be run through the Co-ReSyF platform independently, and through the same processing steps performed on a desktop PC. Many of these tests shall involve comparing outputs from the different processing stages.
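The platform-versus-desktop comparisons that recur in these tests amount to a cell-by-cell equivalence check between two rasters. A minimal sketch follows; the tolerance parameter is an illustrative allowance for small numeric differences between the two processing environments:

```python
def grids_equivalent(platform, desktop, tol=0.0):
    """Cell-by-cell comparison of a platform output against its desktop twin.

    Returns (equal, mismatches), where mismatches lists tuples of
    (row, col, platform_value, desktop_value). `tol` is an illustrative
    allowance for small numeric differences between environments.
    """
    mismatches = [(r, c, pv, dv)
                  for r, (prow, drow) in enumerate(zip(platform, desktop))
                  for c, (pv, dv) in enumerate(zip(prow, drow))
                  if abs(pv - dv) > tol]
    return not mismatches, mismatches

# A single differing cell fails the check and is reported for inspection.
ok, diffs = grids_equivalent([[1, 2], [3, 4]], [[1, 2], [3, 5]])
print(ok, diffs)  # -> False [(1, 1, 4, 5)]
```

Reporting the mismatch locations, rather than a bare pass/fail, helps trace which processing stage introduced a divergence.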
Table 3-5: V1 Acceptance Tests Specifications for hypertemporal TS
VTS No. Module/Result Tested Test Scenario Required Inputs Pass/Fail Criteria
APPLI4-VTS-01
Extraction of image data
Ingestion into the system of data extracted from the raw netCDF, or zipped files
The Raw data files, in a format held in the online archive.
Data image successfully extracted from the original data file, into a form ready for further processing.
APPLI4-VTS-02
Compilation of data cubes (from real & randomised data)
Verification that the data cube generated contains the correct data images, in the correct sequence.
Data cube generated by the process.
Original images input into the data cube.
ArcGIS software
Independent desktop verification that each image has successfully been integrated into the data cube, in the correct order, with no loss of the required data values for further processing.
APPLI4-VTS-03
Generation of randomised image datasets
Verification that the randomised image data is generated within the statistical parameter range constraints sourced from the data cube
Original data cube, and input images
Associated images containing the randomised data.
ArcGIS software
Independent desktop verification that each image’s values lie within the range allowed by the statistical constraints.
APPLI4-VTS-04
ISODATA Clustering of the data cubes
Independent verification that the 91 cluster outputs extracted from the data cube by the Co-ReSyF system are equivalent to those extracted by the same methodology operating on a desktop computer
91 cluster images obtained from the real data cube using the Co-ReSyF platform
91 cluster images obtained from the real data cube using Erdas Imagine on a desktop PC.
ArcGIS software
Independent comparison of like versus like cluster images (e.g. image of platform-derived 10 clusters, versus image of desktop-derived 10 clusters), and verification that output images are equal.
APPLI4-VTS-05
Extraction of the cluster boundaries areas from the cluster images.
Independent Verification that for each of the 91 cluster output image files extracted from the data cube by the Co-ReSyF system, the boundaries between clusters have been successfully captured by the boundary extraction algorithm, and integrated into boundary extraction output image file
91 output cluster images
91 extracted boundary image files
Arc GIS software
Independent verification that boundary value pixels in the extracted boundary output image files, are spatially located in inter-cluster boundary areas in the classification output cluster image files.
APPLI4-VTS-06
Boundary summation
Verification that the raster calculator functionality is operating properly
Boundary presence/absence files.
Summed boundary files from the Co-ReSyF platform
ArcGIS software
Determine whether the boundary file generated using the platform, and that generated on the desktop PC are equivalent.
Determine whether the land/ocean boundary areas have the highest value.
APPLI4-VTS-07
Validation of pattern presence
Verification that the statistical analyses are being conducted successfully by the system
Statistical outputs generated by the Platform. Statistical outputs generated from the desktop PC processing.
Verification that the same statistical conclusions are reached from the same data being processed using the two different system pathways (platform and desktop PC).
3.7 Ocean and Coastal altimetry
3.7.1 Overview
The ocean and coastal altimetry application allows users direct access to along-track altimeter data reprocessed with state-of-the-art algorithms that make them amenable to applications in the coastal zone and over challenging open-ocean situations such as sub-mesoscale dynamics and slicks. The user has ample flexibility in the selection of the altimetric mission and the spatial and temporal ranges. The spatial domain can be specified by lat/lon boxes, by a polygon in .kml format, or by selection from a pre-defined list of regions (the user can also define new regions for inclusion in the list). Alternatively, the user can specify a list of passes with longitude/latitude limits. The temporal limits can be entered as dates or orbital cycle ranges. The reprocessing is customisable by expert users, by prescribing the application of specific corrections; otherwise, 'recommended' combinations of corrections are selectable to aid non-experts. The content of the output files is also customisable to include the user's preferred selection of estimated geophysical parameters and corrections; again, some 'recommended' options are available to aid non-experts.
Ocean altimetry is a fully mature technique, while coastal altimetry is rapidly achieving maturity thanks to the extensive work by many laboratories, reflected in the contributions to the Coastal Altimetry Workshop series (whose 10th edition is being held in Florence in February 2017) and in a steadily growing body of peer-reviewed publications. Some of the techniques developed for coastal altimetry may give an improved view of specific phenomena (like sub-mesoscale features and oil slicks) over the open ocean, so the time is ripe for a processor that allows generation of the relevant products anywhere.
Acceptance tests APPLI5-VTS-01 to APPLI5-VTS-07 below are meant as functionality tests only, i.e. to verify the input/processing/output of each module. In contrast, acceptance test APPLI5-VTS-08 is a proper scientific validation test, which also acts as an overall verification of the functionalities of the whole selection-plus-processing chain.
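The two pass/fail criteria of the scientific validation test APPLI5-VTS-08 (more valid coastal measurements, and non-decreasing correlation against tide gauges) can be sketched as follows. The matched-series representation and all names are illustrative assumptions; the actual match-up of altimetry with tide gauge records is assumed to have been done beforehand:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def passes_vts08(standard, reprocessed, gauge):
    """Apply the two APPLI5-VTS-08 criteria to matched SSHA series.

    Each series is co-located sample-by-sample with the tide gauge
    readings; None marks an invalid altimeter measurement.
    """
    valid_std = [(s, g) for s, g in zip(standard, gauge) if s is not None]
    valid_rep = [(s, g) for s, g in zip(reprocessed, gauge) if s is not None]
    # Criterion 1: more valid measurements recovered with reprocessed data.
    crit1 = len(valid_rep) > len(valid_std)
    # Criterion 2: correlation with the tide gauge does not decrease.
    crit2 = pearson_r(*zip(*valid_rep)) >= pearson_r(*zip(*valid_std))
    return crit1 and crit2

gauge = [0.10, 0.20, 0.15, 0.30, 0.25]        # tide gauge SSH (m)
standard = [0.12, None, 0.14, None, 0.27]     # standard product, gaps near coast
reprocessed = [0.11, 0.22, 0.16, 0.28, 0.24]  # reprocessed product
print(passes_vts08(standard, reprocessed, gauge))
```

The synthetic example mimics the expected behaviour: coastal reprocessing recovers measurements that the standard product flags as invalid, without degrading the correlation with the gauge.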
3.7.2 Acceptance Test Specification
The following table details the acceptance tests to be performed to assess the validity of the Ocean and Coastal altimetry application. These will be
numbered APPLI5-VTS-xx.
Table 3-6 : V1 Acceptance Tests Specifications for Altimetry
VTS No. Module/Result Tested Test Scenario Required Inputs Pass/Fail Criteria
APPLI5-VTS-01
All modules
Output format
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. The outputs of all processing steps shall be along-track CF-compliant netCDF files, loadable by most common netCDF readers (for instance Ferret)
APPLI5-VTS-02
Input data selection module
User specifies mission, pass list, cycle list or time limits, latitude limits
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. Selection module locates and plots/lists correct list of all SGDR data to process
APPLI5-VTS-03
Input data selection module
User specifies region from predefined list and time limits
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. Selection module locates and plots/lists correct list of all SGDR data to process
APPLI5-VTS-04
Input data sub-selection module
User selects a few passes from list
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. Module correctly passes to processor the sub-selection of SGDR data to process
APPLI5-VTS-05
Retracker module
Retracking of example data from different missions with appropriate algorithm
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. Retracked quantities (range, SWH, sigma0) for each mission and algorithm compare favourably (i.e. have equivalent or less noise) with original data
APPLI5-VTS-06
Correction module
Different combinations of range corrections specified by the user
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. Corrections are properly applied according to user selection
APPLI5-VTS-07
Output module
Output with different degrees of richness, as specified by the user
No special input needed, Co-ReSyF catalogue (including all altimeter SGDR input data and catalogue of passes) is sufficient.
1. netCDF output is generated at different levels of complexity (i.e. including different combinations of variables) according to user specification
APPLI5-VTS-08
All modules and main application results
Validation of output reprocessed SSHA against selected tide gauges
Tide Gauge data from UK National Tidal and Sea Level Facility
1. The number of valid measurements recovered in the coastal zone increases when using reprocessed data w.r.t. using standard data.
2. Correlation r between time series (altimetry vs tide gauges) does not decrease when using reprocessed data w.r.t. using standard data.
3.8 Acceptance Test Procedures
The procedures for the acceptance tests will be defined in the Validation Report document according to the template presented in Annex I (Section 5). They are specific to each application. Each acceptance test procedure will cover one or more test specifications (from the application tables above), and will have a unique ID. Each step of the procedure will be numbered. The possible results for a test step are either PASS or FAIL; no other value is allowed. In cases where no P/F criterion is specified, the result of the step is assessed by the ability to execute the step.
4 References
Co-ReSyF (2016a). Grant Agreement 687289. European Commission, Research Executive Agency.
Co-ReSyF (2016b). Research Applications ATBD, issue 1.1. European Commission, Research Executive Agency.
Co-ReSyF (2016c). Project Management Plan, issue 1.0. European Commission, Research Executive Agency.
De Bie, C.A.J.M.; Nguyen, T.T.H.; Ali, A.; Scarrott, R.; Skidmore, A.K. (2012). LaHMa: a landscape heterogeneity mapping method using hyper-temporal datasets. International Journal of Geographic Information Science, DOI:10.1080/13658816.2012.712126.
Deimos (2016a). Verification and Validation Procedure, EDG-CMS-PRO16-E, Issue 2.0. ELECNOR DEIMOS Corporate Management System.
Deimos (2016b). Non-Conformance Management Procedure, EDG-CMS-PRO12-E, Issue 3.0. ELECNOR DEIMOS Corporate Management System.
5 Annex I: Template for the Test Procedures
VALIDATION TEST PROCEDURE (VTP)
VTP Id.:
Id of Associated Validation Test Specification(s):
Module to be Tested:
Required Test Environment:
Overview of the test procedure:
Detailed description of the test procedure, including how to observe and verify the results:
Step Nb. Description Result Pass/Fail Criteria Comments
Date of execution:
END OF DOCUMENT