SVERF
Hydrology SAF
System Verification File
Reference Number: SAF/HSAF/SVERF/0.5
Issue/Revision Index: Issue 0.5
Last Change: 08/02/2008
DOCUMENT SIGNATURE TABLE
Name Date Signature
Prepared by: H-SAF Project Team 08/02/2008
Approved by: H-SAF Project Manager 08/02/2008
DOCUMENT CHANGE RECORD
Issue / Revision Date Description
Version 0.5 08/02/2008 Preliminary Release prepared for CDR
DISTRIBUTION LIST
Country Organization Name Contact
Austria ZAMG Veronika Zwatz-Meise [email protected]
ZAMG Alexander Jann [email protected]
TU-Wien / IPF Wolfgang Wagner [email protected]
TU-Wien / IPF Stefan Hasenauer [email protected]
Belgium IRM Emmanuel Roulin [email protected]
IRM Gaston Demaree [email protected]
ECMWF Philippe Bougeault [email protected]
Matthias Drusch [email protected]
Klaus Scipal [email protected]
Finland FMI Pirkko Pylkko [email protected]
FMI Jarkko Koskinen [email protected]
FMI Jouni Pulliainen [email protected]
FMI Panu Lahtinen [email protected]
TKK Juha-Petri Karna [email protected]
SYKE Sari Metsämäki [email protected]
France Météo France Jean-Christophe Calvet [email protected]
Météo France Laurent Franchisteguy [email protected]
CNRS-CETP Mehrez Zribi [email protected]
CNRS-CESBIO Patricia de Rosnay [email protected]
Germany BfG Thomas Maurer [email protected]
Hungary OMSZ Eszter Lábó [email protected]
Italy USAM Massimo Capaldo [email protected]
CNMCA Luigi De Leonibus [email protected]
CNMCA Costante De Simone [email protected]
CNMCA Francesco Zauli [email protected]
CNMCA Daniele Biron [email protected]
CNMCA Davide Melfi [email protected]
CNMCA Attilio Di Diodato [email protected]
CNMCA Lucio Torrisi [email protected]
CNMCA Massimo Bonavita [email protected]
CNMCA Adriano Raspanti [email protected]
CNMCA Alessandro Galliani [email protected]
DPC Roberto Sorani [email protected]
DPC Luca Rossi [email protected]
DPC Silvia Puca [email protected]
DPC Angela Corina [email protected]
DPC William Nardin [email protected]
CNR-ISAC Franco Prodi [email protected]
CNR-ISAC Bizzarro Bizzarri [email protected]
CNR-ISAC Alberto Mugnai [email protected]
CNR-ISAC Vincenzo Levizzani [email protected]
CNR-ISAC Francesca Torricella [email protected]
CNR-ISAC Stefano Dietrich [email protected]
CNR-ISAC Francesco Di Paola [email protected]
Ferrara University Federico Porcu' [email protected]
Ferrara University Davide Capacci [email protected]
DATAMAT Flavio Gattari [email protected]
DATAMAT Emiliano Agosta [email protected]
Poland IMWM Piotr Struzik [email protected]
IMWM Bozena Lapeta [email protected]
IMWM Jerzy Niedbala [email protected]
IMWM Jaga Niedbala [email protected]
IMWM Monika Pajek [email protected]
IMWM Jakub Walawender [email protected]
IMWM Jan Szturc [email protected]
Romania NMA Andrei Diamandi [email protected]
NMA Simona Oancea [email protected]
Slovakia SHMÚ Ján Kaňák [email protected]
SHMÚ Marián Jurašek [email protected]
SHMÚ Marcel Zvolenský [email protected]
Turkey TSMS Fatih Demir [email protected]
TSMS Ali Umran Komuscu [email protected]
TSMS Ibrahim Sonmez [email protected]
TSMS Aydın Erturk [email protected]
METU Ali Unal Sorman [email protected]
METU Zuhal Akyurek [email protected]
METU Orhan Gokdemir [email protected]
METU Ozgur Beser [email protected]
ITU Ahmet Oztopal [email protected]
Anadolu University Aynur Sensoy [email protected]
EUMETSAT Lorenzo Sarlo [email protected]
Frédéric Gasiglia [email protected]
Dominique Faucher [email protected]
Jochen Grandell [email protected]
Lothar Schüller [email protected]
TABLE OF CONTENTS

1 Introduction
1.1 Purpose
1.2 Scope
1.3 Document Overview
1.4 Glossary
1.5 References
1.5.1 Applicable Documents
1.5.2 Reference Documents
2 System Verification Overview
2.1 General
2.2 The Approach
2.3 System Verification activities
2.3.1 Test Procedure failures management
2.4 Test activities specification
2.4.1 Naming convention
2.4.2 Test Case specification
2.4.3 Test Procedure specification
2.5 Test results specification
2.5.1 Test Results form
2.5.2 Test Report form
3 H-SAF System Components test
3.1 General Requirements test
3.1.1 Test Case TC_ST_GEN_010
3.1.2 Test Procedure TP_ST_GEN_010
3.1.3 Test Case TC_ST_GEN_020
3.1.4 Test Case TC_ST_GEN_030
3.1.5 Test Case TC_ST_GEN_040
3.1.6 Test Case TC_ST_GEN_050
3.2 Data Acquisition Requirements test
3.2.1 Test Case TC_ST_ACQ_020
3.2.2 Test Case TC_ST_ACQ_030
3.2.3 Test Case TC_ST_ACQ_040
3.2.4 Test Case TC_ST_ACQ_050
3.2.5 Test Case TC_ST_ACQ_060
3.2.6 Test Case TC_ST_ACQ_070
3.2.7 Test Case TC_ST_ACQ_080
3.2.8 Test Case TC_ST_ACQ_090
3.2.9 Test Case TC_ST_ACQ_100
3.2.10 Test Case TC_ST_ACQ_110
3.2.11 Test Case TC_ST_ACQ_120
3.2.12 Test Case TC_ST_ACQ_130
3.2.13 Test Case TC_ST_ACQ_140
3.2.14 Test Case TC_ST_ACQ_150
3.2.15 Test Case TC_ST_ACQ_160
3.3 Pre-processing Requirements test
3.3.1 Test Case TC_ST_PPR_010
3.4 Product Generation Requirements test
3.4.1 Test Case TC_ST_PRG_010
3.4.2 Test Case TC_ST_PRG_020
3.4.3 Test Case TC_ST_PRG_030
3.4.4 Test Case TC_ST_PRG_040
3.4.5 Test Case TC_ST_PRG_050
3.4.6 Test Case TC_PRG_010
3.4.7 Test Case TC_ST_PRG_060
3.4.8 Test Case TC_ST_PRG_070
3.4.9 Test Case TC_ST_PRG_080
3.4.10 Test Case TC_ST_PRG_090
3.4.11 Test Case TC_ST_PRG_100
3.4.12 Test Case TC_ST_PRG_110
3.4.13 Test Case TC_ST_PRG_120
3.4.14 Test Case TC_ST_PRG_130
3.4.15 Test Case TC_ST_PRG_140
3.4.16 Test Case TC_ST_PRG_150
3.4.17 Test Case TC_ST_PRG_160
3.4.18 Test Case TC_ST_PRG_170
3.4.19 Test Case TC_ST_PRG_180
3.4.20 Test Case TC_ST_PRG_190
3.4.21 Test Case TC_ST_PRG_200
3.5 Reporting Requirements test
3.5.1 Test Case TC_ST_REP_010
3.5.2 Test Case TC_ST_REP_020
3.5.3 Test Case TC_ST_REP_030
3.6 Archiving Requirements
3.6.1 Test Case TC_ST_ARC_010
3.6.2 Test Case TC_ST_ARC_020
3.6.3 Test Case TC_ST_ARC_030
3.6.4 Test Case TC_ST_ARC_040
3.6.5 Test Case TC_ST_ARC_050
3.6.6 Test Case TC_ST_ARC_060
3.7 Dissemination Requirements test
3.7.1 Test Case TC_ST_DIS_010
3.7.2 Test Case TC_ST_DIS_020
3.7.3 Test Case TC_ST_DIS_030
3.7.4 Test Case TC_ST_DIS_040
3.7.5 Test Case TC_ST_DIS_050
3.7.6 Test Case TC_ST_DIS_060
3.7.7 Test Case TC_ST_DIS_070
3.7.8 Test Case TC_ST_DIS_080
3.7.9 Test Procedure TP_ST_DIS_080
3.8 Reliability, Availability and Maintenance Requirements test
3.8.1 Test Case TC_ST_RAM_010
3.8.2 Test Case TC_ST_RAM_020
3.8.3 Test Case TC_ST_RAM_030
3.8.4 Test Case TC_ST_RAM_040
3.9 Operational Requirements test
3.9.1 Test Case TC_ST_OPE_010
3.9.2 Test Case TC_ST_OPE_020
3.9.3 Test Case TC_ST_OPE_030
3.10 Performance Requirements test
3.10.1 Test Case TC_ST_PER_010
3.10.2 Test Case TC_ST_PER_020
3.11 Portability Requirements test
3.11.1 Test Case TC_ST_POR_010
3.12 Resources Requirements test
3.12.1 Test Case TC_ST_RES_010
3.12.2 Test Case TC_ST_RES_020
3.12.3 Test Procedure TP_ST_RES_020
3.13 Safety Requirements test
3.13.1 Test Case TC_ST_SAF_010
3.13.2 Test Procedure TP_ST_SAF_010
3.13.3 Test Case TC_ST_SAF_020
3.13.4 Test Case TC_ST_SAF_030
3.13.5 Test Case TC_ST_SAF_040
3.14 Security Requirements test
3.14.1 Test Case TC_ST_SEC_010
3.15 Documentation Requirements test
3.15.1 Test Case TC_ST_DOC_010
4 Requirements Traceability
4.1 System Requirements traceability to System Test Cases
5 TBDs/TBCs list
6 Appendix A - Test tools

LIST OF TABLES

Table 1 Test Case specification template
Table 2 Test Procedure specification template
Table 3 Test Result form
Table 4 Test Report form
Table 5 Classes of messages
Table 6 System requirement vs. System/Component Test Cases
Table 7 TBDs/TBCs list
Table 8 Test tools
1 Introduction
1.1 Purpose
This document is the System Verification File (SVERF) for the Satellite Application Facility on support to Operational Hydrology and Water Management (H-SAF). It describes all test activities of the H-SAF project at System/Subsystem level. The founding principles of the ECSS standards have been adopted in H-SAF Engineering, such as the tailoring concept and the strict relationship between Software Engineering and System Engineering. These principles are particularly meaningful in the H-SAF context, since H-SAF is a complex system in which the software layer is just one of many different layers building up the heterogeneous set-up.
1.2 Scope
This document describes the test activities at System/Subsystem level for the generation of Precipitation, Soil Moisture and Snow Parameters products and for their centralization, archiving and dissemination, to be provided at the H-SAF sites CNMCA (Italy), ZAMG (Austria), ECMWF (UK), FMI (Finland) and TSMS (Turkey), in accordance with what is declared in the System Integration, Verification & Validation Plan (SIVVP). It also covers the test activities at System/Subsystem level for the Hydro Validation Subsystem tasks (product validation, report production, etc.) and the Offline Monitoring Subsystem tasks.
1.3 Document Overview
Section 2 provides a general description of the systems and the related test activities.
Section 3 provides test cases and procedures concerning general requirements.
Section 4 provides test cases and procedures concerning data acquisition requirements.
Section 5 provides test cases and procedures concerning pre-processing requirements.
Section 6 provides test cases and procedures concerning product generation requirements.
Section 7 provides test cases and procedures concerning reporting requirements.
Section 8 provides test cases and procedures concerning archiving requirements.
Section 9 provides test cases and procedures concerning dissemination requirements.
Section 10 provides test cases and procedures concerning reliability, availability and maintenance requirements.
Section 11 provides test cases and procedures concerning operational requirements.
Section 12 provides test cases and procedures concerning performance requirements.
Section 13 provides test cases and procedures concerning portability requirements.
Section 14 provides test cases and procedures concerning resources requirements.
Section 15 provides test cases and procedures concerning safety requirements.
Section 16 provides test cases and procedures concerning security requirements.
Section 17 provides test cases and procedures concerning documentation requirements.
Section 18 provides the system test results, system test reports and system test results summary of the project.
Section 19 provides matrixes to verify the application of the system requirements.
Section 20 summarizes the TBD and TBC items, providing the responsible for each item and a planned date for resolution.
Section 21 provides Appendix A, which contains the list of the test tools used.
1.4 Glossary
AAPP AVHRR and ATOVS Processing Package
ADEOS Advanced Earth Observation Satellite (I and II)
ALOS Advanced Land Observing Satellite
AMIR Advanced Microwave Imaging Radiometer
AMSR Advanced Microwave Scanning Radiometer (on ADEOS-II)
AMSR-E Advanced Microwave Scanning Radiometer - E (on EOS-Aqua)
AMSU-A Advanced Microwave Sounding Unit - A (on NOAA satellites and EOS-Aqua)
AMSU-B Advanced Microwave Sounding Unit - B (on NOAA satellites up to NOAA-17)
API Application Program(ming) Interface
ASAR Advanced SAR (on ENVISAT)
ASCAT Advanced Scatterometer (on MetOp)
ASI Agenzia Spaziale Italiana
ATDD Algorithms Theoretical Definition Document
ATMS Advanced Technology Microwave Sounder (on NPP and NPOESS)
ATOVS Advanced TIROS Operational Vertical Sounder (on NOAA and MetOp)
AU Anatolian University
AVHRR Advanced Very High Resolution Radiometer (on NOAA and MetOp)
BAMPR Bayesian Algorithm for Microwave Precipitation Retrieval
BfG Bundesanstalt für Gewässerkunde
BRDF Bi-directional Reflectance Distribution Function
BVA Boundary Value Analysis
CASE Computer Aided System Engineering
CBA Component-Based Architecture
CBSD Component-based Software Development
CDA Command and Data Acquisition (EUMETSAT station at Svalbard)
CDD Component Design Document
CDR Critical Design Review
CESBIO Centre d'Etudes Spatiales de la BIOsphere (of CNRS)
CETP Centre d’études des Environnements Terrestres et Planétaires (of CNRS)
CI Configuration Item
CMIS Conical-scanning Microwave Imager/Sounder (on NPOESS)
CMP Configuration Management Plan
CNMCA Centro Nazionale di Meteorologia e Climatologia Aeronautica
CNR Consiglio Nazionale delle Ricerche
CNRM Centre Nationale de la Recherche Météorologique (of Météo-France)
CNRS Centre Nationale de la Recherche Scientifique
COM Component Object Model
CORBA Common Object Request Broker Architecture
COTS Commercial-off-the-shelf
CPU Central Processing Unit
CRD Component Requirement Document
CVERF Component Verification File
CVS Concurrent Versions System
DCOM Distributed Component Object Model
DEM Digital Elevation Model
DFD Data Flow Diagram
DMSP Defense Meteorological Satellite Program
DOF Data Output Format
DPC Dipartimento della Protezione Civile
DWD Deutscher Wetterdienst
E&T Education and Training
EARS EUMETSAT Advanced Retransmission Service (station)
ECMWF European Centre for Medium-range Weather Forecasts
ECSS European Cooperation on Space Standardization
EGPM European contribution to the GPM mission
EOS Earth Observing System
EPS EUMETSAT Polar System
ERS European Remote-sensing Satellite (1 and 2)
ESA European Space Agency
EUR End-User Requirements
FMI Finnish Meteorological Institute
FOC Full Operational Chain
FTP File Transfer Protocol
GEO Geostationary Earth Orbit
GIS Geographical Information System
GMES Global Monitoring for Environment and Security
GOMAS Geostationary Observatory for Microwave Atmospheric Sounding
GOS Global Observing System
GPM Global Precipitation Measurement mission
GPROF Goddard Profiling algorithm
GTS Global Telecommunication System
GUI Graphical User Interface
HMS Hungarian Meteorological Service
HRU Hydrological Response Unit
H-SAF SAF on support to Operational Hydrology and Water Management
HSB Humidity Sounder for Brazil (on EOS-Aqua)
HTML Hyper Text Markup Language
HTTP Hyper Text Transfer Protocol
HUT/LST Helsinki University of Technology / Laboratory of Space Technology
HVR Hydrological Validation Review
HYDRO Preliminary results of Hydrological validation
HYDROS Hydrosphere State Mission
HW Hardware
ICD Interface Control Document
ICT Information and Communication Technology
IEEE Institute of Electrical and Electronics Engineers
IFS Integrated Forecast System
INF Progress reports in between meetings
IMWM Institute of Meteorology and Water Management (of Poland)
IPF Institut für Photogrammetrie und Fernerkundung
ISAC Istituto di Scienze dell’Atmosfera e del Clima (of CNR)
ISO International Standards Organization
IT Information Technology
ITU Istanbul Technical University
JPS Joint Polar System (MetOp + NOAA/NPOESS)
J2EE Java 2 Enterprise Edition
KIDS Kestrel Interactive Development System
KLOC Thousand (Kilo) Lines Of Code
KOM Kick-Off Meeting
LAI Leaf Area Index
LAN Local Area Network
LEO Low Earth Orbit
LIS Lightning Imaging Sensor (on TRMM)
LLS Lower Level Specifications
LOC Lines Of Code
LST Solar Local Time (of a sun-synchronous satellite)
MARS Meteorological Archive and Retrieval System
MetOp Meteorological Operational satellite
METU Middle East Technical University (of Turkey)
MHS Microwave Humidity Sounder (on NOAA N/N’ and MetOp)
MIMR Multi-frequency Imaging Microwave Radiometer
MIN Minutes of Meetings/Reviews
MODIS Moderate-resolution Imaging Spectro-radiometer (on EOS Terra and Aqua)
MSG Meteosat Second Generation
MTBF Mean Time Between Failure
MTG Meteosat Third Generation
MTTR Mean Time To Repair
MVIRI Meteosat Visible Infra-Red Imager (on Meteosat 1 to 7)
N/A Not Available
N.A. Not Applicable
NASA National Aeronautics and Space Administration
NATO North Atlantic Treaty Organisation
NDI Non-developmental Items
NIMH National Institute for Meteorology and Hydrology (of Hungary)
NMS National Meteorological Service
NOAA National Oceanic and Atmospheric Administration (intended as a satellite series)
NPOESS National Polar-orbiting Operational Environmental Satellite System
NPP NPOESS Preparatory Programme
NRT Near-Real Time
NWP Numerical Weather Prediction
OAR Options Analysis for Reengineering
OFL Off-line
OMG Object Management Group
OO Object Oriented
OP Proposal for H-SAF Operational phase
OPS Operational Product Segment
ORB Object Request Broker
ORR Operations Readiness Review
OWL Web Ontology Language
PAC Prototype Algorithm Code
PALSAR Phased Array L-band Synthetic Aperture Radar (on ALOS)
PAW Plant Available Water
PDR Preliminary Design Review
POP Precipitation Observation Production
PP Project Plan
PPR Products Prototyping Reports
PR Precipitation Radar (on TRMM)
QoS Quality of Service
R&D Research and Development
RCS Revision Control System
REP Report
RMI Royal Meteorological Institute (of Belgium)
RR Requirements Review
RT Real Time
SAAM Simulation, Analysis and Modeling
SAF Satellite Application Facility
SAG Science Advisory Group
SAOCOM Argentinean Satellite for Observation and Communication
SAR Synthetic Aperture Radar
SA/SD Structured Analysis / Structured Design
SCA Snow Covered Area
SCAT Scatterometer (on ERS-1 and 2)
SCM Source Configuration Management
SD Snow depth
SDAS Surface Data Assimilation System
SDD System Design Document
SDP Software Development Plan
SEI Software Engineering Institute
SEVIRI Spinning Enhanced Visible Infra-Red Imager (on MSG)
SHW State Hydraulic Works of Turkey
SHFWG SAF Hydrology Framework Working Group
SHMI Slovak Hydrometeorological Institute
SIRR System Integration Readiness Review
SIVVP System Integration, Verification & Validation Plan
SLAs Service-Level Agreements
SM Summary Report
SMART Service Migration and Reuse Technique
SMMR Scanning Multichannel Microwave Radiometer (on SeaSat and Nimbus VII)
SMOS Soil Moisture and Ocean Salinity
SOA Service-Oriented Architecture
SoS System of Systems
SQL Structured Query Language
SR Snow Recognition
SRD System Requirements Document
SSM/I Special Sensor Microwave / Imager (on DMSP up to F-15)
SSMIS Special Sensor Microwave Imager/Sounder (on DMSP starting with F-16)
STRR System Test Results Review
SVALF System Validation File
SVERF System Verification File
SVRR System Validation Results Review
SW Software
SWE Snow Water Equivalent
SYKE Finnish Environment Institute
TBC To be confirmed
TBD To be defined
TC Test Case
TKK/LST Helsinki University of Technology / Laboratory of Space Technology
TMI TRMM Microwave Imager (on TRMM)
TP Test Procedure
TR Test Report
TRMM Tropical Rainfall Measuring Mission
TSMS Turkish State Meteorological Service
TU Wien Technische Universität Wien
UM User Manual
U-MARF Unified Meteorological Archive and Retrieval Facility
UML Unified Modelling Language
URD User Requirements Document
VIIRS Visible/Infrared Imager Radiometer Suite (on NPP and NPOESS)
VS Visiting Scientist
WBS Work Breakdown Structure
WMO World Meteorological Organization
WP Work Package
WPD Work Package Description
WS Workshop
W3C World Wide Web Consortium
XMI XML (eXtensible Markup Language ) Metadata Interchange
XML eXtensible Markup Language
ZAMG Zentralanstalt für Meteorologie und Geodynamik
1.5 References
1.5.1 Applicable Documents
[AD1] H-SAF Development Proposal - Issue 2.1, 15 May 2005
[AD2] H-SAF Project Plan (PP). Ref.: SAF/HSAF/PP/2.1
[AD3] H-SAF Configuration Management Plan (CMP). Ref.: SAF/HSAF/CMP/1.0
[AD4] H-SAF Users Requirement Document (URD). Ref.: SAF/HSAF/URD/2.1
[AD5] H-SAF Preliminary Design Review Organization Note. Ref.: EUM/PPS/PRC/06/0125
[AD6] H-SAF System Requirements Document (SRD). Ref.: SAF/HSAF/SRD/2.0
[AD7] H-SAF System Design Document (SDD). Ref.: SAF/HSAF/SDD/2.0
[AD8] H-SAF Component Requirements Document (CRD). Ref.: SAF/HSAF/CRD/1.0
[AD9] H-SAF Component Design Document (CDD). Ref.: SAF/HSAF/CDD/1.0
[AD10] H-SAF Interface Control Document (ICD). Ref.: SAF/HSAF/ICD/1.0
[AD11] Algorithmic Software Development Guidelines Document (ASDGD). Ref.: SAF/HSAF/ASDGD/0.1
[AD12] System Integration, Verification & Validation Plan (SIVVP). Ref.: SAF/HSAF/SIVVP/1.0
[AD13] Guide to Software Configuration Management. Ref.: ESA PSS-05-09 Issue 1 Rev. 1
[AD14] SAF Software Development Guidelines. Ref.: SAF/NET/EUM/SW/GD/MTR/01
[AD15] H-SAF Component Verification File (CVERF). Ref.: SAF/HSAF/CVERF/1.0
[AD16] H-SAF Hydrological Validation Plan (REP-2). Ref.: SAF/HSAF/WS-1/Rep-2
[AD17] H-SAF Algorithmic Software Development Guidelines Document (ASDGD) - internal issue. Ref.: SAF/HSAF/ASDGD/0.5
[AD18] H-SAF System Verification File (SVERF). Ref.: SAF/HSAF/SVERF/0.5
[AD19] Algorithms Theoretical Definition Document (ATDD). Ref.: SAF/HSAF/ATDD/1.0
[AD20] H-SAF System/Software Version Document (SSVD). Ref.: SAF/HSAF/SSVD/0.5
1.5.2 Reference Documents
[RD1] EUM.TD.08 MSG - Image Data Dissemination Service
[RD2] EPS AMSU-A L1 Product Generation Specification
[RD3] EPS AMSU-A L1 Product Format Specification
[RD4] EPS MHS L1 Product Generation Specification
[RD5] EPS MHS L1 Product Format Specification
[RD6] MetOp-AVHRR documentation at the VCS Engineering website (http://rst.vcs.de/)
[RD7] NASA-MODIS documentation at the NASA website (http://directreadout.gsfc.nasa.gov/index.cfm)
[RD8] Meteosat-SEVIRI documentation at the EUMETSAT website (http://www.eumetsat.int/Home/Main/What_We_Do/Satellites/Meteosat_Second_Generation/index.html=en)
[RD9] EUMETSAT-EUMETCast documentation at the EUMETSAT website (http://www.eumetsat.int/Home/Main/What_We_Do/EUMETCast/index.html=en)
[RD10] EPS NRT User Interface Specification. Ref.: EPS-ASPI-IR-0648
[RD11] National Oceanic and Atmospheric Administration (NOAA) Low-Rate Information Transmission (LRIT) System. Ref.: http://noaasis.noaa.gov/LRIT/
[RD12] WMO - WWWDM specifications. Ref.: http://www.wmo.ch/web/www/WDM/wdm.html
[RD13] MARS - Meteorological Archive and Retrieval System, MARS User Guide (for Data Retrieval), Revision 11, September 1995, Meteorological Bulletin B6.7/3, ECMWF, Reading, England. Ref.: http://www.wmo.ch/web/www/WDM/wdm.html
[RD14] MSG Level 1.5 Image Data Format Description. Ref.: ESA PSS-05-09 Issue 1 Rev. 1
[RD15] ECSS Standard on Space Engineering. Verification. Ref.: ECSS-E-10-02A
[RD16] ECSS Standard on Space Engineering. Testing. Ref.: ECSS-E-10-03A
[RD17] ECSS Standard on Space Engineering. Software - Part 1: Principles and requirements. Ref.: ECSS-E-40 Part 1B
[RD18] ECSS Standard on Space Engineering. Software - Part 2: Document requirements definitions (DRDs). Ref.: ECSS-E-40 Part 2B
[RD19] Object Management Group, Inc. (OMG), "Unified Modelling Language: Infrastructure", Version 2.0, 05/07/2005
[RD20] Object Management Group, Inc. (OMG), "Unified Modelling Language: Superstructure", Version 2.0, 05/07/2005
[RD21] Object Management Group, Inc. (OMG), "Unified Modelling Language: Diagram Interchange", Version 2.0, 05/07/2005
[RD22] Object Management Group, Inc. (OMG), "Unified Modelling Language: Business Modelling", Version 2.0, 05/07/2005
[RD23] Brown, Alan W. & Wallnau, Kurt C., "Engineering of Component-Based Systems", in Component-Based Software Engineering: Selected Papers from the Software Engineering Institute, Los Alamitos, CA: IEEE Computer Society Press, 1996
[RD24] Rumbaugh, James, Jacobson, Ivar & Booch, Grady, "The Unified Modelling Language Reference Manual" (2nd edition)
2 System Verification Overview
2.1 General
The basic principle of verification is that every system function has to be tested: on the basis of the requirements set, a number of Test Cases adequate to cover the whole set of functions has to be defined. Specifically, this document focuses on System and Subsystem Tests, covering all of the System Requirements as presented in the SRD ([AD6]). Each requirement is covered by at least one Test Case. The System Tests verify functionalities such as:
• Adequate operational chain productions;
• Correct copying of the products to their input directories;
• Correct storing of the products in the local and central archive;
• Correct reading of configuration files;
• Correct detection of errors;
• Correct detection of success in product ingestion;
• Completeness of information in log messages.
System/software reliability is an important characteristic, which provides an indication of the quality of the system itself. A fundamental role in increasing reliability is played by the requirements, which should be as comprehensive as possible: this allows drawing up an equally comprehensive Test Plan, which can ensure that all requirements are adequately tested. The following three activities help pursue a substantial increase of system/software reliability:
1. Error prevention;
2. Fault detection and removal;
3. Measurements to maximize reliability, specifically measures that support the first two activities.
2.2 The Approach
System verification is done, in general, by executing a set of Test Procedures. The execution of a Test Procedure consists of performing a sequence of operations, as defined in its specification. Requirements not testable by executing software or running Subsystem Components require verification methods other than Test Procedure execution, such as:
• Document inspection;
• Demonstration;
• Visual inspection.
In particular, when requirements to be tested involve software parts but the code execution is not an achievable test method, these requirements will be verified by inspecting the system documentation or the code itself.
Test Procedures have also been omitted for particularly simple cases, e.g. whenever they simply consist of a script execution, as specified in the Test Method field of the Test Case.
2.3 System Verification activities
The following preliminary tasks must be carried out before starting the tests:
• Installation of the pre-required software (e.g. required software libraries);
• Verification of the test environment. This includes verifying the hardware configuration, the versions of system software and data, and the personnel participating in the tests.
For each Test Procedure of each Test Case the following activities have to be completed:
• Carry out any preparatory task defined for the Test Procedure;
• Carry out the steps of the Test Procedure;
• Observe and note down the results of each step, filling in the Test Result form associated with the Test Procedure accordingly;
• Fill in the Test Report form with the TP and TC status.
At the end, a Test Results summary has to be compiled.
2.3.1 Test Procedure failures management
When the execution of a Test Procedure fails, the involved Subsystems have to be corrected and a new version released. If an error prevents the running of further tests (e.g. a critical error), a new component issue in which the error is corrected is released before the test activities can continue. If the failure is non-critical and does not prevent further testing, the test session may continue. In any case, the final version of the software must have neither critical nor non-critical errors left open. The Test Procedure executions shall be repeated for the new Component version. Regression testing continues until the software passes all tests. All procedures are executed for each version, unless execution is prevented by Test Case dependencies.
2.4 Test activities specification
2.4.1 Naming convention
Test Cases and Test Procedures have an identification code. The naming convention adopted in this document for these IDs is the following:
MM_ST_YYY_NNN
where:
MM : one of the following kinds of test:
TC: Test case
TP: Test Procedure
ST : System Test
YYY : one of the following functional areas:
GEN General
ACQ Data acquisition
PPR Pre-processing
PRG Product generation
VPR Hydrovalidation Processing
REP Reporting
ARC Archiving
DIS Distribution/Dissemination
VMQ Validation, Monitoring and Quality Control
NNN : progressive number, unique for each TC or TP
For example, TC_ST_GEN_001 is the “Test Case” number “001” relevant to the “System Test” phase, involving system requirements belonging to the “GENeral” functional area.
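As an illustration only, the convention can be checked mechanically; the following sketch (the script name check_id.sh and the use of grep are illustrative assumptions, not H-SAF deliverables) accepts or rejects an identifier:

#!/bin/sh
# check_id.sh - illustrative sketch: verify that a TC/TP identifier
# follows the MM_ST_YYY_NNN naming convention described above.
ID="$1"
if echo "$ID" | grep -Eq '^(TC|TP)_ST_(GEN|ACQ|PPR|PRG|VPR|REP|ARC|DIS|VMQ)_[0-9]{3}$'; then
    echo "$ID: valid identifier"
else
    echo "$ID: does not follow the naming convention" >&2
    exit 1
fi

For example, ./check_id.sh TC_ST_GEN_010 reports a valid identifier, while ./check_id.sh TC_GEN_010 is rejected.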
2.4.2 Test Case specification
The table below represents the template adopted for Test Case specification. An explanation of each item is given inside its related field.
Test Case identification
An identification code that follows the Naming Convention exposed in section 2.4.1
Test Case name
A name explicative of the object of the TC
H-SAF Subsystems
List of the involved H-SAF Subsystems (one, more than one or all of them)
Components specification
List of the involved H-SAF System Components, belonging to the above specified Subsystems
Requirements to be tested
List of the Component Requirements to be tested by this TC
Input
Description of input data or input actions to be executed to start the test case. It can be “None”
Output
Description of the output data, information or actions obtained as result of the test case activities
Test Method
Method used for the test case; it can be: Test, Inspection, Demonstration
Dependency with other tests
Set of other Test Cases to be finalized before starting the current test. It can be “None”, if the current Test Case results are independent of any other Test Case execution/finalization.
Table 1 Test Case specification template
2.4.3 Test Procedure specification
The table below represents the template adopted for Test Procedure specification. An explanation of each item is given inside its related field.
Test Procedure identification
An identification code that follows the Naming Convention exposed in section 2.4.1
Test description
Description of aims of the test
Test Case identification
Identification of the Test Case to which the Procedure is related
Test Procedure steps
Series of steps to be executed
Table 2 Test Procedure specification template
During the Test Procedure execution, a result is expected to be reported for each of its composing steps. In this document this report is presented as a separate form to be compiled; however, it is also possible to produce a single form containing the Test Procedure specification (shown above) and, beside each step, the obtained result.
2.5 Test results specification
2.5.1 Test Results form
For each Test Procedure, the results of its steps have to be noted down in the Test Results form during execution. Such a form can be incorporated into the Test Procedure specification, or kept as a separate form to be filled in during the execution. The following table represents a template for the Test Results form as a separate form:
Test Procedure identification
Test Procedure ID
Test Cases and Components specification
List of the involved Test Cases and relevant H-SAF System Components
Execution Date Author
Date of execution Author of the Procedure
TP Result
S • CWP • B •
Comments
Comments about the Procedure execution and its overall result
Test Procedure Steps Result
Step Number Expected Result Comments
1 Expected result after the relevant step’s execution
Comments about this step’s result; criticality and analysis of the eventual failure of the step.
2
3
Table 3 Test Result form
The overall result of the Test Procedure can be:
• “Success” (“S”), if this has been the result for each of its steps;
• “Completed with problems” (“CWP”), if at least one of its steps has reported this same result and none of them has had the result “Blocked”;
• “Blocked” (“B”), if at least one of its steps has reported this same result.
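The aggregation rules above can be expressed compactly; the following sketch (a hypothetical helper, not an H-SAF tool) derives the overall Test Procedure result from the per-step results passed as arguments:

#!/bin/sh
# tp_result.sh - illustrative sketch: compute the overall TP result
# from the per-step results (S, CWP or B) given on the command line.
overall="S"
for step in "$@"; do
    case "$step" in
        B)   overall="B"; break ;;                     # one Blocked step blocks the TP
        CWP) [ "$overall" = "S" ] && overall="CWP" ;;  # downgrade Success to CWP
        S)   ;;                                        # Success: no change
        *)   echo "unknown step result: $step" >&2; exit 1 ;;
    esac
done
echo "Overall TP result: $overall"

For example, ./tp_result.sh S CWP S yields CWP, while ./tp_result.sh S B CWP yields B.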
2.5.2 Test Report form
This form is used to complete the result reporting, giving a specification of the configuration of the test environment, i.e. the hardware and software (third-party SW or implemented SW), with the name and version/release of each of these items. It has to be filled in during test execution. The following table shows a template for the Test Report:
Test Report identification
Test Report ID
TP Result
S • CWP • B •
TC’s Result
TC-ST-YYY-NNN S • CWP • B •
TC-ST-YYY-NNN S • CWP • B •
TC-ST-YYY-NNN S • CWP • B •
Components specification
List of the involved H-SAF System Components
Execution Date Author
Date of execution Author of the procedure
Comments
Comments about the Procedure results and their relation with the Test Environment detailed below
Test Environment Description
HW Item O.S. Name/Version SW Name/Version 3rd SW Name/Version
HW platform name Operating System specification
Implemented SW specification
3rd party SW specification
Table 4 Test Report form
3 H-SAF System Components test
3.1 General Requirements test
3.1.1 Test Case TC_ST_GEN_010
Test Case identification
TC_ST_GEN_010
Test Case name
Identify error conditions
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-00030-FUN-GEN
SR-00040-FUN-GEN
SR-00050-FUN-GEN
Input
Any data that results in an erroneous condition
Output
Identification of unique error code traced on system log
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.1.2 Test Procedure TP_ST_GEN_010
Test Procedure identification
TP_ST_GEN_010
Test description
Identify error conditions
Test Case identification
TC_ST_GEN_010
Test Procedure steps
1. For each class of messages (see Table 5), verify that the code is traced on the system log using a shell script (e.g. list_log.sh)
2. The codes traced on the system log have to be unique
The possible classes of messages are here listed:
Class Description of the notification message
INFO Non-threatening information to the user. E.g.: a simple splash screen to notify an alert
WARNING Potential dangerous operation in advance. E.g.: "Warning: this operation will erase your data."
ERROR Critical situation. E.g.: "There is not enough memory to install the application."
ALARM Notification about something in progress or to be done. E.g.: "Staff meeting in five minutes."
CONFIRMATION Non-threatening information to the user. E.g.: "The file has been saved."
Table 5 Classes of messages
WARNING and ERROR classes identify conditions that have to be handled.
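The script list_log.sh referenced in step 1 is not specified further in this document; a minimal sketch of what it could do, assuming a plain-text system log whose entries carry the class tag in the second field and the code in the third (the log path and line format are assumptions made for the example), is:

#!/bin/sh
# list_log.sh - illustrative sketch, assuming log lines of the form
#   <timestamp> <CLASS> <CODE> <message text>
# Lists the distinct codes traced for each class of messages, so that
# the uniqueness required by step 2 can be reviewed.
LOG=${1:-/var/log/hsaf/system.log}     # assumed log location
for class in INFO WARNING ERROR ALARM CONFIRMATION; do
    echo "== $class =="
    awk -v c="$class" '$2 == c { print $3 }' "$LOG" | sort -u
done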
3.1.3 Test Case TC_ST_GEN_020
Test Case identification
TC_ST_GEN_020
Test Case name
Use of UTC time reference
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
All relevant components
Requirements to be tested
SR-00060-FUN-GEN
Input
None
Output
Verification of the use of the UTC time reference
Test Method
Inspection
Dependency with other tests
TBD_SVERF_03
3.1.4 Test Case TC_ST_GEN_030
Test Case identification
TC_ST_GEN_030
Test Case name
Getting of satellite data
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Component specification
PR-AP, SM-GC1-GPG, SM-GC2-VPG, SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-00070-FUN-GEN
Input
None
Output
Getting of data reported in table 6 of SRD
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.1.4.1 Test Procedure TP_ST_GEN_030
Test Procedure identification
TP_ST_GEN_030
Test description
Getting of data reported in table 6 of SRD
Test Case identification
TC_ST_GEN_030
Test Procedure steps
1. Build up directories according to the type of data reported in table 6 of SRD Document ([AD6])
2. For each data type of Table 6, verify that the data are received in the right directory
3.1.5 Test Case TC_ST_GEN_040
Test Case identification
TC_ST_GEN_040
Test Case name
Geographical outbound
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters, Hydro Validation
Components specification
PR-GC1, PR-GC2, SM-GC1, SM-GC2, SP-GC1, SP-GC2, HV-VPR
Requirements to be tested
SR-00020-FUN-GEN, SR-00021-FUN-GEN
Input
Raw data in input directory
Output
The geographical region of product applicability is Europe (including Turkey) and North Africa
Test Method
TBD_SVERF_02
Dependency with other tests
TBD
3.1.5.1 Test Procedure TP_ST_GEN_040
Test Procedure identification
TP_ST_GEN_040
Test description
Check products geographical region is Europe (including Turkey) and North Africa
Test Case identification
TC_ST_GEN_040
Test Procedure steps
1. Get products from processing chains
2. Verify if each product has Europe (including Turkey) and North Africa as its geographical region
3.1.6 Test Case TC_ST_GEN_050
Test Case identification
TC_ST_GEN_050
Test Case name
Handling of erroneous data
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-AP, SM-GC1, SM-GC2, SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-10020-FUN-ACQ
Input
Erroneous data in input directory
Output
Data are deleted from input chain
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.1.6.1 Test Procedure TP_ST_GEN_050
Test Procedure identification
TP_ST_GEN_050
Test description
Deleting of erroneous data from acquisition chain
Test Case identification
TC_ST_GEN_050
Test Procedure steps
1. Start the process in charge of data acquisition
2. The process detects an error in the input data
3. The erroneous data are deleted from the input directory by the acquisition process
4. The error is traced on system log by the acquisition process
5. The acquisition chain continues to work
3.2 Data Acquisition Requirements test
3.2.1 Test Case TC_ST_ACQ_020
Test Case identification
TC_ST_ACQ_020
Test Case name
Local archiving of external satellite data and accessibility to the H-SAF Central Archive.
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Component specification
PR-AP, SM-GC1-GPG, SM-GC2-VPG, SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-10010-FUN-ACQ
Input
Raw data and low level products
Output
Archiving of data reported in table 6 of SRD ([AD6]) and their availability to the H-SAF Central Archive
Test Method
Test
Dependency with other tests
TBD_SVERF_03
3.2.1.1 Test Procedure TP_ST_ACQ_020
Test Procedure identification
TP_ST_ACQ_020
Test description
Archiving of data reported in table 6 of SRD ([AD6])
Test Case identification
TC_ST_ACQ_020
Test Procedure steps
1. Verify if data reported in table 6 of SRD are received
2. Start the archiving process
3. Start TBD_SVERF_04.sh, a shell (or Perl) script that checks whether the input data have been archived, e.g. by producing a diff file
4. Start TBD_SVERF_08.sh, a shell (or Perl) script that tests the connection between the H-SAF Central Archive and the Generation chains' local archives (illustrative sketches of both placeholder scripts are given after the steps)
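Both scripts are placeholders (TBD); the sketches below illustrate what they might contain. All directory paths and the Central Archive host name are assumptions made for the example:

#!/bin/sh
# TBD_SVERF_04.sh - illustrative sketch: compare the input directory
# with the local archive and write the differences to a diff file.
INPUT_DIR=${1:-/hsaf/input}        # assumed location of incoming data
ARCHIVE_DIR=${2:-/hsaf/archive}    # assumed location of the local archive
ls "$INPUT_DIR"   | sort > /tmp/input_list.txt
ls "$ARCHIVE_DIR" | sort > /tmp/archive_list.txt
diff /tmp/input_list.txt /tmp/archive_list.txt > /tmp/archive_check.diff
if [ -s /tmp/archive_check.diff ]; then
    echo "Some input data are not archived, see /tmp/archive_check.diff" >&2
    exit 1
fi
echo "All input data are archived"

#!/bin/sh
# TBD_SVERF_08.sh - illustrative sketch: test the connection between
# the H-SAF Central Archive and a Generation chain's local archive.
CENTRAL_ARCHIVE=${1:-central-archive.example}   # assumed host name
if ping -c 3 "$CENTRAL_ARCHIVE" > /dev/null 2>&1; then
    echo "Central Archive $CENTRAL_ARCHIVE is reachable"
else
    echo "Cannot reach Central Archive $CENTRAL_ARCHIVE" >&2
    exit 1
fi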
3.2.2 Test Case TC_ST_ACQ_030
Test Case identification
TC_ST_ACQ_030
Test Case name
Handling of erroneous data
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters, Hydro Validation
Component specification
PR-AP, SM-GC1-GPG, SM-GC2-VPG, SP-GC1-AP, SP-GC2-AP, HV-AP
Requirements to be tested
SR-10020-FUN-ACQ
Input
Erroneous data in input directory
Output
Data are deleted from input chain
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.2.1 Test Procedure TP_ST_ACQ_030
Test Procedure identification
TP_ST_ACQ_030
Test description
Deleting of erroneous data from acquisition chain
Test Case identification
TC_ST_ACQ_030
Test Procedure steps
1. Start the process in charge of data acquisition
2. The process detects an error in the input data
3. The erroneous data are deleted from the input directory by the acquisition process
4. The error is traced on system log by the acquisition process
5. The acquisition chain continues to work
3.2.3 Test Case TC_ST_ACQ_040
Test Case identification
TC_ST_ACQ_040
Test Case name
Satellite data identification
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Component specification
PR-AP, SM-GC1-GPG, SM-GC2-VPG, SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-10030-FUN-GEN
Input
None
Output
Identification of satellite data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.3.1 Test Procedure TP_ST_ACQ_040
Test Procedure identification
TP_ST_ACQ_040
Test description
Identification of satellite information
Test Case identification
TC_ST_ACQ_040
Test Procedure steps
1. There are data in input directory that have to be archived
2. Delete all data in the archive
3. Start the process in charge of data acquisition
4. The archive process is successfully terminated
5. Start TBD_SVERF_05.sh (a sketch of this placeholder script is given after the steps)
6. The process prints on screen information about the data in the archive
7. Satellite identifier is printed
8. Satellite coordinates are printed
9. Satellite time information is printed
10. Channel availability is printed
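TBD_SVERF_05.sh is a placeholder; the sketch below shows the kind of output steps 6-10 expect. The metadata reader "hsaf_meta" is a hypothetical helper introduced only for the example, as is the archive location:

#!/bin/sh
# TBD_SVERF_05.sh - illustrative sketch: print, for each file in the
# archive, the information required by steps 7-10. The "hsaf_meta"
# reader is hypothetical, not an existing H-SAF tool.
ARCHIVE_DIR=${1:-/hsaf/archive}        # assumed archive location
for f in "$ARCHIVE_DIR"/*; do
    echo "== $f =="
    hsaf_meta --satellite-id "$f"      # step 7: satellite identifier
    hsaf_meta --coordinates  "$f"      # step 8: satellite coordinates
    hsaf_meta --time         "$f"      # step 9: satellite time information
    hsaf_meta --channels     "$f"      # step 10: channel availability
done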
3.2.4 Test Case TC_ST_ACQ_050
Test Case identification
TC_ST_ACQ_050
Test Case name
Identification of auxiliary data
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-GC1-PG1, PR-GC1-PG2, PR-GC1-PG3, PR-GC1-PG4, PR-GC2-PG, SM-GC1-GPG, SM-GC1-RPG, SM-GC2-VPG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-10090-FUN-ACQ
Input
None
Output
Auxiliary data source identification
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.4.1 Test Procedure TP_ST_ACQ_050
Test Procedure identification
TP_ST_ACQ_050
Test description
Identification of auxiliary data source
Test Case identification
TC_ST_ACQ_050
Test Procedure steps
1. Create the input directory into which files containing auxiliary data are copied
2. Verify that auxiliary data are copied into that directory
3.2.5 Test Case TC_ST_ACQ_060
Test Case identification
TC_ST_ACQ_060
Test Case name
Identification of time information of time-related auxiliary data
H-SAF Subsystems
Precipitation, Snow Parameters, Soil Moisture
Components specification
PR-GC1-PG1, PR-GC1-PG2, PR-GC1-PG3, PR-GC1-PG4, PR-GC2-PG, SM-GC1-GPG, SM-GC1-RPG, SM-GC2-VPG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-10100-FUN-ACQ
Input
There are auxiliary data in input directory
Output
Time information of time-related auxiliary data is printed
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.5.1 Test Procedure TP_ST_ACQ_060
Test Procedure identification
TP_ST_ACQ_060
Test description
Time information of time-related auxiliary data
Test Case identification
TC_ST_ACQ_060
Test Procedure steps
1. Verify that auxiliary data are in the input directory
2. Start TBD_SVERF_06.sh with the filename of the auxiliary data as argument (a sketch of this placeholder script is given after the steps)
3. Verify that the script prints on screen the time information of the file
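TBD_SVERF_06.sh is also a placeholder; as a minimal sketch, and assuming the time information can be derived either from the file modification time or from a YYYYMMDDhhmm timestamp embedded in the filename (both assumptions made for the example), it could look like this:

#!/bin/sh
# TBD_SVERF_06.sh - illustrative sketch: print the time information of
# a time-related auxiliary data file passed as argument.
FILE="$1"
[ -f "$FILE" ] || { echo "no such file: $FILE" >&2; exit 1; }
echo "File modification time:"
ls -l "$FILE" | awk '{ print $6, $7, $8 }'
echo "Timestamp embedded in the filename (if any):"
basename "$FILE" | grep -Eo '[0-9]{12}' || echo "none found"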
3.2.6 Test Case TC_ST_ACQ_070
Test Case identification
TC_ST_ACQ_070
Test Case name
Managing acquisition failure
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters, Hydro Validation
Components specification
All relevant components
Requirements to be tested
SR-10110-FUN-ACQ
SR-10120-FUN-ACQ
Input
Failure in acquisition of mandatory data
Output
The subsystems continue to work and the failure is traced on the system log
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.6.1 Test Procedure TP_ST_ACQ_070
Test Procedure identification
TP_ST_ACQ_070
Test description
Managing acquisition failure
Test Case identification
TC_ST_ACQ_070
Test Procedure steps
1. Verify that mandatory erroneous data are copied into the input directory
2. Start the operational chain
3. Verify that the data are deleted from the input directory
4. Verify that the operational chain continues to work
5. Verify that the error is traced on the system log by launching list_log.sh
3.2.7 Test Case TC_ST_ACQ_080
Test Case identification
TC_ST_ACQ_080
Test Case name
Acquisition of synoptic observation data via WMO-GTS
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-GC1-PG1, PR-GC1-PG2, PR-GC1-PG3, PR-GC1-PG4, PR-GC2-PG, SM-GC1-GPG, SM-GC1-RPG, SM-GC2-VPG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-10130-FUN-ACQ
Input
Synoptic observation data in input directory
Output
Pre-processing and archiving of synoptic observation data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.7.1 Test Procedure TP_ST_ACQ_080
Test Procedure identification
TP_ST_ACQ_080
Test description
Acquisition of synoptic observation data via WMO-GTS
Test Case identification
TC_ST_ACQ_080
Test Procedure steps
1. Verify that synoptic observation data are copied into the WMO-GTS input directory
2. Start the operational chain
3. Verify that the data are deleted from the WMO-GTS input directory
3.2.8 Test Case TC_ST_ACQ_090
Test Case identification
TC_ST_ACQ_090
Test Case name
Acquisition of satellite data for Precipitation productions
H-SAF Subsystems
Precipitation
Components specification
PR-AP
Requirements to be tested
SR-12010-FUN-ACQ
SR-12020-FUN-ACQ
SR-12021-FUN-ACQ
SR-12030-FUN-ACQ
SR-12031-FUN-ACQ
SR-12040-FUN-ACQ
SR-12041-FUN-ACQ
SR-12050-FUN-ACQ
SR-12051-FUN-ACQ
SR-12060-FUN-ACQ
SR-12061-FUN-ACQ
SR-12070-FUN-ACQ
SR-12071-FUN-ACQ
SR-12080-FUN-ACQ
SR-12081-FUN-ACQ
SR-12100-FUN-ACQ
SR-12110-FUN-ACQ
SR-12111-FUN-ACQ
SR-12120-FUN-ACQ
SR-12130-FUN-ACQ
SR-12140-FUN-ACQ
SR-12150-FUN-ACQ
SR-12160-FUN-ACQ
Input
Satellite data for Precipitation productions specified in table 4 of SRD ([AD6]) shall be available
Output
Pre-processing and archiving of satellite data for Precipitation productions
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.8.1 Test Procedure TP_ST_ACQ_090
Test Procedure identification
TP_ST_ACQ_090
Test description
Acquisition of satellite data for Precipitation products as specified in table 4 of SRD ([AD6])
Test Case identification
TC_ST_ACQ_090
Test Procedure steps
1. Start the operational chain
2. The operational chain makes a copy of the satellite data on the local archive
3. Verify that the satellite data are archived by launching a shell script
3.2.9 Test Case TC_ST_ACQ_100
Test Case identification
TC_ST_ACQ_100
Test Case name
Failure in direct acquisition
H-SAF Subsystems
Precipitation
Components specification
PR-AP
Requirements to be tested
SR-12220-FUN-ACQ
Input
Failure in direct acquisition of NOAA-17, NOAA-18, MetOp, Meteosat-9 data
Output
Direct acquisition chains of NOAA-17, NOAA-18, MetOp-1 and Meteosat-9 are replaced with NRT acquisition via EUMETCast
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.9.1 Test Procedure TP_ST_ACQ_100
Test Procedure identification
TP_ST_ACQ_100
Test description
Failure in direct acquisition
Test Case identification
TC_ST_ACQ_100
Test Procedure steps
1. Start the operational chain
2. An error occurs, so a SW item returns an error code
3. The failure in the direct acquisition chain is traced on the system log
4. The operational chain replaces direct acquisition with NRT acquisition via EUMETCast
5. The operational chain traces the change of acquisition mode on the system log
3.2.10 Test Case TC_ST_ACQ_110
Test Case identification
TC_ST_ACQ_110
Test Case name
Acquisition of auxiliary data
H-SAF Subsystems
Precipitation
Components specification
PR-GC1
Requirements to be tested
SR-12240-FUN-ACQ
SR-12250-FUN-ACQ
SR-12260-FUN-ACQ
SR-12270-FUN-ACQ
SR-12280-FUN-ACQ
Input
Auxiliary data are available on a dedicated directory
Output
Use of auxiliary data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.10.1 Test Procedure TP_ST_ACQ_110
Test Procedure identification
TP_ST_ACQ_110
Test description
Acquisition of auxiliary data
Test Case identification
TC_ST_ACQ_110
Test Procedure steps
1. Start the operational chain
2. The operational chain checks if auxiliary data are available on a dedicated directory. The auxiliary data are:
a. lightning
b. VIS/IR/MW image
c. meteorological analysis
d. rain gauge networks
e. rain data images
3. The operational chain makes a copy of auxiliary data on local archive if required
4. The operational chain uses auxiliary data for data production
5. The operational chain deletes auxiliary data after use
3.2.11 Test Case TC_ST_ACQ_120
Test Case identification
TC_ST_ACQ_120
Test Case name
Acquisition of boundary conditions data as input to data assimilation
H-SAF Subsystems
Precipitation
Components specification
PR-GC2
Requirements to be tested
SR-12300-FUN-ACQ
Input
Boundary conditions are available on a dedicated directory
Output
Use of boundary conditions data as input to data assimilation
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.11.1 Test Procedure TP_ST_ACQ_120
Test Procedure identification
TP_ST_ACQ_120
Test description
Acquisition of boundary conditions data as input to data assimilation
Test Case identification
TC_ST_ACQ_120
Test Procedure steps
1. Start the operational chain
2. The operational chain checks if boundary conditions data are available on a dedicated directory.
3. The operational chain makes a copy of data on local archive if required
4. The operational chain uses boundary conditions data as input to data assimilation
5. The operational chain deletes boundary conditions data after use
3.2.12 Test Case TC_ST_ACQ_130
Test Case identification
TC_ST_ACQ_130
Test Case name
Acquisition of level II processed MetOp ASCAT data at 25 km resolution (SM-OBS-1) and relevant Level I generated by EUMETSAT-PPF through NRT interface via EUMETCast
H-SAF Subsystems
Soil Moisture
Components specification
SM-GC1
Requirements to be tested
SR-11010-FUN-ACQ
Input
Level II processed MetOp ASCAT data at 25 km resolution (SM-OBS-1) and relevant Level I generated by EUMETSAT-PPF are available
Output
Pre-processing and archiving of data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.12.1 Test Procedure TP_ST_ACQ_130
Test Procedure identification
TP_ST_ACQ_130
Test description
Acquisition of level II processed MetOp ASCAT data at 25 km resolution (SM-OBS-1) and relevant level I generated by EUMETSAT-PPF through NRT interface via EUMETCast
Test Case identification
TC_ST_ACQ_130
Test Procedure steps
1. Start the operational chain
2. The operational chain checks if data are available through NRT interface via EUMETCast
3. The operational chain makes a copy of data on local archive
4. Verify that data are archived by launching a shell script
3.2.13 Test Case TC_ST_ACQ_140
Test Case identification
TC_ST_ACQ_140
Test Case name
Access to European parameter database and modelled root zone soil moisture (first guess)
H-SAF Subsystems
Soil Moisture
Component specification
SM-GC1, SM-GC2
Requirements to be tested
SR-11030-FUN-ACQ
SR-11040-FUN-ACQ
Input
European parameter database and modelled root zone soil moisture (first guess) are available on dedicated directory/repository
Output
Use of European parameter database and modelled root zone soil moisture (first guess)
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.13.1 Test Procedure TP_ST_ACQ_140
Test Procedure identification
TP_ST_ACQ_140
Test description
Access to European parameter database and modelled root zone soil moisture (first guess)
Test Case identification
TC_ST_ACQ_140
Test Procedure steps
1. Start the operational chain
2. Check if European parameter database and modelled root zone soil moisture (first guess) are available on a dedicated directory/repository. The involved data are:
a. land cover
b. operational snow cover maps
c. freeze/thaw cycles
d. water bodies
e. topography
f. first guess
3. The operational chain makes a copy of these data on the local archive if required
4. The operational chain uses these data as auxiliary data for data production
5. The operational chain deletes these data from the local directory after use
3.2.14 Test Case TC_ST_ACQ_150
Test Case identification
TC_ST_ACQ_150
Test Case name
Acquisition of satellite data for Snow Parameters productions
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-13310-FUN-ACQ
SR-13315-FUN-ACQ
SR-13020-FUN-ACQ
SR-13025-FUN-ACQ
SR-13030-FUN-ACQ
SR-13040-FUN-ACQ
SR-13050-FUN-ACQ
SR-13060-FUN-ACQ
SR-13070-FUN-ACQ
SR-13080-FUN-ACQ
SR-13090-FUN-ACQ
SR-13100-FUN-ACQ
Input
Satellite data for Snow Parameters production as specified in table 4 of SRD ([AD6]) shall be available
Output
Pre-processing and archiving of satellite data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.14.1 Test Procedure TP_ST_ACQ_150
Test Procedure identification
TP_ST_ACQ_150
Test description
Acquisition of satellite data specified in table 4 of SRD ([AD6])
Test Case identification
TC_ST_ACQ_150
Test Procedure steps
1. Start the operational chain
2. The operational chain makes a copy of satellite data on local archive
3. Verify that satellite data are archived by launching a shell script
3.2.15 Test Case TC_ST_ACQ_160
Test Case identification
TC_ST_ACQ_160
Test Case name
Acquisition of H-SAF product for models through ftp via Internet, EUMETCast and WMO-GTS
H-SAF Subsystems
Hydro Validation
Components specification
HV-AP
Requirements to be tested
SR-14010-FUN-ACQ
SR-14020-FUN-ACQ
SR-14030-FUN-ACQ
SR-14040-FUN-ACQ
Input
Products from Precipitation, Soil Moisture and Snow Parameters processing chains
Output
H-SAF products available on local directories
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.2.15.1 Test Procedure TP_ST_ACQ_160
Test Procedure identification
TP_ST_ACQ_160
Test description
Acquisition of H-SAF product for models through ftp via Internet, EUMETCast and WMO-GTS
Test Case identification
TC_ST_ACQ_160
Test Procedure steps
1. Open a browser for Internet access
2. Enter the URL ftp://TBD.com
3. Specify user and password
4. Get the H-SAF products
5. Check if H-SAF products are available through the WMO Information System/GTN-H
6. Get the data
7. Check if H-SAF products are available through U-MARF
8. Get the H-SAF products
9. Make sure that the transfer of the H-SAF products is successful
10. Verify that the H-SAF products are in the specified directories
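Steps 1-4 can equally be performed non-interactively; the sketch below uses wget (the server name is the TBD placeholder from step 2, while the remote and local paths are assumptions made for the example):

#!/bin/sh
# Illustrative sketch of steps 1-4: retrieve H-SAF products via ftp.
# The server name is the TBD placeholder from the procedure; user and
# password are supplied by the operator; paths are assumptions.
FTP_SERVER="TBD.com"
FTP_USER="$1"
FTP_PASS="$2"
wget --ftp-user="$FTP_USER" --ftp-password="$FTP_PASS" \
     -P /hsaf/products "ftp://$FTP_SERVER/products/*"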
3.3 Pre-processing Requirements test
3.3.1 Test Case TC_ST_PPR_010
Test Case identification
TC_ST_PPR_010
Test Case name
Preparing of satellite raw data for processing
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-AP, SM-GC1-GPG, SM-GC2-VPG, SP-GC1-AP, SP-GC2-AP
Requirements to be tested
SR-20010-FUN-PPR, SR-20020-FUN-PPR
Input
Satellite raw data
Output
Production of files suitable for processing
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.3.1.1 Test Procedure TP_ST_PPR_010
Test Procedure identification
TP_ST_PPR_010
Test description
Preparation of satellite raw data for processing
Test Case identification
TC_ST_PPR_010
Test Procedure steps
1. Start the operational chains
2. Satellite raw data are available on dedicated directories
3. The pre-processing processes get the raw data
4. The operational chains pre-process the raw data, obtaining data suitable for processing
5. The pre-processing processes make copies of the resulting data on dedicated directories
6. The pre-processing processes trace the pre-processing result on the system logs
3.4 Product Generation Requirements test
3.4.1 Test Case TC_ST_PRG_010
Test Case identification
TC_ST_PRG_010
Test Case name
Automatic generation of products
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-GC1-PG1, PR-GC1-PG2, PR-GC1-PG3, PR-GC1-PG4, PR-GC2-PG, SM-GC1-GPG, SM-GC1-RPG, SM-GC2-VPG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-00010-FUN-GEN, SR-32010-FUN-PRG, SR-32011-FUN-PRG, SR-32060-FUN-PRG, SR-32061-FUN-PRG, SR-32062-FUN-PRG, SR-32063-FUN-PRG, SR-31010-FUN-PRG, SR-31070-FUN-PRG, SR-33040-FUN-PRG, SR-33050-FUN-PRG, SR-33060-FUN-PRG, SR-33065-FUN-PRG, SR-33200-FUN-PRG, SR-33210-FUN-PRG, SR-33220-FUN-PRG
Input
Raw data in input directory
Output
Processed data in output directory
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.1.1 Test Procedure TP_ST_PRG_010
Test Procedure identification
TP_ST_PRG_010
Test description
Check if products are automatically generated by relevant operational chains
Test Case identification
TC_ST_PRG_010
Test Procedure steps
1. Put raw satellite data in the input directory of the precipitation and snow parameters operational chains, and the products coming from the EUMETSAT PPF in the input directory of the soil moisture operational chain
2. Run all the processes of the precipitation, soil moisture and snow parameters operational chains in the following order: TBD
3. Check if the process results of the operational chains are successful
4. Check if the following products are in the relevant output directories (a check sketch is given below): PR-OBS-1, PR-OBS-2, PR-OBS-3, PR-OBS-4, PR-OBS-5, PR-ASS-1, SM-OBS-1 (or SM-OBS-1 backup), SM-OBS-2, SM-ASS-1, SN-OBS-1a, SN-OBS-2, SN-OBS-3a, SN-OBS-4a, SN-OBS-1b, SN-OBS-3b, SN-OBS-4b
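A minimal sketch of the presence check in step 4 is given below, assuming each product file name begins with the product identifier; the output directory and naming convention are hypothetical placeholders.

    #!/bin/sh
    # Sketch: check that each expected product appears in its output directory.
    # OUT_DIR and the file-name pattern are hypothetical placeholders.
    OUT_DIR=/data/hsaf/output
    for p in PR-OBS-1 PR-OBS-2 PR-OBS-3 PR-OBS-4 PR-OBS-5 PR-ASS-1 \
             SM-OBS-1 SM-OBS-2 SM-ASS-1 \
             SN-OBS-1a SN-OBS-2 SN-OBS-3a SN-OBS-4a SN-OBS-1b SN-OBS-3b SN-OBS-4b; do
        if ls "$OUT_DIR"/"$p"* >/dev/null 2>&1; then
            echo "OK:      $p"
        else
            echo "MISSING: $p"
        fi
    done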
3.4.2 Test Case TC_ST_PRG_020
Test Case identification
TC_ST_PRG_020
Test Case name
Archiving data produced by operational chains
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-GC1, PR-GC2, SM-GC1-PG, SM-GC2-PG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-30010-FUN-GEN
Input
Data produced by operational chains
Output
Local archiving of data and sending of data to the central archive
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.2.1 Test Procedure TP_ST_PRG_020
Test Procedure identification
TP_ST_PRG_020
Test description
Archiving data produced by operational chains
Test Case identification
TC_ST_PRG_020
Test Procedure steps
1. Start the operational chains
2. Verify if data are correctly produced by each chain
3. Verify if data are on a temporary directory
4. Start the archiving process
5. Start TBD_SVERF_04.sh, a shell (or Perl) script that checks whether the input data have been archived, for example by producing a diff file (a sketch is given below)
6. Verify that data are transmitted to the central archive by means of the information traced on the system log
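The script TBD_SVERF_04.sh is still to be defined (see section 5); the following is a minimal sketch of the diff-based comparison the step describes, under the assumption that archiving preserves file names and contents. All paths are hypothetical placeholders.

    #!/bin/sh
    # TBD_SVERF_04.sh (sketch): compare the input data against the archive and
    # write the differences to a report file. Paths are hypothetical placeholders.
    IN_DIR=/data/hsaf/input
    ARC_DIR=/data/hsaf/archive
    REPORT=/tmp/tbd_sverf_04.diff
    # -r: recurse into subdirectories, -q: only report which files differ or are missing.
    diff -rq "$IN_DIR" "$ARC_DIR" > "$REPORT"
    if [ -s "$REPORT" ]; then
        echo "Differences found, see $REPORT"
        exit 1
    fi
    echo "All input data archived"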
3.4.3 Test Case TC_ST_PRG_030
Test Case identification
TC_ST_PRG_030
Test Case name
PR-OBS-1 production
H-SAF Subsystems
Precipitation
Components specification
PR-GC1-PG1, PR-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-32010-FUN-PRG, SR-32020-FUN-PRG, SR-32030-FUN-PRG, SR-32040-FUN-PRG
Input
Satellite data
Output
PR-OBS-1
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.3.1 Test Procedure TP_ST_PRG_030
Test Procedure identification
TP_ST_PRG_030
Test description
Production of PR-OBS-1
Test Case identification
TC_ST_PRG_030
Test Procedure steps
1. Start the operational chain
2. Satellite data are available on a dedicated directory.
3. The product generation process gets the raw data
4. The operational chain pre-processes the satellite data and processes the pre-processed data to obtain PR-OBS-1
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of the PR-OBS-1 product shall be 10 km (with MIS) or 15 km (with additional GPM)
• accuracy of the PR-OBS-1 product shall be 10-20% (for a rate > 10 mm/h), 20-40% (for a rate from 1 to 10 mm/h), 40-80% (for a rate < 1 mm/h)
• frequency of the PR-OBS-1 product shall be 9 h (with MIS) or 3 h (with full GPM)
• quality flags associated to the product
8. The operational chain receives the Hydrovalidation report from the central archive
3.4.4 Test Case TC_ST_PRG_040
Test Case identification
TC_ST_PRG_040
Test Case name
PR-OBS-2 production
H-SAF Subsystems
Precipitation
Component specification
PR-GC1-PG2, PR-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-32011-FUN-PRG, SR-32021-FUN-PRG, SR-32031-FUN-PRG, SR-32041-FUN-PRG
Input
Satellite data
Output
PR-OBS-2
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.4.1 Test Procedure TP_ST_PRG_040
Test Procedure identification
TP_ST_PRG_040
Test description
Production of PR-OBS-2
Test Case identification
TC_ST_PRG_040
Test Procedure steps
1. Start the operational chain
2. Satellite data are available on a dedicated directory.
3. The product generation process gets the raw data
4. The operational chain pre-processes the satellite data and processes the pre-processed data to obtain PR-OBS-2
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of the PR-OBS-2 product shall be 10 km (with MIS) or 15 km (with other GPM)
• accuracy of the PR-OBS-2 product shall be 10-20% (for a rate > 10 mm/h), 20-40% (for a rate from 1 to 10 mm/h), 40-80% (for a rate < 1 mm/h)
• frequency of the PR-OBS-2 product shall be 9 h (with MIS) or 3 h (with full GPM)
• quality flags associated to the product
8. The operational chain receives the Hydrovalidation report from the central archive
3.4.5 Test Case TC_ST_PRG_050
Test Case identification
TC_ST_PRG_050
Test Case name
SM-OBS-1 and SM-OBS-2 production
H-SAF Subsystems
Soil Moisture
Components specification
SM-GC1-GPG, SM-GC1-RPG, SM-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-31010-FUN-PRG, SR-31020-FUN-PRG, SR-31030-FUN-PRG, SR-31040-FUN-PRG, SR-31060-FUN-PRG, SR-31061-FUN-PRG
Input
Level 1 ASCAT and Level 2 ASCAT products
Output
SM-OBS-1 backup and SM-OBS-2
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.5.1 Test Procedure TP_ST_PRG_050
Test Procedure identification
TP_ST_PRG_050
Test description
Production of SM-OBS-1 backup and SM-OBS-2
Test Case identification
TC_ST_PRG_050
Test Procedure steps
1. Start the operational chain
2. Level 1 ASCAT and Level 2 ASCAT products are available on dedicated directories
3. The product generation processes get these products
4. The operational chain processes these data, obtaining SM-OBS-1 backup and SM-OBS-2
5. The processing processes make copies of the resulting data on dedicated directories
6. The processing processes trace the successful operation on the system log
7. The following features have to be satisfied to have a suitable SM-OBS-1 backup product:
• resolution of the SM-OBS-1 product shall be 25 km (from ASCAT) or 50 km (from MIS)
• accuracy of the SM-OBS-1 product shall be 0.05 m3 m-3
• frequency of the SM-OBS-1 product shall be 36 h (from ASCAT) or 9 h (from MIS)
• quality flags associated to the product
8. The following features have to be satisfied to have a suitable SM-OBS-2 product:
• it shall be derived from SM-OBS-1 by re-sampling it at 1 km intervals
• correction of SM-OBS-2 for topography and projection to user-defined coordinate systems
• quality flags associated to the product
9. The operational chain receives the Hydrovalidation report from the central archive
3.4.6 Test Case TC_PRG_010
Test Case identification
TC_PRG_010
Test Case name
Archiving data produced by operational chains
H-SAF Subsystems
All
Components specification
PR-GC1, PR-GC2, SM-GC1-PG, SM-GC2-PG, SP-GC1-PG, SP-GC2-PG
Requirements to be tested
SR-30010-FUN-GEN
Input
Data produced by operational chains
Output
Local archiving of data and sending of data to central archive
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.7 Test Case TC_ST_PRG_060
Test Case identification
TC_ST_PRG_060
Test Case name
SM-ASS-1 production
H-SAF Subsystems
Soil Moisture
Component specification
SM-GC2-VPG, SM-GC2-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-31070-FUN-PRG, SR-31080-FUN-PRG, SR-31090-FUN-PRG, SR-31100-FUN-PRG, SR-31150-FUN-PRG
Input
Level 2 ASCAT product
Output
SM-ASS-1
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.7.1 Test Procedure TP_ST_PRG_060
Test Procedure identification
TP_ST_PRG_060
Test description
Production of SM-ASS-1
Test Case identification
TC_ST_PRG_060
Test Procedure steps
1. Start the operational chain
2. Level 2 ASCAT product is available on a dedicated directory
3. The product generation process gets this product
4. The operational chain processes this product to obtain SM-ASS-1
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of the SM-ASS-1 product shall be 25 km (from ASCAT) and 50 km (from MIS)
• accuracy of the SM-ASS-1 product shall be 0.05 m3 m-3
• frequency of the SM-ASS-1 product shall be 36 h (from ASCAT) and 9 h (from MIS)
• quality flags associated to the product
3.4.8 Test Case TC_ST_PRG_070
Test Case identification
TC_ST_PRG_070
Test Case name
Setting up of a DB for cal/val
H-SAF Subsystems
Soil Moisture
Components specification
SM-GC2-PG
Requirements to be tested
SR-31270-FUN-PRG
Input
Soil moisture profile
Output
Local archiving of the cal/val DB
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.8.1 Test Procedure TP_ST_PRG_070
Test Procedure identification
TP_ST_PRG_070
Test description
Setting up of a reference DB for cal/val, with elements for soil moisture consistency control as reported in table 8 in SRD ([AD6])
Test Case identification
TC_ST_PRG_070
Test Procedure steps
1. Set up a reference DB
2. Start the operational chains
3. The following data are available:
• In-situ data
• Related space-based soil moisture missions
• Global water datasets
• Global snow datasets
• Global topography datasets
• Global freeze/thaw datasets
4. Verify if the produced data are on a temporary directory
5. Start the archiving process
6. Start TBD_SVERF_04.sh, a shell (or Perl) script that checks whether the input data have been archived, for example by producing a diff file (see the sketch in 3.4.2.1)
3.4.9 Test Case TC_ST_PRG_080
Test Case identification
TC_ST_PRG_080
Test Case name
SN-OBS-1a production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-PG, SP-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33040-FUN-PRG, SR-33070-FUN-PRG, SR-33110-FUN-PRG, SR-33140-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-1a
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.9.1 Test Procedure TP_ST_PRG_080
Test Procedure identification
TP_ST_PRG_080
Test description
Production of SN-OBS-1a
Test Case identification
TC_ST_PRG_080
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-1a
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-1a is 2 km
• accuracy of SN-OBS-1a is 95%
• frequency of SN-OBS-1a is 6 h (depending on latitude)
• quality flags associated to the product
3.4.10 Test Case TC_ST_PRG_090
Test Case identification
TC_ST_PRG_090
Test Case name
SN-OBS-2 production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-PG, SP-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33050-FUN-PRG, SR-33051-FUN-PRG, SR-33120-FUN-PRG, SR-33150-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-2
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.10.1 Test Procedure TP_ST_PRG_090
Test Procedure identification
TP_ST_PRG_090
Test description
Production of SN-OBS-2
Test Case identification
TC_ST_PRG_090
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-2
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-2 is 10 km
• accuracy of SN-OBS-2 is 15% (depending on basin size and complexity)
• frequency of SN-OBS-2 is 6 h (depending on latitude)
• quality flags associated to the product
3.4.11 Test Case TC_ST_PRG_100
Test Case identification
TC_ST_PRG_100
Test Case name
SN-OBS-3a production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-PG, SP-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33060-FUN-PRG, SR-33090-FUN-PRG, SR-33125-FUN-PRG, SR-33160-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-3a
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.11.1 Test Procedure TP_ST_PRG_100
Test Procedure identification
TP_ST_PRG_100
Test description
Production of SN-OBS-3a
Test Case identification
TC_ST_PRG_100
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-3a
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-3a is 5 km
• accuracy of SN-OBS-3a is 15% (depending on basin size and complexity)
• frequency of SN-OBS-3a is 6 h (depending on latitude)
• quality flags associated to the product
3.4.12 Test Case TC_ST_PRG_110
Test Case identification
TC_ST_PRG_110
Test Case name
SN-OBS-4a production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-PG, SP-GC1-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33070-FUN-PRG, SR-33100-FUN-PRG, SR-33130-FUN-PRG, SR-33170-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-4a
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.12.1 Test Procedure TP_ST_PRG_110
Test Procedure identification
TP_ST_PRG_110
Test description
Production of SN-OBS-4a
Test Case identification
TC_ST_PRG_110
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-4a
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-4a is 10 km
• accuracy of SN-OBS-4a is about 20 mm
• frequency of SN-OBS-4a is 6 h (depending on latitude)
• quality flags associated to the product
3.4.13 Test Case TC_ST_PRG_120
Test Case identification
TC_ST_PRG_120
Test Case name
SN-OBS-1b production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC2-PG, SP-GC2-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33200-FUN-PRG, SR-33230-FUN-PRG, SR-33260-FUN-PRG, SR-33290-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-1b
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.13.1 Test Procedure TP_ST_PRG_120
Test Procedure identification
TP_ST_PRG_120
Test description
Production of SN-OBS-1b
Test Case identification
TC_ST_PRG_120
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-1b
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-1b is 2 km
• accuracy of SN-OBS-1b is 95%
• frequency of SN-OBS-1b is 6 h (depending on latitude)
• quality flags associated to the product
3.4.14 Test Case TC_ST_PRG_130
Test Case identification
TC_ST_PRG_130
Test Case name
SN-OBS-3b production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC2-PG, SP-GC2-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33210-FUN-PRG, SR-33240-FUN-PRG, SR-33270-FUN-PRG, SR-33300-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-3b
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.14.1 Test Procedure TP_ST_PRG_130
Test Procedure identification
TP_ST_PRG_130
Test description
Production of SN-OBS-3b
Test Case identification
TC_ST_PRG_130
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-3b
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-3b is 5 km
• accuracy of SN-OBS-3b is 15% (depending on basin size and complexity)
• frequency of SN-OBS-3b is 6 h (depending on latitude)
• quality flags associated to the product
3.4.15 Test Case TC_ST_PRG_140
Test Case identification
TC_ST_PRG_140
Test Case name
SN-OBS-4b production
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC2-PG, SP-GC2-QC
Requirements to be tested
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33220-FUN-PRG, SR-33250-FUN-PRG, SR-33280-FUN-PRG, SR-33320-FUN-PRG
Input
Raw satellite data
Output
SN-OBS-4b
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.15.1 Test Procedure TP_ST_PRG_140
Test Procedure identification
TP_ST_PRG_140
Test description
Production of SN-OBS-4b
Test Case identification
TC_ST_PRG_140
Test Procedure steps
1. Start the operational chain
2. Raw satellite data are available on a dedicated directory.
3. The product generation process gets the raw and auxiliary data
4. The operational chain processes the raw and auxiliary data to obtain SN-OBS-4b
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
7. The following features have to be satisfied to have a suitable product:
• resolution of SN-OBS-4b is 10 km
• accuracy of SN-OBS-4b is about 20 mm
• frequency of SN-OBS-4b is 6 h (depending on latitude)
• quality flags associated to the product
3.4.16 Test Case TC_ST_PRG_150
Test Case identification
TC_ST_PRG_150
Test Case name
Snow Parameters Finnish (“a” labelled) and Turkish (“b” labelled) products merging
H-SAF Subsystems
Snow Parameters
Components specification
SP-GC1-PG, SP-GC1-QC
Requirements to be tested
SR-33320-FUN-PRG, SR-33330-FUN-PRG, SR-33340-FUN-PRG
Input
SN-OBS-1a, SN-OBS-3a, SN-OBS-4a, SN-OBS-1b, SN-OBS-3b, SN-OBS-4b
Output
SN-OBS-1, SN-OBS-3, SN-OBS-4
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.16.1 Test Procedure TP_ST_PRG_150
Test Procedure identification
TP_ST_PRG_150
Test description
Production of combined products from the Finnish (“a” labelled) and Turkish (“b” labelled) products
Test Case identification
TC_ST_PRG_150
Test Procedure steps
1. Start the operational chain
2. Turkish (“b” labelled) products are available on the Finnish local archive.
3. The product generation process gets the “b” labelled products and the auxiliary data
4. The operational chain processes the “b” and “a” labelled products using auxiliary data, obtaining the combined products:
• SN-OBS-1
• SN-OBS-3
• SN-OBS-4
5. The processing process makes copies of the resulting data on a dedicated directory
6. The processing process traces the successful operation on the system log
3.4.17 Test Case TC_ST_PRG_160
Test Case identification
TC_ST_PRG_160
Test Case name
Processing H-SAF products
H-SAF Subsystems
Hydro Validation
Component specification
HV-AP
Requirements to be tested
SR-34010-FUN-VPR
Input
H-SAF products in GRIB ver. 1, BUFR ver. 3, HDF-5, ASCII format, JPEG or similar
Output
Hydro validation reports
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.4.18 Test Case TC_ST_PRG_170
Test Case identification
TC_ST_PRG_170
Test Case name
Hydro validation process timing
H-SAF Subsystems
Hydro Validation
Component specification
HV-AP
Requirements to be tested
SR-34020-FUN-VPR, SR-34030-FUN-VPR
Input
None
Output
The Hydro validation process is performed daily at a fixed time of the day, except for SN-OBS-4, which is performed on an orbital basis. See table 10 of SRD ([AD6]).
Dependency with other tests
TBD_SVERF_03
3.4.19 Test Case TC_ST_PRG_180
Test Case identification
TC_ST_PRG_180
Test Case name
Use of auxiliary data
H-SAF Subsystems
Hydro Validation
Component specification
HV-AP
Requirements to be tested
SR-34040-FUN-VPR
Input
Auxiliary data such as digital terrain models, land cover maps and river network data are available on a dedicated directory
Output
Use of auxiliary data and trace of them inside the report
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.19.1 Test Procedure TP_ST_PRG_180
Test Procedure identification
TP_ST_PRG_180
Test description
Use of auxiliary data
Test Case identification
TC_ST_PRG_180
Test Procedure steps
1. Start the HV-AP component
2. The HV-AP checks if auxiliary data are available on a dedicated directory. The auxiliary data shall be:
• Digital terrain models
• Land cover maps
• River network data
3. The HV-AP makes a copy of the auxiliary data on the local archive if required
4. The HV-AP uses the auxiliary data for report production
5. The HV-AP deletes the local copies of the auxiliary data after usage (a lifecycle sketch is given below)
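A minimal sketch of the copy-use-delete lifecycle in steps 3-5 is given below, using a trap so the local copies are removed even if report production fails; all paths and the report-production step are hypothetical placeholders.

    #!/bin/sh
    # Sketch: copy auxiliary data locally, use them, and guarantee cleanup.
    # Paths and the report-production command are hypothetical placeholders.
    AUX_DIR=/data/hsaf/aux
    WORK_DIR=$(mktemp -d)
    trap 'rm -rf "$WORK_DIR"' EXIT    # step 5: delete local copies after usage
    # Step 3: copy the auxiliary data (hypothetical file names).
    cp "$AUX_DIR"/dtm* "$AUX_DIR"/land_cover* "$AUX_DIR"/river_network* "$WORK_DIR"/
    # Step 4: produce the report from the local copies (placeholder command).
    echo "producing report from $WORK_DIR ..."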
3.4.20 Test Case TC_ST_PRG_190
Test Case identification
TC_ST_PRG_190
Test Case name
Input data re-processing in order to fit input variables to model resolution
H-SAF Subsystems
Hydro Validation
Component specification
HV-AP
Requirements to be tested
SR-34050-FUN-VPR
Input
Input data are available on a dedicated directory
Output
Input variables fitted to model resolution
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.20.1 Test Procedure TP_ST_PRG_190
Test Procedure identification
TP_ST_PRG_190
Test description
Input data re-processing in order to fit input variables to model resolution
Test Case identification
TC_ST_PRG_190
Test Procedure steps
1. Start the HV-AP component
2. The HV-AP checks if the input data are available on a dedicated directory.
3. The HV-AP uses the input data for pre-processing
4. The HV-AP produces a file containing the input variables fitted to the model resolution
5. The HV-AP makes a copy of the file on a dedicated directory
3.4.21 Test Case TC_ST_PRG_200
Test Case identification
TC_ST_PRG_200
Test Case name
Processing of input parameters with correct spatial and temporal resolution
H-SAF Subsystems
Hydro Validation
Component specification
HV-AP
Requirements to be tested
SR-34060-FUN-VPR
Input
Input parameters with the spatial and temporal resolution specified in table 11 of SRD ([AD6]) are available on a dedicated directory
Output
Processing of input parameters
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.4.21.1 Test Procedure TP_ST_PRG_200
Test Procedure identification
TP_ST_PRG_200
Test description
Processing of input parameters with correct spatial and temporal resolution as specified in table 11 of SRD ([AD6])
Test Case identification
TC_ST_PRG_200
Test Procedure steps
1. Start the HV-AP component
2. The HV-AP checks if the input parameters are available on a dedicated directory.
3. The HV-AP uses the input parameters for processing
4. The HV-AP produces hydrologic reports
3.5 Reporting Requirements test
3.5.1 Test Case TC_ST_REP_010
Test Case identification
TC_ST_REP_010
Test Case name
Production of a Hydro Validation structured report
H-SAF Subsystems
Hydro Validation
Components specification
HV-VPR
Requirements to be tested
SR-34110-FUN-REP
Input
Hydro Validation Reports coming from the Impact Study Components (HV-AP-IS1 ... HV-AP-IS7)
Output
Structured Hydro Validation report correctly formatted
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.5.2 Test Case TC_ST_REP_020
Test Case identification
TC_ST_REP_020
Test Case name
Hydrovalidation report generation
H-SAF Subsystems
Hydro Validation
Component specification
HV-VPR
Requirements to be tested
SR-34120-FUN-REP, SR-34130-FUN-REP, SR-34140-FUN-REP, SR-34180-FUN-REP
Input
Data from operational chains
Output
Generation of reports in an automatic way, having:
• payload information listed in table 12 of the SRD Document ([AD6])
• header information listed in table 13 of the SRD Document ([AD6])
• format listed in table 14 of the SRD Document ([AD6])
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.5.3 Test Case TC_ST_REP_030
Test Case identification
TC_ST_REP_030
Test Case name
Hydrovalidation report archiving, accessibility and distribution
H-SAF Subsystems
Hydro Validation
Component specification
HV-VPR
Requirements to be tested
SR-34150-FUN-REP, SR-34160-FUN-REP, SR-34170-FUN-REP
Input
Hydrovalidation reports
Output
Hydrovalidation reports archived on a specified directory and accessible via FTP and via HTTP. The same reports shall be distributed to Precipitation, Soil Moisture and Snow Parameters generation chains.
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.5.3.1 Test Procedure TP_ST_REP_030
Test Procedure identification
TP_ST_REP_030
Test description
Hydrovalidation report archiving, accessibility and distribution
Test Case identification
TC_ST_REP_030
Test Procedure steps
1. Start the HV-VPR component
2. The HV-VPR produces HV reports
3. HV reports shall be available on a dedicated directory and accessible via FTP and HTTP
4. HV reports shall be archived on a local repository
5. HV reports shall be distributed automatically to the Precipitation, Soil Moisture and Snow Parameters generation chains
6. Check the presence of the HV reports in the dedicated directories/repositories of the Precipitation, Soil Moisture and Snow Parameters generation chains
3.6 Archiving Requirements test
3.6.1 Test Case TC_ST_ARC_010
Test Case identification
TC_ST_ARC_010
Test Case name
Central archiving of H-SAF data coming from Precipitation, Soil Moisture, Snow Parameters production chains
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-44010-FUN-ARC, SR-44040-FUN-ARC
Input
Data produced by Precipitation, Soil Moisture, Snow Parameters production chains
Output
Central Archiving of data in an online/offline archive
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.1.1 Test Procedure TP_ST_ARC_010
Test Procedure identification
TP_ST_ARC_010
Test description
Archiving of data produced by H-SAF subsystem
Test Case identification
TC_ST_ARC_010
Test Procedure steps
1. Start the operational chains
2. Verify if data are correctly produced by each chain
3. Start the archiving process
4. Verify that data are transmitted to the central archive
3.6.2 Test Case TC_ST_ARC_020
Test Case identification
TC_ST_ARC_020
Test Case name
Logging of archiving-relevant data
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-44020-FUN-ARC
Input
Data produced by H-SAF
Output
Central archiving of relevant data in an online/offline archive is traced on the system log
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.2.1 Test Procedure TP_ST_ARC_020
Test Procedure identification
TP_ST_ARC_020
Test description
Logging of the archiving of data produced by H-SAF
Test Case identification
TC_ST_ARC_020
Test Procedure steps
1. Start the operational chains
2. Verify if data are correctly produced by each chain
3. Start the archiving process
4. Verify that the archiving of relevant data is traced on the system log (a sketch is given below)
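A minimal sketch of the log check in step 4 is given below. The log path and the archiving message format ("ARCHIVED <file>") are assumptions, since the actual log layout is not defined in this document.

    #!/bin/sh
    # Sketch: check that each archived file has a corresponding trace in the
    # system log. Log path and message format are hypothetical placeholders.
    LOG=/var/log/hsaf/system.log
    ARC_DIR=/data/hsaf/archive
    rc=0
    for f in "$ARC_DIR"/*; do
        [ -e "$f" ] || continue
        if ! grep -q "ARCHIVED $(basename "$f")" "$LOG"; then
            echo "NO LOG TRACE: $(basename "$f")"
            rc=1
        fi
    done
    exit $rc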
3.6.3 Test Case TC_ST_ARC_030
Test Case identification
TC_ST_ARC_030
Test Case name
Collection and archiving of performance report of users
H-SAF Subsystems
Offline Monitoring
Components specification
OM-CEN, OM-ARC
Requirements to be tested
SR-44050-FUN-ARC, SR-44060-FUN-ARC, SR-44070-FUN-ARC
Input
Performance report of online/offline dissemination to users
Output
Collection, archiving and analysis of performance report of users
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.3.1 Test Procedure TP_ST_ARC_030
Test Procedure identification
TP_ST_ARC_030
Test description
Collection and archiving of dissemination performance report
Test Case identification
TC_ST_ARC_030
Test Procedure steps
1. Start the dissemination process
2. Verify if data are correctly disseminated
3. Get the performance of the dissemination towards the users
4. Produce a report containing the results of the dissemination
5. Archive the produced report
6. Verify that the performance report contains:
a. Time of reception
b. Format check
c. Message soundness
7. Verify if the foreseen timeliness and reliability are respected
3.6.4 Test Case TC_ST_ARC_040
Test Case identification
TC_ST_ARC_040
Test Case name
Automatic transfer of data from the online to the offline archive
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-44080-FUN-ARC
Input
Data available on online archive
Output
Data are transferred to offline archive
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.4.1 Test Procedure TP_ST_ARC_040
Test Procedure identification
TP_ST_ARC_040
Test description
Automatic transfer of data from the online to the offline archive
Test Case identification
TC_ST_ARC_040
Test Procedure steps
1. Verify that there are data in the online archive that have to be transferred to the offline archive
2. Start the archiving process
3. Verify that the start and the end of the transfer to the offline archive are traced on the system log
4. Verify that the data are copied to the offline archive (a verification sketch is given below)
5. Verify that the data copied to the offline archive are deleted from the online archive
6. Verify that the operations are traced on the system log
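A minimal sketch of the copy verification in steps 4-5 is given below, comparing each online file byte-for-byte with its offline counterpart before the online copy may be deleted; both archive paths are hypothetical placeholders.

    #!/bin/sh
    # Sketch: verify that files moved offline are identical to the online
    # originals before the online copies are removed. Paths are hypothetical.
    ONLINE=/data/hsaf/online
    OFFLINE=/data/hsaf/offline
    for f in "$ONLINE"/*; do
        [ -e "$f" ] || continue
        name=$(basename "$f")
        if cmp -s "$f" "$OFFLINE/$name"; then
            echo "VERIFIED: $name (safe to delete online copy)"
        else
            echo "MISMATCH OR MISSING OFFLINE: $name"
        fi
    done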
3.6.5 Test Case TC_ST_ARC_050
Test Case identification
TC_ST_ARC_050
Test Case name
Restoring of data from offline to online archive
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-44090-FUN-ARC
Input
Request of data restoring
Output
Data are transferred to online archive
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.5.1 Test Procedure TP_ST_ARC_050
Test Procedure identification
TP_ST_ARC_050
Test description
Restoring of data from offline to online archive
Test Case identification
TC_ST_ARC_050
Test Procedure steps
1. A request to restore data from the offline to the online archive is received
2. Start the archiving process
3. The archiving process processes the received request
4. Verify that the start and the end of the restore from the offline archive are traced on the system log
5. Verify that the data are copied to the online archive
6. Verify that the data copied to the online archive are deleted from the offline archive
7. Verify that the operations are traced on the system log
3.6.6 Test Case TC_ST_ARC_060
Test Case identification
TC_ST_ARC_060
Test Case name
Duration of archiving-relevant data logging
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-44021-FUN-ARC
Input
Any process writing on the log
Output
Duration of archiving-relevant data logging is one month
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.6.6.1 Test Procedure TP_ST_ARC_060
Test Procedure identification
TP_ST_ARC_060
Test description
Duration of archiving-relevant data logging
Test Case identification
TC_ST_ARC_060
Test Procedure steps
1. Open the log file
2. If the archiving-relevant data logging is older than one month, archive the system log file and create a new system log file (a rotation sketch is given below)
3. Write in the log file
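A minimal sketch of the rotation in step 2 is given below, using a stamp file to remember the month of the last rotation so that archiving-relevant entries are kept for one month; the log and stamp paths are hypothetical placeholders.

    #!/bin/sh
    # Sketch: monthly rotation of the system log so that archiving-relevant
    # entries are kept for one month. Paths are hypothetical placeholders.
    LOG=/var/log/hsaf/system.log
    STAMP=/var/log/hsaf/.last_rotation    # remembers the month of the last rotation
    month=$(date +%Y%m)
    if [ ! -e "$STAMP" ] || [ "$(cat "$STAMP")" != "$month" ]; then
        [ -e "$LOG" ] && mv "$LOG" "$LOG.$month"   # archive the current log
        : > "$LOG"                                  # start a new, empty log file
        echo "$month" > "$STAMP"
    fi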
3.7 Dissemination Requirements test
3.7.1 Test Case TC_ST_DIS_010
Test Case identification
TC_ST_DIS_010
Test Case name
Receiving and sending data through UMARF
H-SAF Subsystems
Offline Monitoring
Components specification
OM-DIS
Requirements to be tested
SR-54010-FUN-DIS, SR-54020-FUN-DIS, SR-54030-FUN-DIS, SR-54040-FUN-DIS, SR-54050-FUN-DIS, SR-54060-FUN-DIS
Input
Orders from users via UMARF
Output
OM-DIS sends data (product, catalogue, order updated information) through UMARF
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.7.1.1 Test Procedure TP_ST_DIS_010
Test Procedure identification
TP_ST_DIS_010
Test description
Receiving and sending data through UMARF
Test Case identification
TC_ST_DIS_010
Test Procedure steps
1. The UMARF orders are available on the input directory
2. The dissemination process is alive
3. The dissemination process gets the UMARF orders from the input directory and validates them
4. The dissemination process processes each request and marks it as valid or not, putting it on a specific directory
5. The dissemination process manages the valid requests, disseminating products and catalogue to the users via UMARF
3.7.2 Test Case TC_ST_DIS_020
Test Case identification
TC_ST_DIS_020
Test Case name
Dissemination of products through WMO-GTS
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-DIS, SM-DIS, SP-DIS
Requirements to be tested
SR-54070-FUN-DIS, SR-54080-FUN-DIS
Input
Precipitation, Soil Moisture and Snow Parameters products
Output
Precipitation, soil moisture and snow parameters products are disseminated through WMO-GTS according to WMO standard encoding
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.3 Test Case TC_ST_DIS_030
Test Case identification
TC_ST_DIS_030
Test Case name
File format dissemination
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-DIS, SM-DIS, SP-DIS
Requirements to be tested
SR-54091-FUN-DIS
Input
Precipitation, Soil Moisture, and Snow Parameters products
Output
Dissemination of products to the HV subsystem with the following formats:
• GRIB ver. 1
• BUFR ver. 3
• HDF-5
• ASCII
• JPEG (or similar)
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.4 Test Case TC_ST_DIS_040
Test Case identification
TC_ST_DIS_040
Test Case name
Archiving of records for disseminated data
H-SAF Subsystems
Offline Monitoring
Components specification
OM-ARC
Requirements to be tested
SR-54120-FUN-DIS
Input
Record of disseminated data
Output
Archived record of disseminated data
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.7.4.1 Test Procedure TP_ST_DIS_040
Test Procedure identification
TP_ST_DIS_040
Test description
Archiving of records for disseminated data
Test Case identification
TC_ST_DIS_040
Test Procedure steps
1. Verify if records of disseminated data are available
2. Start the archiving process
3. Start TBD_SVERF_04.sh, a shell (or Perl) script that checks whether the input data have been archived (see the sketch in 3.4.2.1)
3.7.5 Test Case TC_ST_DIS_050
Test Case identification
TC_ST_DIS_050
Test Case name
Establishing a connection to UMARF via a locally installed client
H-SAF Subsystems
Offline Monitoring
Components specification
OM-CEN
Requirements to be tested
SR-54130-FUN-DIS
Input
UMARF client locally installed
Output
Establishing of UMARF connection
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.6 Test Case TC_ST_DIS_060
Test Case identification
TC_ST_DIS_060
Test Case name
Dissemination of products through EUMETCast
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-DIS, SM-DIS, SP-DIS
Requirements to be tested
SR-54150-FUN-DIS, SR-54160-FUN-DIS
Input
Precipitation, Soil Moisture, Snow Parameters products
Output
Products disseminated to civil protection and hydrological validation institutes via EUMETCast guaranteeing time limits reported in table 16 of SRD ([AD6])
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.7 Test Case TC_ST_DIS_070
Test Case identification
TC_ST_DIS_070
Test Case name
Dissemination of products to H-SAF central archive in real time
H-SAF Subsystems
Precipitation, Soil Moisture, Snow Parameters
Components specification
PR-DIS, SM-DIS, SP-DIS
Requirements to be tested
SR-54170-FUN-DIS
Input
Precipitation, Soil Moisture and Snow Parameters products
Output
Sending of Precipitation, Soil Moisture and Snow Parameters products to central archive in real time
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.8 Test Case TC_ST_DIS_080
Test Case identification
TC_ST_DIS_080
Test Case name
Automatic distribution of Hydro Validation reports to the operational chains
H-SAF Subsystems
Offline Monitoring
Components specification
OM-DIS
Requirements to be tested
SR-54180-FUN-DIS
Input
Hydro Validation reports
Output
Dissemination of HV reports from the central archive to the production chains.
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.7.9 Test Procedure TP_ST_DIS_080
Test Procedure identification
TP_ST_DIS_080
Test description
Automatic distribution of Hydro Validation reports to the operational chains
Test Case identification
TC_ST_DIS_080
Test Procedure steps
1. Verify if the HV reports are available on the central archive
2. Start the OM-DIS process
3. Check the presence of the HV reports in the specified directories of the processing chains
3.8 Reliability, Availability and Maintenance Requirements test
3.8.1 Test Case TC_ST_RAM_010
Test Case identification
TC_ST_RAM_010
Test Case name
SW configuration control
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-70010-RAM-GEN
Input
None
Output
The SW shall be under configuration control as specified in CMP ([AD3])
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.8.2 Test Case TC_ST_RAM_020
Test Case identification
TC_ST_RAM_020
Test Case name
Logging message
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-70020-RAM-GEN
Input
Subsystems log files
Output
Each subsystem's log file traces event-related items
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.8.3 Test Case TC_ST_RAM_030
Test Case identification
TC_ST_RAM_030
Test Case name
Freely available or COTS SW packages
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-70030-RAM-GEN, SR-70040-RAM-GEN
Input
None
Output
Freely available or COTS SW packages shall be used only with a written support guarantee by the developer for the whole programme lifetime, and should be encapsulated in a modular access interface
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.8.4 Test Case TC_ST_RAM_040
Test Case identification
TC_ST_RAM_040
Test Case name
System availability
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-70050-RAM-GEN, SR-70060-RAM-GEN, SR-70070-FUN-DIS
Input
None
Output
Each subsystem should be available 24 hours a day, 7 days a week. Subsystem availability shall be 99%. System failures should be fail-silent.
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.9 Operational Requirements test
3.9.1 Test Case TC_ST_OPE_010
Test Case identification
TC_ST_OPE_010
Test Case name
Backup of operational SW
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-80010-OPE-GEN, SR-80020-OPE-GEN
Input
None
Output
Backup of operational SW at a regular frequency according to the CMP ([AD3]). It may sometimes require human intervention.
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.9.2 Test Case TC_ST_OPE_020
Test Case identification
TC_ST_OPE_020
Test Case name
User authentication
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-80040-OPE-GEN
Input
Subsystem/components with authentication via HMI (Human-Machine-Interface)
Output
User logged in
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.9.2.1 Test Procedure TP_ST_OPE_020
Test Procedure identification
TP_ST_OPE_020
Test description
User authentication
Test Case identification
TC_ST_OPE_020
Test Procedure steps
1. Open the HMI for subsystem/component access
2. The subsystem/component HMI requests the username/password
3. The user enters the username/password correctly
4. The user is logged in
3.9.3 Test Case TC_ST_OPE_030
Test Case identification
TC_ST_OPE_030
Test Case name
Recovery Plan actuation to manage missing real time production
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-80050-OPE-GEN
Input
The Recovery Plan
Output
Coverage of all the expected cases of missing real time production by specific recovery activities
Test Method
Review of Design
Dependency with other tests
None
3.10 Performance Requirements test
3.10.1 Test Case TC_ST_PER_010
Test Case identification
TC_ST_PER_010
Test Case name
Processing performance
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-91000-PER-GEN, SR-91020-FUN-PRG
Input
The whole Precipitation, Soil Moisture and Snow Parameters production sets
Output
Comparison between the computed values and those specified in tables 2 and 16 of the SRD ([AD6])
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.10.2 Test Case TC_ST_PER_020
Test Case identification
TC_ST_PER_020
Test Case name
Acquisition performance
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-91010-PER-ACQ
Input
Input data to be acquired in NRT
Output
Comparison of the estimated acquisition times with the expected NRT ones
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.11 Portability Requirements test
3.11.1 Test Case TC_ST_POR_010
Test Case identification
TC_ST_POR_010
Test Case name
Subsystem SW characteristics
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-92010-POR-GEN, SR-92020-POR-GEN, SR-92030-POR-GEN, SR-92040-POR-GEN
Input
Subsystems SW and platform
Output
Compliance with subsystem SW characteristics such as:
• Platform independent
• Unix compliant
• SW for product generation written in the languages reported in table 17 of SRD ([AD6])
• Compliance with ECSS guidelines
Test Method
Demonstration, Inspection, Review of Design
Dependency with other tests
TBD_SVERF_03
3.12 Resources Requirements test
3.12.1 Test Case TC_ST_RES_010
Test Case identification
TC_ST_RES_010
Test Case name
HW resource characteristics
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-93010-RES-GEN, SR-93050-RES-GEN, SR-93060-RES-GEN
Input
Subsystems HW
Output
Compliance with subsystem HW characteristics such as:
• Online disk capacity sufficient for 7 operational days of data processing workload
• Physical memory sufficient to handle heavy processing sessions
• Network communication bandwidth sufficient to guarantee the timeliness reported in table 16 of SRD ([AD6])
Test Method
Demonstration
Dependency with other tests
TBD_SVERF_03
3.12.2 Test Case TC_ST_RES_020
Test Case identification
TC_ST_RES_020
Test Case name
H-SAF Web site and Help Desk hosted by CNMCA
H-SAF Subsystems
Offline Monitoring
Components specification
All
Requirements to be tested
SR-93070-RES-GEN
Input
None
Output
Completeness of the information published on the web site. Availability of the Help Desk via a ticketing system
Test Method
Test
Dependency with other tests
None
3.12.3 Test Procedure TP_ST_RES_020
Test Procedure identification
TP_ST_RES_020
Test description
Check of the H-SAF Web site and Help Desk availability
Test Case identification
TC_ST_RES_020
Test Procedure steps
1. Open a browser for Internet access
2. Enter the URL http://TBD.com
3. Specify user and password
4. Check all the available information about H-SAF
5. Open a ticket for a help request to be addressed to the Help Desk (an availability-check sketch is given below)
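A minimal sketch of an unattended availability check for the web site is given below, usable as a complement to the manual browser steps; it assumes the site is reachable over plain HTTP, and the host is still TBD.

    #!/bin/sh
    # Sketch: unattended availability check for the H-SAF web site.
    # The host is still TBD.
    URL=http://TBD.com
    # -s: silent, -o /dev/null: discard the body, -w: print only the HTTP status code.
    status=$(curl -s -o /dev/null -w '%{http_code}' "$URL")
    if [ "$status" = "200" ]; then
        echo "Web site reachable (HTTP $status)"
    else
        echo "Web site check FAILED (HTTP $status)"
        exit 1
    fi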
3.13 Safety Requirements test
3.13.1 Test Case TC_ST_SAF_010
Test Case identification
TC_ST_SAF_010
Test Case name
SW in degraded operational mode
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-94010-SAF-GEN
Input
Any situation that could bring a SW item to work in degraded operational mode
Output
Warning message
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.13.2 Test Procedure TP_ST_SAF_010
Test Procedure identification
TP_ST_SAF_010
Test description
SW in degraded operational mode
Test Case identification
TC_ST_SAF_010
Test Procedure steps
1. A situation occurs that brings a SW item to work in degraded operational mode
2. The system shall generate a warning message
3.13.3 Test Case TC_ST_SAF_020
Test Case identification
TC_ST_SAF_020
Test Case name
Operator action on the system
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-94020-SAF-GEN
Input
Any operational situation that requires an operator action
Output
System alarm
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.13.3.1 Test Procedure TP_ST_SAF_020
Test Procedure identification
TP_ST_SAF_020
Test description
Operator action on the system
Test Case identification
TC_ST_SAF_020
Test Procedure steps
1. An operational situation that requires an operator action occurs
2. The system shall generate a system alarm
3.13.4 Test Case TC_ST_SAF_030
Test Case identification
TC_ST_SAF_030
Test Case name
Error during an application execution
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-94030-SAF-GEN
Input
Any error during an application execution
Output
The error shall not lead the application to an uncontrolled aborted status
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.13.4.1 Test Procedure TP_ST_SAF_030
Test Procedure identification
TP_ST_SAF_030
Test description
Error during an application execution
Test Case identification
TC_ST_SAF_030
Test Procedure steps
1. An error occurs during an application execution
2. The application shall not reach an uncontrolled aborted status
3.13.5 Test Case TC_ST_SAF_040
Test Case identification
TC_ST_SAF_040
Test Case name
SW termination status
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-94040-SAF-GEN
Input
Any SW item termination
Output
Termination status
Test Method
TBD_SVERF_02
Dependency with other tests
TBD_SVERF_03
3.13.5.1 Test Procedure TP_ST_SAF_040
Test Procedure identification
TP_ST_SAF_040
Test description
SW termination status
Test Case identification
TC_ST_SAF_040
Test Procedure steps
1. A SW item has terminated its operation
2. The item termination code is traced on the system log (a wrapper sketch is given below)
3. The item termination code can have the following values:
• 0 if there is no error
• >0 if there is a warning
• <0 if there is an error to be managed
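A minimal sketch of how a termination code could be traced on the system log is given below. The wrapper, the log path and the code mapping are assumptions: shell exit statuses are unsigned (0-255), so a "<0" code cannot be returned directly and has to be encoded by convention, here as values of 128 and above.

    #!/bin/sh
    # Sketch: run a SW item and trace its termination code on the system log.
    # LOG is a hypothetical placeholder. Shell exit statuses are unsigned
    # (0-255), so by convention statuses >=128 are reported here as errors
    # to be managed, in place of negative codes.
    LOG=/var/log/hsaf/system.log
    ITEM=${1:?usage: $0 <sw-item> [args...]}
    "$@"
    code=$?
    if [ "$code" -eq 0 ]; then
        level="no error"
    elif [ "$code" -lt 128 ]; then
        level="warning"
    else
        level="error to be managed"
    fi
    echo "$(date -u '+%Y-%m-%dT%H:%M:%SZ') $ITEM terminated with code $code ($level)" >> "$LOG"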
3.14 Security Requirements test
3.14.1 Test Case TC_ST_SEC_010
Test Case identification
TC_ST_SEC_010
Test Case name
SW design requirements
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-94060-SEC-GENSR-94070-SEC-GENSR-94080-SEC-GEN
Input
None
Output
SW design shall prevent:
• Accidental data modification
• Unauthorized data modifications
• Propagation of SW damages
Test Method
Review of Design
Dependency with other tests
TBD_SVERF_03
3.15 Documentation Requirements test
3.15.1 Test Case TC_ST_DOC_010
Test Case identification
TC_ST_DOC_010
Test Case name
SW documentation requirements
H-SAF Subsystems
All
Components specification
All
Requirements to be tested
SR-95010-DOC-GEN
Input
Documentation
Output
Documentation shall be available for each SW item under configuration control
Test Method
Inspection
Dependency with other tests
TBD_SVERF_03
4 Requirements Traceability
4.1 System Requirements traceability to System Test Cases
System Requirement  Test Case  Notes
SR-00030-FUN-GEN, SR-00040-FUN-GEN, SR-00050-FUN-GEN  TC_ST_GEN_010
SR-00060-FUN-GEN TC_ST_GEN_020
SR-00070-FUN-GEN  TC_ST_GEN_030
SR-00020-FUN-GEN, SR-00021-FUN-GEN  TC_ST_GEN_040
SR-10020-FUN-ACQ  TC_ST_GEN_050
Data Acquisition
SR-10010-FUN-ACQ  TC_ST_ACQ_020
SR-10020-FUN-ACQ  TC_ST_ACQ_030
SR-10030-FUN-ACQ  TC_ST_ACQ_040
SR-10090-FUN-ACQ  TC_ST_ACQ_050
SR-10100-FUN-ACQ  TC_ST_ACQ_060
SR-10110-FUN-ACQ, SR-10120-FUN-ACQ  TC_ST_ACQ_070
SR-10130-FUN-ACQ  TC_ST_ACQ_080
SR-12010-FUN-ACQ, SR-12020-FUN-ACQ, SR-12021-FUN-ACQ, SR-12030-FUN-ACQ, SR-12031-FUN-ACQ, SR-12040-FUN-ACQ, SR-12041-FUN-ACQ, SR-12050-FUN-ACQ, SR-12051-FUN-ACQ, SR-12060-FUN-ACQ, SR-12061-FUN-ACQ, SR-12070-FUN-ACQ, SR-12071-FUN-ACQ, SR-12080-FUN-ACQ, SR-12081-FUN-ACQ, SR-12100-FUN-ACQ, SR-12110-FUN-ACQ, SR-12111-FUN-ACQ, SR-12120-FUN-ACQ, SR-12130-FUN-ACQ, SR-12140-FUN-ACQ, SR-12150-FUN-ACQ, SR-12160-FUN-ACQ  TC_ST_ACQ_090
SR-12220-FUN-ACQ  TC_ST_ACQ_100
SR-12240-FUN-ACQ, SR-12250-FUN-ACQ, SR-12260-FUN-ACQ, SR-12270-FUN-ACQ, SR-12280-FUN-ACQ  TC_ST_ACQ_110
SR-12300-FUN-ACQ  TC_ST_ACQ_120
SR-11010-FUN-ACQ  TC_ST_ACQ_130
SR-11030-FUN-ACQ, SR-11040-FUN-ACQ  TC_ST_ACQ_140
SR-13310-FUN-ACQ, SR-13315-FUN-ACQ, SR-13020-FUN-ACQ, SR-13025-FUN-ACQ, SR-13030-FUN-ACQ, SR-13040-FUN-ACQ, SR-13050-FUN-ACQ, SR-13060-FUN-ACQ, SR-13070-FUN-ACQ, SR-13080-FUN-ACQ, SR-13090-FUN-ACQ, SR-13100-FUN-ACQ  TC_ST_ACQ_150
SR-14010-FUN-ACQ, SR-14020-FUN-ACQ, SR-14030-FUN-ACQ, SR-14040-FUN-ACQ  TC_ST_ACQ_160
Pre-processing
SR-20010-FUN-PPR, SR-20020-FUN-PPR  TC_ST_PPR_010
Product generation
SR-00010-FUN-GEN, SR-32010-FUN-PRG, SR-32011-FUN-PRG, SR-32060-FUN-PRG, SR-32061-FUN-PRG, SR-32062-FUN-PRG, SR-32063-FUN-PRG, SR-31010-FUN-PRG, SR-31070-FUN-PRG, SR-33040-FUN-PRG, SR-33050-FUN-PRG, SR-33060-FUN-PRG, SR-33065-FUN-PRG, SR-33200-FUN-PRG, SR-33210-FUN-PRG, SR-33220-FUN-PRG  TC_ST_PRG_010
SR-30010-FUN-GEN  TC_ST_PRG_020
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-32010-FUN-PRG, SR-32020-FUN-PRG, SR-32030-FUN-PRG, SR-32040-FUN-PRG  TC_ST_PRG_030
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-32011-FUN-PRG, SR-32021-FUN-PRG, SR-32031-FUN-PRG, SR-32041-FUN-PRG  TC_ST_PRG_040
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-31010-FUN-PRG, SR-31020-FUN-PRG, SR-31030-FUN-PRG, SR-31040-FUN-PRG, SR-31060-FUN-PRG, SR-31061-FUN-PRG  TC_ST_PRG_050
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-30040-FUN-PRG, SR-31070-FUN-PRG, SR-31080-FUN-PRG, SR-31090-FUN-PRG, SR-31100-FUN-PRG, SR-31150-FUN-PRG  TC_ST_PRG_060
SR-31270-FUN-PRG TC_ST_PRG_070
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33040-FUN-PRG, SR-33070-FUN-PRG, SR-33110-FUN-PRG, SR-33140-FUN-PRG  TC_ST_PRG_080
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33050-FUN-PRG, SR-33051-FUN-PRG, SR-33120-FUN-PRG, SR-33150-FUN-PRG  TC_ST_PRG_090
SR-30020-FUN-PRG, SR-30030-FUN-PRG, SR-33060-FUN-PRG, SR-33090-FUN-PRG, SR-33125-FUN-PRG, SR-33160-FUN-PRG  TC_ST_PRG_100
SR-30020-FUN-PRGSR-30030-FUN-PRGSR-33070-FUN-PRGSR-33100-FUN-PRGSR-33130-FUN-PRGSR-33170-FUN-PRG
TC_ST_PRG_110
SR-30020-FUN-PRGSR-30030-FUN-PRGSR-33200-FUN-PRGSR-33230-FUN-PRGSR-33260-FUN-PRGSR-33290-FUN-PRG
TC_ST_PRG_120
SR-30020-FUN-PRGSR-30030-FUN-PRGSR-33210-FUN-PRGSR-33240-FUN-PRGSR-33270-FUN-PRGSR-33300-FUN-PRG
TC_ST_PRG_130
SR-30020-FUN-PRGSR-30030-FUN-PRGSR-33220-FUN-PRGSR-33250-FUN-PRGSR-33280-FUN-PRGSR-33320-FUN-PRG
TC_ST_PRG_140
SR-33320-FUN-PRG TC_ST_PRG_150 SR-34010-FUN-VPR TC_ST_PRG_160 SR-34020-FUN-VPRSR-34030-FUN-VPR
TC_ST_PRG_170
SR-34040-FUN-VPR TC_ST_PRG_180 SR-34050-FUN-VPR TC_ST_PRG_190 SR-34060-FUN-VPR TC_ST_PRG_200 Reporting
file:///C|/Documents%20and%20Settings/Panti%20Massimo/Desktop/CDR2_DOCS/sverf_0_5.htm (79 di 82)27/06/2008 9.52.02
SVERF
SR-34110-FUN-REP TC_ST_REP_010 SR-34120-FUN-REPSR-34130-FUN-REPSR-34140-FUN-REPSR-34180-FUN-REP
TC_ST_REP_020
SR-34170-FUN-REP TC_ST_REP_030 Archiving
SR-44010-FUN-ARCSR-44040-FUN-ARC
TC_ST_ARC_010
SR-44020-FUN-ARC TC_ST_ARC_020 SR-44050-FUN-ARCSR-44060-FUN-ARCSR-44070-FUN-ARC
TC_ST_ARC_030
SR-44080-FUN-ARC TC_ST_ARC_040 SR-44090-FUN-ARC TC_ST_ARC_050 SR-44021-FUN-ARC TC_ST_ARC_060 Dissemination
SR-54010-FUN-DISSR-54020-FUN-DISSR-54030-FUN-DISSR-54040-FUN-DISSR-54050-FUN-DISSR-54060-FUN-DIS
TC_ST_DIS_010
SR-54070-FUN-DISSR-54080-FUN-DIS
TC_ST_DIS_020
SR-54091-FUN-DIS TC_ST_DIS_030 SR-54120-FUN-DIS TC_ST_DIS_040 SR-54130-FUN-DIS TC_ST_DIS_050 SR-54150-FUN-DISSR-54160-FUN-DIS
TC_ST_DIS_060
SR-54170-FUN-DIS TC_ST_DIS_070 SR-54180-FUN-DIS TC_ST_DIS_080 Reliability, Availability and Maintenance
SR-70010- RAM-GEN TC_ST_RAM_010 SR-70020-RAM-GEN TC_ST_RAM_020 SR-70030-RAM-GEN TC_ST_RAM_030 SR-70050-RAM-GENSR-70060-RAM-GENSR-70070-FUN-DIS
TC_ST_RAM_040
Operational
SR-80010-OPE-GENSR-80020-OPE-GEN
TC_ST_OPE_010
SR-80040-OPE-GEN TC_ST_OPE_020 SR-80050-OPE-GEN TC_ST_OPE_030 Performance
file:///C|/Documents%20and%20Settings/Panti%20Massimo/Desktop/CDR2_DOCS/sverf_0_5.htm (80 di 82)27/06/2008 9.52.02
SVERF
SR-91000-PER-GENSR-91020-FUN-PRG
TC_ST_PER_010
SR-91010-FUN-PRG TC_ST_PER_020 Portability
SR-92010-POR-GENSR-92020-POR-GENSR-92030-POR-GENSR-92040-POR-GEN
TC_ST_POR_010
Resources
SR-93010-RES-GENSR-93050-RES-GENSR-93060-RES-GEN
TC_ST_RES_010
SR-93070-RES-GEN TC_ST_RES_020
Safety
SR-94010-SAF-GEN TC_ST_SAF_010 SR-94020-SAF-GEN TC_ST_SAF_020 SR-94030-SAF-GEN TC_ST_SAF_030 SR-94040-SAF-GEN TC_ST_SAF_040 Security
SR-94060-SEC-GENSR-94070-SEC-GENSR-94080-SEC-GEN
TC_ST_SEC_010
Documentation
SR-95010-DOC-GEN TC_ST_DOC_010 Table 6 System requirement vs. System/Component Test Cases
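Since the same system requirement may appear in several rows of Table 6 (for instance SR-30020-FUN-PRG), a simple script can help monitor the mapping. The following is a minimal sketch, not part of the verification baseline, assuming Table 6 has been exported to a plain-text file (trace.txt, a hypothetical name):

  #!/bin/sh
  # Hypothetical sketch: extract every SR-* identifier from an exported
  # copy of Table 6 and count how many test cases each one is mapped to,
  # so unmapped or heavily shared requirements can be reviewed.
  grep -o 'SR-[0-9]*-[A-Z]*-[A-Z]*' trace.txt | sort | uniq -c | sort -rn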
5 TBDs/TBCs list

In this section all TBDs and TBCs are summarized, in order to provide traceability and to monitor their resolution. References to the party responsible for each item and to the planned date of resolution are also given. Each TBD/TBC item is labelled by a unique identifier, related to the pertinent document, as follows: TBD/TBC_XXX_YY, where XXX is the acronym of the document (for instance, the present document: SVERF) and YY is a progressive number.
TBDs list

TBD/TBC id | Description | Responsible | Date of resolution
TBD_SVERF_02 | To be defined during test execution. The value is the name of the script used for test execution, if applicable; otherwise the method used. | H-SAF Project Team | September 2008
TBD_SVERF_03 | To be defined during test execution. The value is specified when a test execution depends on another one. | H-SAF Project Team | September 2008
TBD_SVERF_04 | To define a shell (or Perl) script that checks whether input data are archived. | H-SAF Project Team | September 2008
TBD_SVERF_05 | To define a shell (or Perl) script that prints on screen information about data in the archive. | H-SAF Project Team | September 2008
TBD_SVERF_06 | To define a shell (or Perl) script that prints on screen the time information of the input file. | H-SAF Project Team | September 2008
TBD_SVERF_07 | To define the starting date of H-SAF Central Archiving. | H-SAF Project Team | September 2008
TBD_SVERF_08 | To define a shell (or Perl) script that checks the accessibility of the raw-data local archives from the H-SAF Central Archive. | H-SAF Project Team | September 2008
Table 7 TBDs/TBCs list
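Items TBD_SVERF_04 to TBD_SVERF_06 and TBD_SVERF_08 call for small shell (or Perl) scripts that are still to be defined. Purely as an illustration of what such scripts might look like, two minimal sketches follow; the archive directory, flat file layout, and host names are assumptions, not agreed values.

  #!/bin/sh
  # Hypothetical sketch for TBD_SVERF_04/05/06: check whether an input
  # file has been archived and, if so, print size and time information
  # about the archived copy. ARCHIVE_DIR is an assumed location.
  ARCHIVE_DIR=${ARCHIVE_DIR:-/data/hsaf/archive}
  if [ $# -ne 1 ]; then
      echo "Usage: $0 <input-file-name>" >&2
      exit 2
  fi
  if [ -f "$ARCHIVE_DIR/$1" ]; then
      echo "ARCHIVED: $1"
      ls -l "$ARCHIVE_DIR/$1"
  else
      echo "NOT ARCHIVED: $1" >&2
      exit 1
  fi

  #!/bin/sh
  # Hypothetical sketch for TBD_SVERF_08: check that each raw-data local
  # archive host is reachable from the H-SAF Central Archive host. The
  # host names below are placeholders, not the real configuration.
  for HOST in local-archive-1 local-archive-2; do
      if ping -c 1 "$HOST" > /dev/null 2>&1; then
          echo "REACHABLE: $HOST"
      else
          echo "UNREACHABLE: $HOST" >&2
      fi
  done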
6 Appendix A - Test tools

The following table describes the scripts used for executing the test procedures. The general tools are scripts used by several test cases to check various details of the outputs, such as log files and database contents.

Tool | Type | Description
query.sh | General | Executes the query passed as input to the script and writes the result to stdout.
List_log.sh | General | Prints on screen the content of the system log.
Table 8 Test tools
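Purely as an illustration of the two general tools, minimal sketches might look as follows; the choice of the psql client, the connection parameters, and the log file path are assumptions, since the actual database and logging environment are not specified in this document.

  #!/bin/sh
  # query.sh -- hypothetical sketch: execute the query passed as the
  # first argument and write the result to stdout. The psql client and
  # the connection parameters are assumptions, not agreed values.
  if [ $# -ne 1 ]; then
      echo "Usage: $0 \"<query>\"" >&2
      exit 2
  fi
  psql -h localhost -U hsaf -d hsaf -c "$1"

  #!/bin/sh
  # List_log.sh -- hypothetical sketch: print on screen the content of
  # the system log. The log file path is an assumption.
  cat "${LOG_FILE:-/var/log/hsaf/system.log}"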
- END OF THE DOCUMENT -