Table of Contents

1 LPIS TG ETS
 1.1 LPIS Technical Guidance for ETS
2 ETS Documentation
 2.1 Executive summary
 2.2 Release Notes
 2.3 Foreword
 2.4 Glossary
 2.5 Abbreviations
3 ETS Scope
 3.1 In scope
 3.2 Not in scope
 3.3 Assessment is annual
 3.4 Assessment by sampling
 3.5 Subsets and item substitution
 3.6 Considerations on a given LPIS implementation
4 ETS Data structure
 4.1 MTS
 4.2 Data structure implications
 4.3 Data structure for population upload
5 ETS Sampling zones
 5.1 Zone selection
 5.2 Reference Parcel Sampling
 5.3 Population data and sample exchange instructions
6 ETS Inspection workflow
 6.1 Observation steps
 6.2 Item conformance
7 ETS Inspection variants
 7.1 Reference parcel aggregation
 7.2 Field activities
 7.3 Conditional triggers
8 ETS Data maintenance
 8.1 Context
 8.2 Evidencing
9 ETS Data Capture
10 ETS Conformance
 10.1 Conformity for a quantitative quality element
 10.2 Conformance class 1
 10.3 Conformance class 2
11 ETS Delivery
 11.1 Textual reports
 11.2 Data packages

1 LPIS TG ETS
version 6.0

final @20150907

1.1 LPIS Technical Guidance for ETS

This article is the table of contents of the revised LPIS QA data test suite (ETS).

1. Chapter 1 provides the key concepts, activities and decisions in the Data Product Specification (DPS) setup.
2. The LCM conceptual model and dynamic model provide a coherent set of instructions for performing the inspection in chapter 2. This chapter can be regarded as the merger of the former LPISQA chapter 2 (inspection), ETS annex I, ETS annex II, ETS annex III and several clarifications into a single document.
3. Chapter 3 provides any ancillary data that does not reside in the LCM conceptual model, such as illustrations, examples and elaborations. These have mostly been recovered and adapted from the ETS v5.3 wiki pages. The single download page represents the fourth chapter.

Before you start, please read the generic introduction for a better understanding of the diagrams and instructions in this guidance.

Printing:

1. You can make a pdf version of the first chapter of this ETS guidance.
2. You can make a pdf version of the second chapter of this ETS guidance. Complete and full scale diagrams can be downloaded by clicking on the diagram in the pdf file.
3. You can make a pdf version of the third and fourth chapters of this ETS guidance.

1. Technical guidance

1.1 Documentation - 2nd release notes added
1.2 ETS_Scope
1.3 Data Structure: preparing reference parcel data
1.4 Reference parcel sampling and zone selection
1.5 Data capture: Inspecting RP data
 1.5.1 An overview of the inspection workflow
 1.5.2 Inspection variants
 1.5.3 Data maintenance: Dealing with updates during the inspection
1.6 Conformance statements
1.7 Delivery: Reporting packages

2. Detailed ETS inspection instructions

2.1. Overview of the overall ETS workflow
 2.1.1. Data conformance assessment: the generic process
 2.1.2. Definition of the ETS item under inspection
 2.1.3. Item inspection details
 2.1.4. Item measurement details
 2.1.5. Conformity assessment of an item and of the system
 2.1.6. Analysis of results and preparation of remedial action plan
  2.1.6.1. How to report the LPIS conformance with class 1?
   2.1.6.1.1. Calculation of system bias
   2.1.6.1.2. Calculation of system precision
   2.1.6.1.3. Calculation of the number of reference parcels with incorrect reference area
   2.1.6.1.4. Calculation of the number of defective reference parcels
  2.1.6.2. How to report the LPIS conformance with class 2?
   2.1.6.2.1. Calculation of the number of missed updates
  2.1.6.3. Calculation of the distribution of reference parcels with incorrect reference area
  2.1.6.4. Calculation of the number of reference parcels with incorrect land classification
  2.1.6.5. Calculation of the number of non-conformities per cause
  2.1.6.6. Calculation of the ratios of declared versus recorded land
2.2. Application Schema for ETS

3. Additional information

3.1. Examples
 3.1.1. Parcel design topics - Example for contamination corrected
 3.1.2. ETS methodology topics - Example of LUI boundary added
 3.1.3. Frequent inspection errors
 3.1.4. Detailed technical procedures - note: these three procedures are not copied into the on-the-fly pdf!
  3.1.4.1. GNSS measurement
  3.1.4.2. Field observation
  3.1.4.3. Combining CAPI and GNSS measurement
3.2. Experiences from the past
3.3. Tools
3.4. LPIS QA portal
3.5. JRC XML validator - ETS schema changes documented
3.6. Questions and answers - 2 batches of Q&A appended

4. Downloads - ETS reporting schemas added, Annex I, II and III revised

5. Errata - NEW!!


2 ETS Documentation

2.1 Executive summary

The quality assurance framework of LPIS is an integral part of the LPIS management and upkeep processes. In this framework, the LPIS of a MS/Region is regarded as a system under test (SUT), which is composed of two major components: the local application schema (eligibility profile) and the data records stored in the system. The so-called Executable Test Suite (ETS) targets the data component by annually assessing conformity according to Article 6 of (EU) Regulation No 640/2014.

For data testing, the high level requirements (abstract test cases) are explicitly provided by Article 6 of (EU) Regulation No 640/2014. For simplicity and historic reasons, the abstract and executable test cases are merged into a single workflow and are referred to as ETS. The ETS proposed by DG JRC can be directly implemented by all MS/Regions. The conformance testing procedure is based on data quality assessment according to ISO 19157:2013 (Geographic information – Data quality) and ISO 2859-2:1985 (Sampling procedures for inspection by attributes, Part 2: Sampling plans indexed by limiting quality (LQ) for isolated lot inspection).

These standards are used to define the data quality measures, the prescribed sample sizes, and the acceptance numbers for the quality measures. These elements, together with the basic concepts (inspected item, critical defect, observed area, contamination, and land cover classification according to the MS eligibility profile), are detailed in the LPIS quality assessment model leaf of the LPIS Core Model (LCM). The data inspection methodology with detailed activity steps is described in the “Perform ETS” use case of the business model.
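The acceptance logic that such standards prescribe can be sketched as follows. This is a minimal illustration of an ISO 2859-2-style lot decision, not the actual ETS procedure; the sample sizes, LQ levels and acceptance numbers in the table are invented placeholders, not values tabled in the standard or in the regulation.

```python
# Illustrative sketch of an ISO 2859-2-style acceptance decision for one
# quality measure: the lot (the LPIS reference parcel population) is accepted
# when the number of non-conforming items found in the inspected sample does
# not exceed the acceptance number tabled for the chosen limiting quality (LQ).
# The table entries below are hypothetical placeholders.

ACCEPTANCE_TABLE = {
    # (sample_size, limiting_quality_percent): acceptance_number
    (500, 2.0): 3,
    (800, 2.0): 5,
}

def lot_conforms(sample_size: int, lq_percent: float, nonconforming: int) -> bool:
    """Accept the lot if the non-conforming count is within the acceptance number."""
    acceptance_number = ACCEPTANCE_TABLE[(sample_size, lq_percent)]
    return nonconforming <= acceptance_number

# Example: 4 defective parcels in a sample of 800 stays within the (placeholder)
# acceptance number of 5, so the lot would be accepted.
print(lot_conforms(800, 2.0, 4))  # True
```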

2.2 Release Notes

2.2.1 May 2015

The quality assessment methodology for 2015 (ETS v6.0) involves the following differences with respect to the previous version (ETS v5.3). Apart from the individual geo-location of non-conformities, no additional inspection activities are involved.

• Quality element 3 has been swapped with quality element 4 in order to follow the regulation and conformance classes.
• Quality element 7 has been removed.
• Otherwise the most striking innovation is the documentation itself: it is no longer based on static annexes (I and II) but completely model driven. This results in a detailed workflow chapter.

• The testing methodology and procedure have been adapted to the new requirements of the CAP reform:
 1. zone selection and image provision are now provided for by EC services
 2. the 5m buffer rule and the subsequent "copy/paste" is no longer allowed
 3. a classification correctness test has been introduced
 4. any occurrence of a critical defect or non-conformity is henceforth considered as a single weakness; i.e. one reference parcel could have more than one weakness
 5. the reporting distinguishes between conformance class 1 (parcel quality issues) and conformance class 2 (system weaknesses). Note that the previous quality elements 3 and 4 have been swapped.
 6. ISO 2859-2 numbers remain expressed as "percent non-conforming items" for conformance class 1, but become "non-conformities per 100 items (reference parcels)" for conformance class 2
 7. a new schema caters for the information exchanges that used to be handled as non-structured documentation
• A derogation to the deadline for submission of applications allows the custodian to move the reference date from the end of May to the end of June (30th).
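The difference between the two rate expressions described above (percent non-conforming items versus non-conformities per 100 items) can be sketched with a toy sample; the parcel identifiers and weakness codes are invented for illustration only.

```python
# Sketch of the two ISO 2859-2 rate expressions: conformance class 1 counts
# non-conforming parcels (a parcel with one or more weaknesses counts once),
# while conformance class 2 counts the weaknesses themselves, so one parcel
# can contribute more than one. All data below is invented.

parcels = {
    "RP001": [],
    "RP002": ["incorrect_area"],
    "RP003": ["incorrect_area", "missed_update"],
    "RP004": [],
}

n = len(parcels)

# Class 1: "percent non-conforming items"
nonconforming_items = sum(1 for weaknesses in parcels.values() if weaknesses)
percent_nonconforming = 100.0 * nonconforming_items / n

# Class 2: "non-conformities per 100 items (reference parcels)"
total_nonconformities = sum(len(weaknesses) for weaknesses in parcels.values())
per_100_items = 100.0 * total_nonconformities / n

print(percent_nonconforming)  # 50.0  (2 of 4 parcels are non-conforming)
print(per_100_items)          # 75.0  (3 weaknesses across 4 parcels)
```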

2.2.2 September 2015

At the request of the Member States, old style annexes were compiled and published. Below you find the corresponding release notes.

2.2.2.1 General and Inspection - related

• General: Errata identified in the previous ETS version (v5.3 from 2014) have been corrected.
• Annex I
 ♦ The consistency between the individual checks for feasibility for inspection and the reported code list is improved.
 ♦ The "5-m buffer" approach towards unmeasurable reference parcels is abolished. Parcel aggregation is introduced to handle the situation.
 ♦ The determination of the area of small non-agriculture features through estimation is abolished. All measurements necessary for the calculation of the total observed eligible area are done through delineation.
 ♦ Point location (at minimum) of the non-agriculture features reported is introduced.
 ♦ Provisions for GAC are removed.
 ♦ The contamination detection procedure is revised towards clearer and more generic criteria (Table 8.2, points 3.7, 3.11 and 4.5). MS are asked to document their local contamination detection rules.
 ♦ "Area Classification" (10102_4) is introduced for the "classification correctness test" that verifies whether a reference parcel correctly differentiates between arable land, permanent grassland and permanent crop.
 ♦ Causes for non-conformities are reported per non-conformity (possible weakness) found.
 ♦ QE2b "LPIS eligibility rates" (10203) is revised to better explain the calculation process.
 ♦ Obsolete waivers for contamination are suppressed.
 ♦ QE7 has been abolished.
• Annex II (apart from all the changes in the flow of events resulting from the modifications of Annex I)
 ♦ Implementation of the revised contamination check is explained.
 ♦ Use of the proprietary imagery is clarified.
 ♦ The activity diagram was updated and uses standard UML notation.
 ♦ The “four-eye control” chapter is moved to a separate Annex IV.
 ♦ The use of the terms Land under Inspection (LUI) and Item for Inspection is made consistent throughout the whole document.
 ♦ It is clarified that aggregation of reference parcels is used only to derive the quantitative values, and that the item of inspection still remains the individual reference parcel itself.
 ♦ The accounting of the skipped reference parcels in the ETS reporting is clarified.
• Annex III
 ♦ Grouping of the minimum legend classes into agricultural land cover categories for the classification correctness test is included.
 ♦ Chapter 7 becomes a mere reference to the technical guidance of pro rata grassland.

2.2.2.2 Publishing (WikiCAP)

• Maroon colour texts relate to changes from or additions to the initial publication of ETS 6.0.
• Some last minute typing errors were spotted in some Annexes. See the Errata section at the end.
• WikiCAP, point 3.1.1 Parcel design topics - Example for contamination corrected.
• WikiCAP, point 3.5 JRC XML validator - ETS schema changes documented.
• WikiCAP, point 3.6 Questions and answers - 2 batches of Q&A appended.


2.2.2.3 Reporting

• ETS schemas - last uploads are available on the LPIS Registry.
• ETS observation schema - A new element RP_CLS is added to accommodate the reporting of the results from the newly introduced classification correctness test.
• ETS observation schema - Annotation is added to reflect the abolishment of the reporting of certain elements (such as RP_MEA_GAC). The elements themselves are not yet removed from the schema.
• ETS scoreboard schema - Annotation is added to reflect the abolishment of the reporting of certain elements (OTSC_RIG). The elements themselves are not yet removed from the schema.
• QualityReportMetadata schema - Annotation is added to reflect the abolishment of the reporting of certain elements (DQ_Element_QE7). The elements themselves are not yet removed from the schema.
• LPIS common types - Annotation is added to reflect the abolishment of the reporting of certain elements (waivers A and D). The elements themselves are not yet removed from the schema.
• Relevant XML examples are revised to reflect the above-mentioned updates.
• An error in the LPIS Sample pre-selection status example is corrected.
• A full XSD/XML version of the ETS assessment report is prepared to complement the Word versions. The application of this schema is fully at the discretion of the MS Administration.

2.2.2.4 Last minute corrections

• Annex I
 ♦ page 1, release notes, bullet 7. The text "The abbreviation ATS is changed to MTS." should read "The abbreviation ATS is changed to ICS."
 ♦ page 18, item 3.7, point 2.b. The text "Use the information provided from the MTS and the predefined list of local ground conditions." should read "Use the information provided from the ICS and the predefined list of local ground conditions."
 ♦ page 23, item 4.5, point 4. The text "...a trigger was observed. Use the information provided from the MTS and the predefined list of acceptable waivers..." should read "...a trigger was observed. Use the information provided from the ICS and the predefined list of acceptable waivers..."
• Annex II
 ♦ page 4, point 2.10, note. The text "If both VHR and aerial imagery are available, a positive outcome of the feasibility for inspection on only one image will be sufficient to proceed with the inspection of the Reference Parcel." should read "If both VHR and aerial imagery are available, a positive outcome of the feasibility for inspection on only one image will NOT be sufficient to proceed with the inspection of the Reference Parcel."

2.3 Foreword

2.3.1 Rationale

The importance of the LPIS comes from the requirement that it must channel all area based aids; the corresponding financial value exceeded €40bn for 2012 (see item 05.03 on pages 17-18) and 2013. In 2013 it concerned around 7.40 million beneficiaries ([1]). For this specific purpose, LPIS quality can roughly be defined as the ability of the system to fulfill two explicit LPIS functions:

1. the unambiguous localisation of all declared agricultural parcels by farmers and inspectors, and
2. the quantification of all eligible area for crosschecks during the administrative controls by the paying agency.

Failure of an LPIS in the unambiguous localisation induces risks of double declaration of land. Inadequate quantification of eligible area renders the crosschecks ineffective for preventing and identifying over-declarations by farmers. Both failures involve financial risks for the EU Funds.

Furthermore, any well functioning LPIS greatly facilitates operations by farmers, inspectors and paying agencies, resulting in a better overall performance. Obviously, a better LPIS substantially improves IACS effectiveness and management of EU Funds.

2.3.2 Quality Assurance

Figure 1: quality concepts

Both Member States and the EU therefore have a keen interest in demonstrating the quality of the LPIS and in addressing quality issues, if any. Such processes of planned and systematic quality demonstration form the heart of a quality assurance (QA) system. A QA framework relies on mutually agreed quality testing between the “consumer” (the European Commission) and the “supplier” (the Member State). A test or series of tests assesses compliance for each specified quality requirement.

A distinction is made between “prime” and “secondary” quality elements. The prime elements are those that the European Commission considers fundamental for a correct LPIS operation. They are applicable to all LPIS systems. Secondary quality elements might not be applicable for all systems, but may provide additional input for analysing and remediating issues identified on the prime quality elements.

The Commission Delegated Regulation (EU) No 640/2014 calls for annual reporting on the six prime quality elements, grouped into two conformance classes. For each quality element, one or more measures, the inspection procedure and the conformance levels have been designed.

A discussion document elaborated in 2011 covered these LPIS properties and the reasons why they are essential for a well functioning system. It also proposed a methodology to implement and integrate an adequate quality policy in the regulatory framework. The main application of this quantitative information is to provide an instrument for achieving business process improvement. Essentially, this quality assurance framework constitutes a yearly check-step within the commonly known plan-do-check-act (PDCA) cycle.

2.4 Glossary

2.5 Abbreviations

3 ETS Scope

3.1 In scope

For the LPIS QA, only the first pillar of IACS is currently considered, as indicated by the first two requirements. In particular, any reference parcel that is mentioned on the farmer's application shall be subject to inspection. This translates to reference parcels that:

• were declared during the previous application year for 1st pillar support, OR
• hold a non-zero “maximum eligible area”; i.e. can appear on the pre-printed form or re-enter an application for 1st pillar support without triggering an additional verification procedure.

3.2 Not in scope

Correct execution of this combined query on an appropriately implemented LPIS should result in this "scope of the LPIS QA" NOT including:

• Reference parcels with zero MEA but declared exclusively for "other uses".
• Reference parcels holding agricultural land but not eligible for direct aids, as their size is below the minimum threshold established in the national/regional eligibility profile.

3.3 Assessment is annual

LPIS QA has to be performed yearly; it is hence an assessment of a specific LPIS situation at a specific reference date. For example: LPIS QA 2015 could be the quality assessment of LPIS data at the precise date of 31 May 2015 (end of the month after the deadline for submission of applications). If using the 2015 derogation to postpone the application deadline to June (no later than 15 June), the applicable reference date will be 30 June 2015. It is of no use to make a self-assessment of updated data, as the results will be biased and the root causes of possible non-conformities could not be highlighted.

3.4 Assessment by sampling

As any parcel of the farmer's aid application is subject to inspection, quality assessment is based on a sample of the concerned LPIS. As from 2015, the EC services are responsible for both the sample selection and the image provision. Sample selection is based on an industry-standard sampling plan following the ISO 2859-2 standard. Upon reception of the complete lot of reference parcels, a subset of items (reference parcels) for quality assessment is returned to the MS. This subset is deliberately bigger than the minimum sample needed, in order to compensate for the skipping of parcels due to technical issues.

3.5 Subsets and item substitution

From all these parcels in scope, the inspection of the ground situation and some historic conditions can exclude parcels from the sample as reference parcels not suitable for inspection.

1. When local technical image conditions prevent a clear inspection, a pre-selected reference parcel can be skipped (i.e. for reason of local cloud cover). The parcels that are not skipped are fit for inspection.

2. When local cropping conditions prevent the delineation of a particular segment of a reference parcel, such reference parcels can be:
   • either flagged as not fit for measurement,
   • or subject to the parcel aggregation variant, using a new, aggregated item of inspection.

The application of the parcel aggregation variant is:

• systematic: if applied, it needs to be applied to all reference parcels where the conditions are right;
• optional, unless the rate of parcels "not fit for measurement" is expected to exceed 50%.

Note: discarding of entire zones is no longer an option, as the IDQA performed by JRC qualifies the image and zone as suitable for inspection.

3.6 Considerations on a given LPIS implementation

The LPIS QA scope restricts the LPIS population, for the purpose of this LPIS quality assessment, to ONLY those reference parcels that comply with one of the two conditions: "declared for aid in the previous year" or "hold non-zero reference area in the current year".

1. This is not necessarily the complete set of "blocks" that are managed in the "LPIS layer" of the GIS environment. It can be a subset; for instance, 'blocks' that are completely urban, water or forested AND that have not been declared by farmers should not be subject to quality testing. Non-agriculture blocks should not be considered when constructing the "lot of reference parcels" (defined below) for this ETS.

2. Not all individual blocks/polygons in a system necessarily represent a single reference parcel:
   1. individual landscape features can often be considered as an element of one and only one reference parcel;
   2. some production blocks may have been subdivided to detail their content (e.g. detailed land cover categories or cropping patterns).

It is strongly recommended to merge, where appropriate, such subdivisions into a single RP polygon to reflect the true reference parcel FOR THE purpose of the LPIS QA.

4 ETS Data structure

4.1 MTS

The DGJRC developed a model test suite (MTS) to enable the LPIS custodian to map the local LPIS implementation to the concepts and definitions of the LPIS Core Model (LCM). The LCM holds the conceptual elements wherein the various tests of the ETS have been defined. Correct mapping of the local implementation allows a correct translation and conversion of the local data into the ETS test environment.

4.2 Data structure implications

A correct conversion of the data concepts is essential for:

• a correct interpretation of the scope of the quality assessment, especially as the scope relates historical activity to current data values;
• presenting correct data values as input for the ETS tests.

The MTS also provides the basis for all xsd-schemas that are essential for a lossless transfer of the information between the LPIS custodian and the EC services. These are critical for:

• the production of the ETS sample (population upload and sample download);
• transferring the ETS reporting package, which includes:
   1. all observations made during the inspection procedures,
   2. feature metadata to address the relevant LPIS maintenance processes that will take place between sample creation and reporting;
• documenting and delivering the eligibility profile, which offers the complete set of local rules to identify, capture and manage agricultural land.

This information exchange is required to perform an appropriate screening of the LPIS quality assessment results.

4.3 Data structure for population upload

The next chapter describes how a sample will be drawn from the uploaded population of reference parcels. The data structure of the file holding that population of reference parcels is:

Element name: <cap:geometryProperty>
Definition: Spatial representation of the reference parcel as referred to in Art. 5 of (EU) R 640/2014.
Description: Geometry of the reference parcel, representing its geographical position. For this entry the representation GM_Point is required.

Element name: <cap:rpID>
Definition: Unique thematic ID of the reference parcel as referred to in Article 70 of Regulation (EU) No 1306/2013.
Description: Nation-wide unique alphanumerical thematic identification code of the reference parcel.

Element name: <cap:referenceArea>
Definition: Value for the quantification of area as referred to in Art. 5(3) of (EU) R 640/2014.
Description: Officially known area of the reference parcel in hectares, which corresponds to the maximum area for which an application can be made.
NOTE 1: The reference area value shall be expressed in hectares.
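For illustration, a single population record built from the three elements above could be serialized roughly as follows. This is only a sketch in Python: the namespace URI, root element name and point layout are placeholder assumptions; the authoritative structure is defined by the LpisPointZeroState.xsd schema.

```python
# Sketch: serialize one reference parcel record using the three element
# names from the table above. The "cap" namespace URI is a placeholder,
# NOT the official one; validate real files against LpisPointZeroState.xsd.
import xml.etree.ElementTree as ET

CAP = "http://example.org/cap"  # placeholder namespace
ET.register_namespace("cap", CAP)

def build_record(rp_id: str, x: float, y: float, area_ha: float) -> str:
    rp = ET.Element(f"{{{CAP}}}referenceParcel")   # illustrative root name
    geom = ET.SubElement(rp, f"{{{CAP}}}geometryProperty")
    # GM_Point representation: a single coordinate pair
    point = ET.SubElement(geom, "Point")
    ET.SubElement(point, "pos").text = f"{x} {y}"
    ET.SubElement(rp, f"{{{CAP}}}rpID").text = rp_id
    # reference area expressed in hectares (NOTE 1 of the table)
    ET.SubElement(rp, f"{{{CAP}}}referenceArea").text = f"{area_ha:.4f}"
    return ET.tostring(rp, encoding="unicode")

xml_text = build_record("AT-12345-0001", 16.37, 48.21, 2.531)
print(xml_text)
```

The record identifier and coordinates are invented example values; only the `cap:` element names come from the table.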

5 ETS Sampling zones

5.1 Zone selection

As mandated by the Regulations (Article 6.2 of (EU) Regulation No 640/2014), the LPIS QA zones and reference parcels are selected and defined by the EC services.

In collaboration with the contracted image provider, a procedure has been designed to achieve a number of benefits compared to the CwRS-image coupling from ETS v5.3. The general goal is to obtain imagery of sufficient quality to serve as a true reference for the current situation on the ground, with unbiased sampling. Different criteria have been taken into account:

• an equal probability of inspection for all reference parcels, while preventing the exclusion of zones and bias;
• independence from CwRS zoning, further ensuring absence of bias;
• improved geometric and radiometric image quality by means of:
   1. relating the timing of capture to the arrival of cloud- and haze-free conditions,
   2. restricting image capture to near-nadir conditions,
   3. using sensors with a Ground Sampling Distance (GSD) of 50 cm or less;
• concentration of field activities in a few representative locations, lowering the resources needed for the inspection process.

In practice, the zone selection procedure operates through a 4-level stratification based on reference parcel abundance and agricultural area representativeness. These two parameters are derived from the LPIS populations uploaded by the custodians for the previous LPIS quality assessment. The resulting 78 image zones are shown in Figure 2.

Figure 2: LPIS QA data capture zones over EU

In Figure 2, each color represents a zone where one or more images will be acquired. Additional imagery may be acquired in a given zone to reach the minimum sample size. Special imagery is captured in zones with a low reference parcel density.

5.2 Reference Parcel Sampling

The sample of reference parcels to be inspected in the ETS will be generated in the intersection of the LPIS QA zones acquired as above and the full population of reference parcels (the scope of the quality assessment). In practice this means that LPIS custodians have to upload their population for the reference date so this sampling can be generated by DGJRC.

In order to receive a sample pre-selection, Member States shall:

• do Steps 1, 2 and 3;
• download the sample pre-selection list in Step 9.

To ensure correct timing of this assessment, upload of population data to DGJRC will be possible only from the last date of submittal of applications until the end of that month, in line with Article 13(1) of (EU) Regulation No 809/2014: "Member States shall fix the final dates by which the single application, aid applications or payment claims shall be submitted. The final dates shall not be later than 15 May each year. However, Estonia, Latvia, Lithuania, Finland and Sweden may fix a later date which shall not be later than 15 June."

If the LPIS is subject to the 2015 derogation of this application submittal date (Commission Implementing Regulation (EU) 2015/747 of 11 May 2015), the resulting upload date can be delayed to June 30th, 2015.

DGJRC will produce the sample within days after acquisition of the necessary LPIS QA imagery, but no later than September 30th 2015.

5.3 Population data and sample exchange instructions

Creating the sample pre-selection requires:

• Step 1 (by MS): create a point representation from the reference parcel polygons (total population of parcels from the Lot);
• Step 2 (by MS): convert the reference parcels' point data into a harmonised data structure: LpisPointZeroState.xsd;
• Step 3 (by MS): upload the LpisPointZeroState through the LPIS QA Web Application. For instructions, go to:
   - Logging into the LPIS QA Web Application
   - Establishing LPIS Settings
• Step 4 (by EC): prepare LPIS Control Zones data:
   - ApplicableLpisqaZones.xsd (GML)
   - ApplicableCidZones.xsd (XML) – if applicable
• Step 5 (by EC): receive and analyse reference parcel data and LPIS Control Zones data;
• Step 6 (by EC): clip reference parcel data with the LPIS Control Zones;
• Step 7 (by EC): determine the sample size for the ETS inspection, based on ISO 2859/2-1985, procedure A;
• Step 8 (by EC): generate a sequential list of randomly selected reference parcels, and send a notification e-mail. See details on:
   - Creating a sample pre-selection
• Step 9 (by MS): download the pre-selection list LpisSamplePreselection.xsd. For instructions, go to:
   - Logging into the LPIS QA Web Application
   - Downloading a sample pre-selection

This methodology is presented in diagram 1, below.

Diagram 1 Workflow of the ETS sampling methodology.
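Step 1 of this workflow (deriving a point representation from each reference parcel polygon) can be sketched as below. This is a minimal plain-Python centroid computation, assuming a simple, non-self-intersecting ring; in production a GIS "point on surface" operation is preferable, since a centroid can fall outside a concave parcel.

```python
# Sketch of Step 1: reduce a reference parcel polygon to a single point.
# Uses the standard shoelace/centroid formulas on a simple closed ring of
# (x, y) vertices. A real implementation should guarantee the point lies
# inside the polygon (e.g. a "point on surface" GIS function).
def polygon_centroid(ring):
    pts = list(ring)
    if pts[0] != pts[-1]:          # close the ring if needed
        pts.append(pts[0])
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        cross = x0 * y1 - x1 * y0  # shoelace term per edge
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return (cx / (6.0 * a), cy / (6.0 * a))

# unit square -> centre point (0.5, 0.5)
print(polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)]))
```

Each resulting point would then be packed into the LpisPointZeroState structure of Step 2.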

6 ETS Inspection workflow

6.1 Observation steps

Diagram 3 illustrates the overall item inspection process. The main activity flow comprises the following steps:

1. Retrieve the list of preselected parcels from the LPIS QA preselection issued by DG JRC.
2. Decide whether one of the following critical defects can be observed for the item:
   1. total absence of agricultural land
   2. invalid perimeter
   3. invalid common RP boundary
   4. incomplete block
   5. multi-surface
   6. multi-parcels
3. If no critical defect is found in the item, it is conforming for this criterion.
4. Observe and geolocate the contamination conditions listed below observed for the item:
   1. artificial sealed surface,
   2. crosscutting non-agricultural land cover.
5. When no contamination has been found, or when an observed contamination has been waivered, the item is conforming for this criterion.
6. Detect the presence of the different agricultural Land Cover (LC) classes on the land represented by the item under inspection. Use the corresponding class definitions from the eligibility profile.
7. When each LC instance present in the item under inspection has been correctly classified, mark the item as conforming for this criterion.
8. Decide for each instance of observed agricultural land whether the item had been assigned the right LC class of the eligibility profile.
9. Verify in the eligibility profile whether and which Landscape Features (LF) are applicable.
10. Detect LF instances that are inside or on the immediate border of the item under inspection and classify them according to the types defined in the eligibility profile.
11. Verify the correctness of the LC classification for each instance of LF present in the item under inspection.
12. When each LF LC instance present in the item under inspection has been correctly classified, mark the item as conforming for this criterion.
13. Decide whether there is any reason preventing the area measurement of the item under inspection. Area measurement is not feasible when any boundary section is not visible in the orthoimage, or in case of a force majeure event (both for Computer Assisted Photo Interpretation (CAPI) and field measurements).
14. Define the surface area of each LC type present in the item under inspection by delineating their boundaries.
15. Define the conformance verdict based on the partial conformance verdicts (critical defect, contamination, classification correctness, area) resulting from the inspection process.


Diagram 3: Item inspection

6.2 Item conformance

The last activity step, where the partial conformance verdicts are combined into an overall conformance verdict, is clarified in diagram 4. These detailed steps are:

1. Retrieve the conformance values for:
   1. area conformance
   2. critical defect
   3. contamination
   4. classification correctness
2. Verify whether all these Data Quality Measure (DQM) conformance values are TRUE. This condition is fulfilled when no problems with area values, critical defects, contamination or classification have been found. As a result, the item under inspection is conforming in general.
3. Record item conformity as TRUE in your information system.

Diagram 4: testing item conformance
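The verdict combination of diagram 4 can be sketched as follows; the dictionary keys are illustrative names, not attributes of the ETS schemas.

```python
# Sketch of diagram 4: combine the four partial Data Quality Measure (DQM)
# verdicts into one overall item conformance verdict. Key names are
# illustrative only.
def item_conformance(verdicts: dict) -> bool:
    required = ("area", "critical_defect", "contamination",
                "classification_correctness")
    # the item is conforming only if every partial verdict is TRUE
    return all(verdicts.get(k, False) for k in required)

item = {"area": True, "critical_defect": True,
        "contamination": True, "classification_correctness": True}
print(item_conformance(item))   # True: record conformity in the system
item["contamination"] = False
print(item_conformance(item))   # False: at least one DQM failed
```

A missing partial verdict is treated here as non-conforming, which mirrors the rule that all four values must be TRUE.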

7 ETS Inspection variants

Conditions of the image or the ground can interfere with this main activity sequence. The methodology allows for several alternative flows to ensure a suitable inspection for all these conditions.

7.1 Reference parcel aggregation

The reference parcel aggregation method was introduced for agricultural landscapes where the visible cropping pattern often coincides with clusters of complete reference parcels. The principle of crop aggregation is analogous to OTSC AP measurement, where a full crop group is measured if it spans several reference parcels.

Using the reference parcel aggregation option remains in principle optional, but it becomes mandatory where low “feasibility for measurement” results create a methodological or perceived problem:

1. a methodological problem arises when fewer than 200 reference parcels are measured, as this is often the minimum sample size for LQ 12.5 indexed tests;
2. a perceived problem can occur when more than 200 but less than half of the RPs are measured. Indeed, what is the true reference value of an LPIS when more than half of its reference parcels cannot be measured?

So, application of the reference parcel aggregation is subject to an a priori decision based on the 2014 LPIS quality assessment results.

The number of reference parcels that will not be feasible for measurement can, all conditions remaining equal, be estimated as the sum of:

• reference parcels found to be “not feasible for measurement” in 2014;
• reference parcels that were found “feasible for measurement” in 2014, but only by applying the 5 meter buffer rule.

If that sum represents more than 50% of the inspected sample then reference parcel aggregation becomes mandatory.

Any application of the reference parcel aggregation variant is systematic: if applied, aggregation and crop measurement need to be performed for all sampled reference parcels where the conditions apply.
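The a priori decision rule above can be sketched as follows; the function and parameter names are illustrative, not part of the ETS specification.

```python
# Sketch of the a-priori decision rule: reference parcel aggregation
# becomes mandatory when the estimated "not feasible for measurement"
# share of the inspected 2014 sample exceeds 50%.
def aggregation_mandatory(not_feasible_2014: int,
                          feasible_only_with_5m_buffer_2014: int,
                          inspected_sample_size: int) -> bool:
    # sum of the two categories listed in the text above
    estimated_not_feasible = (not_feasible_2014
                              + feasible_only_with_5m_buffer_2014)
    return estimated_not_feasible > 0.5 * inspected_sample_size

# e.g. 70 + 40 = 110 of 200 inspected parcels -> more than half
print(aggregation_mandatory(70, 40, 200))   # True: aggregation is mandatory
print(aggregation_mandatory(30, 20, 200))   # False: it remains optional
```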

Notes:

1. By their very nature, all cadastral parcel and agricultural parcel designs could be more subject to the conditions that require reference parcel aggregation.
2. The 5 m buffer rule is totally obsolete in ETS v6.0.

7.2 Field activities

7.2.1 Context

The introduction of a limited number of dedicated LPIS QA image zones has two direct consequences:

1. In areas with lower reference parcel density, there are fewer parcels in the sample pre-selection. It is no longer evident to skip an item, considering that the sample pre-selection list can be shorter than in ETS v5.3.
2. There should be fewer logistical challenges to organize field activities.

Furthermore, field activities will yield a better overall inspection result as well as a better analysis of those results.

Finally, the EC services expect a rise in field activities, as some inspections (classification correctness) might not be correctly assessed from the imagery alone.

7.2.2 Different types of field activities

ETS v6 still recognizes four processes where field activities are relevant:

• field inspection – parcel inspection is fully based on GNSS field survey;
• combined inspection – parcel inspection is based on merging CAPI delineation and field survey. This usually involves a border inspection in the field;
• perimeter inspection – inspection of a LUI to vindicate a critical defect (formerly called boundary inspection);
• field observation – other activities complementing the normal CAPI inspection.

7.2.3 Implementation

All field activities are subject to a discretionary decision of the Member State on whether they are to replace or support the CAPI inspection. However, that decision must be applied systematically, not at discretion, to all items in the sample. Any discretionary decision would lead to the inclusion of “nice” parcels in the sample and thus bias the results of the ETS.

7.3 Conditional triggers

Several activity diagrams in the detailed technical documentation of the next chapter demonstrate alternative paths to their main path. The path each item should follow is conditioned by the earlier findings on that item. It is important to recall that any item can only follow a single path and cannot jump to activities that are not on that path. The path that an item follows will always determine the content of its inspection records in the ETS reporting package and will be subject to automatic screening during the upload of that package.

8 ETS Data maintenance

8.1 Context

The reference date for the ETS inspection is the same as the deadline for the farmer's application. The observations made during the inspection are based on imagery that can be earlier or later, and possibly on field activities that are organized by the end of the year.

It is acceptable that the sampled item under inspection be updated to match any concurrent system update, on condition that such system update occurred in “tempore non suspectu” (i.e. the LPIS custodian had no hand in the time when the update was initiated).

Considering that the farmer's application is (or should be) following that farmer's indication of any changes on his land, the EC services expect that the rate of items subject to such change will be limited. In any case, the rate of change inside the LPIS QA zones is expected to be representative of the rate of change in the LPIS population as a whole.

8.2 Evidencing

The LPIS quality assessments of the previous years have revealed that the unstructured evidence instructions failed to provide sufficient reliability regarding the “in tempore non suspectu” character of the updates; structured feature meta-information provides an easy but structured alternative.

ETS v6.0 therefore introduces a new xml schema, based on the LCM implementation of anomaly and lifecycle information. It holds the attributes:

• Identification: a unique identification of the anomaly (e.g. an automatically generated sequential number).
• Object: the identifier of the affected reference parcel (RPID).
• Cause: the cause of non-conformity, according to the list of QE4; “changes in land” is expected to be the most frequent cause.
• Observation(s): the details of the observation, including:
   1. Author (processOperator)
   2. Values that are suspected non-conforming: in casu the obsolete reference area as in the LpisPointZeroState
   3. Observation date (phenomenonTime), as reported to the LPIS custodian
   4. Other comments (parameter)
• Status in the process: details on the process closure of the anomaly, including:
   1. Date of reception (resultTime)
   2. Status of actions regarding the confirmation: in casu completed, as the reference area has been updated
   3. Status and logged consequences on the LPIS objects: in casu the updated reference area value
   4. Other comments

Any discrepancy between the reference area values of the LpisPointZeroState and LpisPolygonAreaZeroState should be accompanied by a duly filled-in record within this anomaly file. Its presence and correctness will be automatically checked during the upload of the ETS Reporting Package.
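A minimal sketch of such an anomaly record is shown below as a plain data class. The attribute names are illustrative; only the bracketed identifiers from the list above (processOperator, phenomenonTime, resultTime, parameter) come from the text, and the actual serialization is fixed by the new xml schema, not by this sketch.

```python
# Sketch of the anomaly/lifecycle record described above. Field names are
# illustrative; schema attribute names from the text are kept as comments.
from dataclasses import dataclass

@dataclass
class AnomalyRecord:
    identification: int              # unique, e.g. a sequential number
    rp_id: str                       # Object: affected reference parcel (RPID)
    cause: str                       # from the QE4 list, e.g. "changes in land"
    author: str                      # Observation: processOperator
    obsolete_reference_area: float   # suspected non-conforming value (ha)
    observation_date: str            # phenomenonTime, as reported
    reception_date: str              # Status: resultTime
    confirmation_status: str = "completed"
    updated_reference_area: float = 0.0  # logged consequence on the LPIS
    comments: str = ""               # parameter / other comments

rec = AnomalyRecord(1, "AT-12345-0001", "changes in land", "inspector-07",
                    2.5310, "2015-07-14", "2015-07-20",
                    updated_reference_area=2.1000)
print(rec.rp_id, rec.cause)
```

One such record would accompany each discrepancy between the two zero-state files.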

9 ETS Data Capture

The ETS inspection procedure is explained in full technical detail in the next chapter. It represents the truly technical documentation of the Executable Test Suite of the LPIS QA. The main body of this technical guidance merely introduces the key activities, decisions and output of that workflow.

In general, there are the following phases:

1. the LPIS ETS performs an inspection (a series of observations and measurements) on items preselected by DGJRC;
2. these observations and measurements are analyzed to come up with a parcel conformance statement;
3. the resulting counts of these item conformance statements are used to come up with a conformance statement for each of the quality elements indicated in the Regulation;
4. if any of the quality elements in one of the two conformance classes of the Regulation fails, then the system is non-conforming for that conformance class and a remedial action plan is required.

The general process leading through these phases is illustrated in diagram 2.

Diagram 2: ETS data flow

10 ETS Conformance

10.1 Conformity for a quantitative quality element

The Regulation defines 6 quality elements, grouped into two conformance classes. Based on the item conformance verdicts issued for the various criteria during the item inspection, verdicts will be made on each conformance class.

All quality elements have been expressed in quantitative terms and require measuring or counting. To apply a verdict on a measure that involves counting non-conforming items or non-conformities, limiting quality (LQ) indexes from ISO 2859-2 have been used. Such indexes model the statistical relationship between an expectation, the required sample size and the resulting counts to provide, via the acceptance numbers, a statistically robust verdict on conformity.

The table below offers an extract (the part relevant for current LPIS sizes) of the sampling table of ISO 2859-2; n = sample size, AC = acceptance number.

Lot size          | LQ 2: n / AC | LQ 12.5: n / AC
35001 – 150000    | 500 / 5      | 200 / 18
150001 – 500000   | 800 / 10     | 200 / 18
> 500000          | 1250 / 18    | 200 / 18
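The use of this extract can be sketched as follows: look up n and AC for the lot size and LQ index, then accept the quality element if the count of non-conforming items does not exceed AC. Only the three lot-size rows shown above are encoded, and the function names are illustrative.

```python
# Sketch of the ISO 2859-2 extract above: lot size range -> {LQ: (n, AC)}.
TABLE = {
    (35001, 150000):  {2.0: (500, 5),   12.5: (200, 18)},
    (150001, 500000): {2.0: (800, 10),  12.5: (200, 18)},
    (500001, None):   {2.0: (1250, 18), 12.5: (200, 18)},
}

def plan(lot_size: int, lq: float):
    """Return (sample size n, acceptance number AC) for this lot and LQ."""
    for (lo, hi), plans in TABLE.items():
        if lot_size >= lo and (hi is None or lot_size <= hi):
            return plans[lq]
    raise ValueError("lot size below the extract shown here")

def conforming(lot_size: int, lq: float, non_conforming_count: int) -> bool:
    # accept when the count does not exceed the acceptance number
    n, ac = plan(lot_size, lq)
    return non_conforming_count <= ac

print(plan(1_200_000, 2.0))           # (1250, 18)
print(conforming(80_000, 12.5, 7))    # True  (7 <= AC of 18)
print(conforming(80_000, 2.0, 7))     # False (7 >  AC of 5)
```

The authoritative values remain those of the ISO 2859-2 tables; this sketch only encodes the extract printed above.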

10.2 Conformance class 1

10.2.1 Context

Conformance class 1 means to “assess the quality of LPIS”, and thus counts non-conforming items (either reference parcels or crop aggregates). This factual counting has been used for all elements and is still relevant for the first three quality elements (QE1, QE2 and QE3). Furthermore, counting items offers a straightforward entry for the LPIS upkeep processes.

Element | Value | Description | Expectation | LQ index

1 | – | Relates to the maximum eligible area of the system | – | –
QE1a | QE1A | Looks at the absence of bias (i.e. accuracy) of the land represented in the LPIS as a whole | >= 98%, <= 102% | –
QE1b1 | – | Maximum eligible area: overestimation (LIB) | >= -2% | –
QE1b2 | – | Maximum eligible area: underestimation (UIB) | <= 2% | –
2 | – | Assesses the parcels that have correctness issues | – | –
QE2a1 | – | Proportion of RPs (whole set of data) with incorrectly recorded area, “contaminated” or “misclassified” with ineligible features | < 5% | 12.5
QE2a2 | – | Proportion of RPs (> 0.1 ha) with incorrectly recorded area, “contaminated” or “misclassified” with ineligible features | < 5% | 12.5
QE2b | – | Distribution of RPs, according to the correctness of the eligible area recorded | – | –
3 | QE3 | Assesses the parcels that have functional issues (“Critical Defects”) | < 1% | 2

10.2.2 Expectations

As indicated in the table above, the expectations remain set at < 5% (for QE2) and < 1% (for QE3) non-conforming items. These expectations are tested through acceptance numbers for indexing with limiting quality LQ 12.5 or LQ 2 respectively.

10.2.3 Procedure

Diagram 5 illustrates the procedure to aggregate observations for the conformance class 1 assessment. The details and intermediate values used in the figure are detailed in the technical chapter.


Diagram 5: conformance class 1 testing

10.3 Conformance class 2

10.3.1 Context

Conformance class 2 aims to "identify possible weaknesses", and this requires a broader system-wide analysis, beyond the individual item or reference parcel. This is most obvious for QE4, which analyses the LPIS processes and design as factors for creating quality problems. For instance, a single, large parcel can be contaminated, can include ineligible land and can have its land wrongly classified. Although this represents a single non-conforming item, it does reflect three different system weaknesses.

Element | Value | Description | Expectation | LQ index

4 | QE4 | Categorization of the non-conforming RP | < 5% | 12.5
5 | QE5 | Ratio of total declared area in relation to the total area recorded for the conforming RPs | – | –
6 | QE6 | Percentage, accumulated over the years, of the reference parcels which had been subject to change but were not addressed in IACS | < 25% | –

10.3.2 Expectation

For QE4, the expectation remains at < 5%; however, it is no longer counted as non-conforming items but in terms of non-conformities per 100 items. This expectation is tested through the acceptance number for indexing with limiting quality LQ 12.5.
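The difference with the class 1 counting can be sketched as follows; the sketch assumes the n = 200 / AC = 18 plan for LQ 12.5 from the extract shown earlier, and simplifies the verdict mechanics.

```python
# Sketch of the QE4 counting rule: non-conformities are counted per item
# (one parcel can contribute several), and the total over the sample is
# tested against the LQ 12.5 acceptance number, instead of counting
# non-conforming items as in conformance class 1. AC = 18 is an assumption
# taken from the n = 200 row of the ISO 2859-2 extract.
def qe4_verdict(non_conformities_per_item, acceptance_number=18):
    total = sum(non_conformities_per_item)
    rate_per_100 = 100.0 * total / len(non_conformities_per_item)
    return total <= acceptance_number, rate_per_100

# one parcel with 3 weaknesses counts as 3 non-conformities
sample = [0] * 197 + [3, 1, 1]        # 200 items, 5 non-conformities
ok, rate = qe4_verdict(sample)
print(ok, rate)                        # True 2.5
```

Under class 1 rules the same sample would count only 3 non-conforming items; QE4 counts 5 non-conformities.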

10.3.3 Procedure

Diagram 6 illustrates the procedure to aggregate observations for the conformance class 2 assessment. The details and intermediate values used in the figure are detailed in the technical chapter.


11 ETS Delivery

11.1 Textual reports

11.1.1 Content and delivery

The Regulation requires the MS or its regions to send, by January 31st (i.e. by 31/1/2016 for the 2015 QA), an assessment report and, where appropriate, a remedial action plan to the unit of the European Commission responsible for IACS (DG AGRI D3). This textual document set shall hold:

• a scoreboard and assessment report (via a template);
• if appropriate, a remedial action plan.

11.1.2 Guidance for compiling the assessment report and remedial action plan

Usability drives the assessment of raw scores, to lead to meaningful evaluation results in the particular context of the LPIS. The ETS scoreboard template merges the scoreboard and assessment report into a single document. This combined scoreboard and assessment report shall contain a summary of the Member State's analysis of its scores, in particular relating the scores to the MS context. This combined assessment report should hold not more than 4 pages and use the template provided in the download article.

If appropriate, i.e. when either of the quality elements fails to meet the acceptance threshold and that failure is expected to continue, a plan with the remedial actions and the timetable for their implementation shall be added as a separate WORD document. This summary plan shall build upon the analysis and differentiate between earlier actions. Special consideration shall be given to ongoing refresh projects, if present. The remedial action plan shall hold not more than 2 pages.

This remedial plan has no pre-defined template but should be inspired by the PDCA (Plan, Do, Check, Act) cycle, so possible chapters are:

1. check: explain the observed failure to meet the expectation; provide results of additional tests or the outcome of a study for a better understanding;
2. act: correct obvious failures; implement immediate mitigating actions;
3. plan: explain what the MS intends to do in the long run to address issues not dealt with by the immediate actions;
4. do: point out elements of that plan that have been completed in the meantime.

11.2 Data packages

Data packages hold both the inspection records (ETS reporting package) and the reference imagery.

11.2.1 ETS reporting package

To enable verification of the inspection method applied and the content expressed in the textual documents, the ETS reporting package is to be uploaded to the European Commission. It shall hold:

1. ETS scoreboard: holding the quality element data and scores.
2. ETS observations: raw observations (observed values) for all measures on all inspected parcels of the sample.
3. ETS inspection measurements: geographical features mapped during the ETS inspection for all relevant measures on all inspected parcels of the sample.
4. Boundary inspection and Field Observation records, containing a description of a field visit and a link to its graphical documentation.
5. Sample pre-selection status, containing a list of the inspected reference parcels. Note that this file must also contain the skipped (with a valid reason) reference parcels as well as the remaining unprocessed reference parcels present in the LPIS sample pre-selection.
6. LPIS polygon zero state, which is an extract from the LPIS data under inspection, i.e. reflecting the state at the first step of the ETS. It contains all reference parcels that were either inspected or skipped (geographical and alphanumerical attributes). Note that this file must also contain the boundaries, identifier and reference area of any parcel (additional parcels, not restricted to parcels within the scope of the current assessment year) within a distance of 100 meters from the boundary of the inspected or skipped parcel.
7. If reference parcel aggregation is applied: the table of aggregated parcels, containing the RP ID of each preselected RP and the list of all corresponding RPs inside its aggregate.
8. For cadastral parcel or topographic block or any non-production block system that operates a separate layer to identify agricultural land units: Land administration: all original third party land identification polygons within a distance of 100 meters from the boundary of the inspected parcel (GML: INSPIRE Annex I, cadastral parcel).
9. If applicable, the file with reference parcel update data.
10. Metadata: an ISO 19157 compliant meta record on the reported quality assessment.

11.2.2 Delivery instructions

For the 2015 implementation, all deliveries are due by January 31st, 2016. Deliveries are of 3 different types:

1. The assessment report and, where appropriate, the remedial action plan shall be emailed to [email protected]. The assessment report has to follow the specific template, in MS Word DOC format (see the download page).
2. The ETS reporting package shall be uploaded on the LPIS QA portal.
3. Orthorectified imagery, for which the following instructions apply:
   • LPIS QA VHR, EU-financed: through delivery to CID. This should follow CID's instructions of the imagery delivery CTS2013.
   • ancillary images, MS-financed: the MS has to provide an appropriate URL for the Web Map Service (WMS) or alternatively deliver via the LPIS QA portal (for the latter, please follow the procedure described in non-CwRS image delivery).
