INSPIRE Test Framework - Design Report


Ref. Ares(2016) /11/2016

Joint Research Centre (JRC)

INSPIRE Test Framework - Design Report

Are3na Re3ference Platform

Date: 02/09/2016
Doc. Version: v1.00

Commission européenne, B-1049 Bruxelles / Europese Commissie, B-1049 Brussel - Belgium.
Commission européenne, L-2920 Luxembourg.

Document Control Information

Document Title: INSPIRE Test Framework - design report
Project Title: Are3na Re3ference Platform
Document Author: Jon Herrmann, Clemens Portele
Project Owner: Michael Lutz
Project Manager: Leda Bargiotti
Doc. Version: V1.00
Sensitivity: Public
Date: 02/09/2016
Status: Accepted

Document Approver(s) and Reviewer(s):
NOTE: All Approvers are required. Records of each approver must be maintained. All Reviewers in the list are considered required unless explicitly listed as Optional.

Name / Role / Action
Pieter Breyne / Quality Assurance Manager / Review
Michael Lutz / Project Owner / Accepted

Document history: The Document Author is authorized to make the following types of changes to the document without requiring that the document be re-approved: editorial, formatting, and spelling; clarification. To request a change to this document, contact the Document Author or Owner.

Configuration Management: Document Location. The latest version of this controlled document is stored on the CITnet ARE3NA wiki at the following link:

Date: 02/08/ / 69 Doc. Version: v0.4

TABLE OF CONTENTS

1 EXECUTIVE SUMMARY
2 SCOPE
3 GLOSSARY
4 REQUIREMENTS AND PRIORITIES
5 SOLUTION STRATEGY
   5.1 Overview and Technical Context
   5.2 Existing Components
6 SYSTEM STRUCTURE
   6.1 Component overview
   6.2 Test Framework Core
   6.3 Test Drivers
   6.4 Web Application
   6.5 Data Source
   Result Transformer
7 RUNTIME VIEW
   Define and retrieve Test Objects
   Starting a Test
8 CONCEPTS
   8.1 The INSPIRE Test Framework Model
      Common domain model properties and hierarchy
      Specification model items
      Test model items
      Result model items
      Overview of the model with all classes
   Web Interface
      User Scenarios
      Activities
   REST API
   GITB API
   8.5 Specifications for Abstract Test Suites
   Specifications for Executable Test Suite Languages
   Requirements by Test Object Type
      General requirements
      Web services
      OGC web services
      ATOM feeds and OpenSearch endpoints
      XML documents
      GML data
      Other data (e.g. binary data)
      GMD metadata
   Specifications by Test Engine
      Overview
      SoapUI
      BaseX
      TeamEngine
      8.7.5 Schematron reference implementation
   Integration of tests for external conformance classes
   Test Reports
9 DEPLOYMENT OPTIONS
10 MANAGEMENT OF TEST SUITES (ABSTRACT AND EXECUTABLE)
11 IMPLEMENTATION OVERVIEW
12 DOCUMENTATION
APPENDIX 1: REFERENCES AND RELATED DOCUMENTS
APPENDIX 2: PREDEFINED MODEL ITEM PROPERTY KEYS
APPENDIX 3: ARCHITECTURE EVALUATION

1 EXECUTIVE SUMMARY

This report documents the requirements and the proposed design for the INSPIRE Test Framework. The design covers the software architecture and the design decisions of an open source framework for testing INSPIRE data sets, metadata and services. The work is part of work package 2.4 of the Are3na Re3ference Platform - Conformance and interoperability testing. This design will be the basis for the implementation of the INSPIRE Test Framework.

2 SCOPE

The MIG work programme states that, to ensure that results from tests of conformity are identical, a common, officially approved validator should be accessible from the INSPIRE web. That validator should support conformance tests against spatial data sets, metadata and spatial data services. This report documents the requirements for an open source framework for testing INSPIRE spatial data sets, metadata and spatial data services (the "INSPIRE Test Framework"), specifies a software architecture addressing these requirements, and captures design decisions. The design is based on prior and ongoing work in the INSPIRE MIG-T, i.e. the MIWP-5 group, and it has been developed with the support of the MIWP-5 group, mainly through reviews. This document is also related to the activity on interoperability and conformance testing in ISA, i.e. the Global ebusiness Interoperability Test Bed (GITB) project. The relationship with GITB is analysed in detail in a separate document, the GITB INSPIRE Report (deliverable SC235_D4.4.1 [3]), and the results of that work are taken into account in the design specified in this document. The INSPIRE Test Framework will be implemented based on the design specified in this document (deliverables SC235_D4.3.2.x). Executable Test Suites, implementing selected Abstract Test Suites for the INSPIRE Technical Guidelines for spatial data sets, metadata and spatial data services, and executable in the INSPIRE Test Framework, will also be developed (deliverables SC235_D4.3.3.x).
The INSPIRE Test Framework, together with the Executable Test Suites, will provide the validator capability identified in the MIG work programme.

3 GLOSSARY

The table below lists key terms and definitions in the context of the INSPIRE Test Framework.

Table 1 - Glossary

abstract test case (synonym: conformance test case) [OGC Specification Model]
   test for a particular requirement or a set of related requirements [OGC Specification Model]
   NOTE: An abstract or conformance test case is a formal basis for deriving executable test cases. It should be complete in the sense that it is sufficient to enable a test verdict to be assigned unambiguously to each potentially observable test outcome.

abstract test module (synonym: conformance test module)
   set of related tests, all within a single conformance class [OGC Specification Model]

abstract test suite (synonym: ATS) [ISO 19105]
   set of conformance classes that define tests for all requirements of a specification [derived from OGC Specification Model and ISO 19105]

API
   set of routines, protocols, and tools for building software and applications [Wikipedia]
   NOTE: In the INSPIRE Test Framework, APIs are used for the internal communication between the components of the test framework and to allow machines to communicate with the INSPIRE Test Framework.

conformance class
   set of conformance test modules that must be applied to receive a single certificate of conformance [OGC Specification Model]

consistency test
   test case that includes one or more test assertions involving information from both the test object and from a resource referenced from the test object
   EXAMPLE: Comparison of consistency between the service capabilities and the metadata record of the service

executable test suite (synonym: ETS) [ISO 19105]
   set of executable test cases [ISO 19105]

executable test suite set
   set of executable test suites

requirement
   expression in the content of a document conveying criteria to be fulfilled if compliance with the document is to be claimed and from which no deviation is permitted [ISO Directives Part 2]

specification
   document containing recommendations, requirements and conformance tests for those requirements [OGC Specification Model]

test assertion
   atomic test in a test step
   NOTE: The assertion tests information retrieved from the test object in a test step against an expected value based on the requirements associated with the test case.

test case (synonym: executable test case) [ISO 19105]
   specific test of an implementation to meet particular requirements [ISO 19105]
   NOTE: Instantiation of an abstract test case with values.

test driver
   component of the test framework that communicates with a test engine during a test run

test engine
   a tool that executes an Executable Test Suite

test framework
   platform that supports multiple test engines, multiple test object types and the generation of harmonised test reports across the different test engines

test object (synonym: SUT) [ISO 19105]
   INSPIRE asset under test

test object type
   classification of a test object
   NOTE: An Executable Test Suite declares which types of test objects it supports. This makes it possible to determine which Executable Test Suites may be included in a test run against a test object. Test object types are structured in a hierarchical vocabulary with the root elements service and document.

test report
   report derived from the test result
   NOTE: Reports may be human-readable (HTML) or machine-readable (XML, JSON) and they may be structured in different ways (usually by test suite, module, case, step and assertion).

test result
   result of a test run

test run
   execution of one or more test tasks
   NOTE: In a test run one or more Executable Test Suites will be executed against one or more test objects.

test step
   an interaction with a test object as part of a test case
   NOTE: Multiple steps may be necessary to execute a test case. A test step will be, for example, a query on data or a request on a web service. The result of the interaction is then subject to assertions.

test task
   execution of an Executable Test Suite against a test object

validation test
   test case that is executed on the test object alone

4 REQUIREMENTS AND PRIORITIES

The INSPIRE Test Framework is intended as a reusable reference tool for conformance testing in INSPIRE, available under an open source license and contributing to wider interoperability testing relevant to ISA. It should support the validation of components of the infrastructure (data sets, metadata and services) against the conformance classes specified in the INSPIRE Technical Guidance documents. Furthermore, it should be configurable by organisations to define additional test suites for conformance testing against community-specific INSPIRE profiles and extensions. Initial requirements for the INSPIRE Test Framework were gathered in the MIWP-5 Validation & Conformity subgroup. In the MIWP-5 Briefing and Planning meeting on the INSPIRE Test Framework implementation with JRC and MIWP-5 subgroup leaders in December 2015, the

user expectations and key requirements for the first version of the INSPIRE Test Framework were determined. The requirements were prioritized using the MoSCoW approach 1. Requirements with a MUST priority should be the minimum that will be available at the end of this project. Additional requirements may be implemented based on the available time and resources.

The main actors targeted by the first version of the INSPIRE Test Framework are the data and service providers, in order to support them in meeting their implementation obligations by having a tool that executes agreed tests against their spatial data sets, services and metadata. The main goal is to help data and service providers, not to monitor them. Other actors that have been mentioned as potential future beneficiaries of an INSPIRE Test Framework are: the INSPIRE geo-portal, national coordinators, users of spatial data, solution providers, the European Commission, test developers, tool providers and MIG-T members.

The following table provides an overview of the user expectations of data and service providers.

Table 2 - User expectations of data and service providers

UE-1 (M) I can validate a Test Object against a set of Executable Test Suites selected by me
UE-2 (S/C) I can validate a Test Object against a subset of the test cases in an Executable Test Suite where the subset is selected by me
UE-3 (C/W) I can test a Test Object against all applicable Executable Test Suites that are in force
UE-4 (M) I can use a central deployment of the validator to test a Test Object
UE-5 (M) I can use a local deployment of the validator in my network to test a Test Object
UE-6 (M) I get informed about the progress of a Test Run and the Test Result in a useful way
UE-7 (S/C) I can share a Test Report using a hyperlink to it from metadata, websites, etc.
UE-8 (M) The Test Report includes text that I can copy into the conformance section of the metadata of the spatial data set, service or metadata
UE-9 (S/C/W) I can find the history of my Test Results
UE-10 (M) I can use the tests in my production and publication workflow using an API
UE-11 (M) I can add my own Executable Test Suites for INSPIRE extensions / profiles that I want to support
UE-12 (C/W) I can find who had similar issues
UE-13 (S/C) I can control who has access to my Test Results

This list of user expectations of data and service providers has been the main source for identifying and prioritizing requirements. The architectural requirements capture the essential functionality of the system.

1 The MoSCoW method is a prioritization technique for requirements. The term MoSCoW itself is an acronym derived from the first letter of each of four prioritization categories (Must have, Should have, Could have, and Would like but won't get).

Table 3 - Architecture and licensing

A-1 (M) The INSPIRE Test Framework is deployable centrally and in the user's environment
A-2 (M) The INSPIRE Test Framework is a generic test framework that can process one or more Executable Test Suites
A-3 (S/C) It should be possible to update or add Test Cases without needing to recompile, rebuild or redeploy the Test Engine
A-4 (S/C) Existing Executable Test Suites can be reused
A-5 (M) The INSPIRE Test Framework and the INSPIRE Executable Test Suites are open source
A-6 (M) The INSPIRE Test Framework supports different Test Engines in order to validate different types of Test Objects:
A-6.1 (M) Support for tests on XML data sets (potentially very large)
A-6.2 (M) Support for tests on XML metadata records (smaller documents, together with the network service tests)
A-6.3 (M) Support for general conformance tests on network services (profiles of OGC Web Services, ATOM feeds)
A-6.4 (S/C) Support for Quality-of-Service tests on network services. Note: There are open questions how this can and should be tested properly (interference with other users, network latency and bandwidth, etc.)
A-7 (M) The INSPIRE Test Framework can be controlled through a GUI and an API
A-8 (M) The INSPIRE Test Framework provides test progress information during Test Runs
A-9 (S/C) If a Test Run fails, an email is sent to the user
A-10 (S/C) Combinations of Test Objects, Executable Test Suites and parameter values can be saved as Test Run templates
A-11 (C/W) Periodic Test Runs can be scheduled for Test Run templates
A-12 (M) The INSPIRE Test Framework GUI is multilingual. Note: Abstract and Executable Test Suites will be in English.
A-13 (M) The domain model of the relevant artefacts (Abstract and Executable Test Suites, Test Objects, Test Results, etc.) needs to be stable and agreed early.

The following requirements are related to the Executable Test Suites and their management.

Table 4 - Executable Test Suites and their management

TM-1 (M) Executable Test Suites should allow easy modification
TM-2 (M) Executable Test Suites are versioned on two levels: first, on the Conformance Class / Technical Guideline version; second, on the version of the Executable Test Suite for the Conformance Class / Technical Guideline version
TM-3 (S/C) Older Executable Test Suites are available, but are marked as deprecated
TM-4 (S/C) Enable validation against specific versions of Executable Test Suites
TM-5 (C/W) In an Executable Test Suite, support versioning of Test Cases

TM-6 (M) Support classification of the severity (or weight) of each Test Assertion
TM-7 (S/C) Recommendations are tested and failures are reported as warnings
TM-8 (M) Assertions in Executable Test Suites link to the Abstract Test Suites and the requirements they test

The following requirements are related to reporting capabilities.

Table 5 - Reporting

TR-1 (M) Test Reports are provided in a human-readable representation (HTML)
TR-2 (M) Test Reports are provided in a machine-readable representation (XML or JSON)
TR-3 (M) Test Reports may be multilingual. Note: Text specified in the Executable Test Suite will be in English
TR-4 (M) Test Results and Test Reports are time-stamped and include the version of the Executable Test Suite(s). NOTE: There is an issue with the current data specifications where the schemas have changed from version 3 to 4, but the specification document has not changed. In general, if there is a new schema version with a new namespace, there needs to be an updated specification document, too.
TR-5 (M) Test Reports include links to the abstract test case, the executable test case and the IR/TG requirements for test cases with errors / warnings
TR-6 (S/C) Test Reports have a persistent HTTP URI
TR-7 (S/C) Access to Test Reports can be restricted
TR-8 (S/C) A user dashboard with direct access to my Test Results
TR-9 (S/C/W) Require registration of users

The following requirements are related to test objects and their types.

Table 6 - Test Object Types

TO-1 (M) Support validation of a metadata record, download services and spatial data of the data themes of Annex I of the Directive
TO-2 (S) Support validation of view and discovery services
TO-3 (S) Support validation of spatial data of the data themes of Annexes II and III of the Directive
TO-4 (C) Support validation of spatial data services

Appendix 3 includes an overview of how and to what extent the design in this document covers the requirements and user expectations.

5 SOLUTION STRATEGY

5.1 Overview and Technical Context

The requirements demand a flexible framework that supports the interaction with and testing of different Test Object Types and that is capable of supporting multiple Test Engines. Although the underlying data and message encoding is mostly XML at the moment, the Test Object Types differ in the ways the data has to be processed during test execution in a Test Engine. Well-established open source Test Engines already exist, and these are usually designed for specific types of Test Objects. For example, the TEAM Engine and SoapUI are primarily focused on testing web services, while XML databases like BaseX are capable of testing very large XML documents. Therefore, the INSPIRE Test Framework is designed as a component-based system to guarantee a flexible architecture. To take advantage of already existing solutions, well-established Test Engines may be integrated with Test Drivers as adapters. This plug-in design also allows adding additional Test Engines in the future.

The technical context of the system and the interaction with actors and other systems are shown in Figure 1. The INSPIRE Test Framework is controlled via an HTML web interface or the REST API. We consider the INSPIRE Test Framework to comprise those components that may be developed or adapted during the development of the framework. In addition, there are components that will be part of any deployment of the INSPIRE Test Framework. These are:

- A database that is used to store data like test reports, information about test runs, internal data about test objects, etc.
- One or more Test Engines necessary to execute the Executable Test Suites that will be available as part of the deployment.
- A Servlet container to host the web application of the INSPIRE Test Framework.

A Test Developer implements Executable Test Suites derived from the Conformance Classes of an Abstract Test Suite for a specific Test Engine and publishes the Executable Test Suite in a Test Repository. Currently, the INSPIRE Abstract Test Suites are documented on GitHub. In the future, this documentation may be moved elsewhere. The INSPIRE Abstract Test Suites, Abstract Test Cases, Requirements, etc. will all receive persistent URIs that redirect to the current storage location of the resource; the same applies to the Executable Test Suites. The HTTP URI scheme for the Abstract Test Suites is specified in 8.5. In addition, there will be external resources that may be accessed via http(s) during a Test Run. Typical examples are XML Schema documents and entries in the INSPIRE code list register.

Figure 1: Technical Context View

The INSPIRE Test Framework executes the Executable Test Suites with a Test Driver supporting the Test Engine of that suite. It will parse the associated INSPIRE Abstract Test Suite for its metadata. This requires that the Abstract Test Suites are available in a representation that can be processed and analysed by software. The Test Results with the observations collected during a Test Run and the Abstract Test Suite metadata are stored in the database. This information will be used in the Test Reports that are provided to the user. Abstract and Executable Test Suites have to be developed with sufficient information so that users are in a position to understand the Test Reports and software is capable of processing Test Result documents.

5.2 Existing Components

The design specified in this document is based on existing Test Engine components that can be used to execute tests. Some Test Engine candidates are identified in this document and more may be needed in the future, for example, when coverage data is to be tested or if capacity or performance tests should be supported. In addition, for several of the conformance classes defined in or referenced from INSPIRE Technical Guidelines there are existing Executable Test Suites that can be used as a starting point (in the case of INSPIRE conformance classes) or re-used as-is (in the case of third-party conformance classes, in particular for OGC standards). Typically, each Test Engine will use a different structure to capture test results. In order to have a consistent experience for both end users (using the user interface) and developers (using the REST API), an additional layer is needed that can invoke the different Test Engines and store and present the Test Results in a consistent way with helpful information for users. This is the role of the INSPIRE Test Framework.

While such a framework could, in principle, be implemented from scratch, it should be based on existing software considering the existing constraints on time and resources. Potential candidates for the software include ETF, the TeamEngine and the GITB POC. This document assumes that ETF will be used as the basis for the INSPIRE Test Framework as it:

- is based on the concepts of ISO and the OGC specification model,
- already meets many of the "must" requirements identified in chapter 4,
- already supports test engines for validating web services (SoapUI) and very large XML document sets (BaseX),
- has existing Executable Test Suites for INSPIRE view and download services (publication on GitHub in progress, by end of April 2016) that are used at least in the Netherlands and the ELF project,
- is the candidate with which the authors of this document are most familiar.

The additional developments necessary to meet the INSPIRE requirements would be included in the ETF code base. The current draft design is driven by the requirements in chapter 4 (mainly the "must" requirements, but also some of those with "should", "could" or "would" priority; see Appendix 3 for an evaluation of the requirements), knowledge about the existing INSPIRE Abstract Test Suites and the knowledge of the document authors regarding user expectations for validation software. As a result, this design is in parts more capable than would be strictly necessary to meet the minimum requirements; i.e., it is not necessary to have everything in the first release, but the goal is to provide a complete design. To meet time and resource constraints it may be necessary to drop some non-essential characteristics in the first release. Candidates for this are, for example:

- the Statistical Result Table
- support for multiple result styles
- automated detection of the type of a test object
- some parts of the API, e.g. the capabilities view or WebSockets

6 SYSTEM STRUCTURE

This section describes the decomposition of the test framework into components. Each INSPIRE Test Framework component is packaged as a library.

6.1 Component overview

The test framework consists of different components which are organized in three broad layers:

Figure 2: Architecture layers

The Presentation layer handles HTTP requests and renders HTML for the user or returns machine-readable responses. In most cases it accesses the Service and Domain layer, but in a few cases direct access to the Data Mapping and Data Access layer is required. The Service and Domain layer contains the domain model and the business logic and only accesses the Data Mapping and Data Access layer. The Data Mapping and Data Access layer is responsible for persisting data or accessing remote services. The components and interfaces are shown in the high-level diagram:

Figure 3: Component View

Components can be assigned to the layers from Figure 2 based on the colouring.

Several APIs are used for communication between the components. The first API ("REST API") allows external components to communicate programmatically with the INSPIRE Test Framework. One API is classified as an SPI ("Service Provider Interface"), which is an API intended to be implemented or extended by a third party - in this case to extend the framework with additional capabilities provided by a new Test Engine.

Table 7 - API overview

REST API
   The Web Application provides a REST API to enable software to query available Executable Test Suites and Test Results stored in the database, and to enable the management of Test Objects and Test Runs.

Model API
   The Model API assists the Data Source, the Web Application and the Test Framework Core with sharing the internal INSPIRE Test Framework domain model.

Test Driver SPI
   The Test Driver service provider interface enables the Test Framework Core to control the Test Drivers and to collect the Test Results.

Test Engine APIs
   Each Test Engine will have its own API. This API is used by the Test Driver for the Test Engine to communicate with the Test Engine during a Test Run.

The components shown in the figure are described in the following table.

Table 8 - Component overview

Test Framework Core
   The Test Framework Core provides the basic functionality of the whole framework. It manages and loads the test drivers into an isolated runtime container and stores the results in a database.

Web Application
   The Web Application provides a Web GUI and enables the User to use the INSPIRE Test Framework from a web browser. The component also provides a REST API to support automated use of the test framework by software. The component lists available Executable Test Suites (and the associated Abstract Test Suite metadata) and Test Results stored in the database, and enables the management of Test Objects and Test Runs.

Test Drivers
   The figure shows two Test Drivers, one for testing services and one for data tests. These are examples and simply indicate that there will be multiple Test Drivers to support different Test Engines. Test Drivers are run in an internal, isolated runtime container and translate test run commands and results between the Test Framework Core and a Test Engine.

Data Source
   The Data Source receives and stores the Test Results and other internal domain model data in a connected database. It also looks up Abstract Test Suite metadata in the Abstract Test Suite Repository (currently on GitHub).

Report Transformer
   The Report Transformer is used to transform the test results into a human- or machine-readable format.
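The Test Driver SPI boundary in Table 7 can be illustrated with a minimal sketch. All names below (the TestDriver interface, the driverFor method, the engine and suite identifiers) are simplified stand-ins for illustration, not the actual ETF interfaces.

```java
import java.util.List;

public class Main {

    // Hypothetical, reduced Test Driver SPI: a driver adapts one Test Engine
    // and runs one Executable Test Suite against one Test Object.
    interface TestDriver {
        String engineName();
        String run(String executableTestSuiteId, String testObjectUri);
    }

    // The Test Framework Core selects the driver matching the engine
    // declared by the Executable Test Suite.
    static TestDriver driverFor(List<TestDriver> drivers, String engine) {
        return drivers.stream()
                .filter(d -> d.engineName().equals(engine))
                .findFirst()
                .orElseThrow(() ->
                        new IllegalArgumentException("No Test Driver for engine: " + engine));
    }

    // Small demonstration with a fake SoapUI driver.
    static String demo() {
        TestDriver fakeSoapUi = new TestDriver() {
            public String engineName() { return "SoapUI"; }
            public String run(String ets, String obj) {
                return "result of " + ets + " on " + obj;
            }
        };
        TestDriver driver = driverFor(List.of(fakeSoapUi), "SoapUI");
        return driver.run("ets-view-service", "https://example.org/wms");
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

The point of the sketch is the indirection: the core only depends on the SPI, so adding a Test Engine means adding a driver, not changing the core.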

6.2 Test Framework Core

The Test Framework Core is part of the Service and Domain layer and implements the basic capabilities of the test framework. Figure 6 shows a simplified high-level view of important Test Framework Core classes which are used to create and manage test runs. The Domain Model is not part of this section and is described in full in section 8.1. A Test Run (class TestRun) is the execution of one or more Executable Test Suites to test one or more Test Objects. Each execution of one Executable Test Suite is represented by a Test Task (class TestTask). This is shown in Figure 4.

Figure 4: Test Run structure

During the test run a TestTask enters different states, represented by the STATE enumeration. The transitions of the TestTask states are shown in Figure 5.

Figure 5: Task Run state transitions

A TestTask is created right after the User starts the test and is then initialized. If the initialization fails, the task changes its state to FAILED_FINALIZING. It is important to note that a FAILED state does not represent a negative test result but a problem in the test itself, caused by a problem in an Executable Test Suite implementation or the test driver implementation, or by insufficient resources. If the initialization sequence is successful, the TestTask is enqueued with the other TestTasks in the test run until it gets scheduled and enters the RUNNING state. After the TestTask has finished successfully (COMPLETED_FINALIZING state), failed (FAILED_FINALIZING state) or been cancelled by the user (CANCELING state), the TestTask resources are freed and the information about the last state is persisted in the system (together with the results if the test run finished successfully). The TestTask class is shown in the context of other important Test Driver classes in Figure 6.
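The state transitions described above can be sketched as follows. Only RUNNING, COMPLETED_FINALIZING, FAILED_FINALIZING and CANCELING are named in the text; CREATED, INITIALIZING, QUEUED and DONE are assumed placeholder names for the remaining stages, so the actual STATE enumeration may differ.

```java
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

public class Main {

    // Partly hypothetical reconstruction of the TestTask STATE enumeration.
    enum State { CREATED, INITIALIZING, QUEUED, RUNNING,
                 COMPLETED_FINALIZING, FAILED_FINALIZING, CANCELING, DONE }

    // Allowed transitions as described in the text: initialization may fail,
    // a queued or running task may be cancelled, and every finalizing state
    // ends with resources freed and the last state persisted.
    static final Map<State, Set<State>> TRANSITIONS = Map.of(
            State.CREATED, EnumSet.of(State.INITIALIZING),
            State.INITIALIZING, EnumSet.of(State.QUEUED, State.FAILED_FINALIZING),
            State.QUEUED, EnumSet.of(State.RUNNING, State.CANCELING),
            State.RUNNING, EnumSet.of(State.COMPLETED_FINALIZING,
                    State.FAILED_FINALIZING, State.CANCELING),
            State.COMPLETED_FINALIZING, EnumSet.of(State.DONE),
            State.FAILED_FINALIZING, EnumSet.of(State.DONE),
            State.CANCELING, EnumSet.of(State.DONE),
            State.DONE, EnumSet.noneOf(State.class));

    static boolean canTransition(State from, State to) {
        return TRANSITIONS.getOrDefault(from, Set.of()).contains(to);
    }

    public static void main(String[] args) {
        System.out.println(canTransition(State.RUNNING, State.COMPLETED_FINALIZING));
        System.out.println(canTransition(State.DONE, State.RUNNING));
    }
}
```

Encoding the transitions in a table like this makes illegal state changes detectable at a single point instead of being scattered over the scheduler code.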

Figure 6: Test Framework Core - important classes

The Test Framework loads a test driver, which implements the TestDriver interface, via the TestDriverManager object. The test driver instantiates a factory of type TestRunTaskFactory which is used to create objects of type TestRunTask. These objects represent a single test run. The TestSuiteDao manages all ExecutableTestSuite objects for a TestDriver. An ExecutableTestSuite can be used by one or more TestTask objects, each of which represents a test run that validates a single Test Object. The RunnableTestTask interface extends the Java Runnable interface (not shown in the diagram). A derived implementation is started by passing the object as a parameter to the TaskPoolRegistry's submitTask method. The progress of the task can be retrieved on demand by using the TaskProgress interface. A TestObject has one or more types, which are used to check whether an ExecutableTestSuite is applicable to a TestObject. The INSPIRE Test Framework provides a TestObjectTypeDetector interface which is used by concrete TestSuiteDao implementations to detect the TestObject type, as shown in Figure 7.
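The detection step can be sketched as a small URL-based detector of the kind used for WFS endpoints; the regular expression and the type identifier below are illustrative assumptions, not the actual WfsServiceDetector implementation.

```java
import java.util.Optional;
import java.util.regex.Pattern;

public class Main {

    // Hypothetical URL-based detector: a regular expression looks for a WFS
    // hint in the service endpoint URL to narrow down the Test Object type.
    private static final Pattern WFS_HINT =
            Pattern.compile("(?i)(service=wfs|/wfs\\b)");

    static Optional<String> detectType(String endpointUrl) {
        if (WFS_HINT.matcher(endpointUrl).find()) {
            return Optional.of("service.wfs"); // illustrative type identifier
        }
        return Optional.empty(); // no match: the user selects from the full type list
    }

    public static void main(String[] args) {
        System.out.println(detectType(
                "https://example.org/ows?service=WFS&request=GetCapabilities"));
        System.out.println(detectType("https://example.org/csw"));
    }
}
```

A content-based detector would work the same way but inspect the capabilities document behind the URL instead of the URL string itself.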

Figure 7: TestObjectType detector

The class TestObjectType, its associations and its hierarchy are described in detail in section 8.1. The detectors are part of the Service Provider Interface package, as exemplified by the WfsServiceDetector or the InspireViewServiceDetector. In future releases a plug-in mechanism could be considered to extend the detectors. When the User creates a TestObject, the detectors' task is to reduce the choice of TestObject types the user has to select from. A detector can, for example, be based on a regular expression that parses the URL of a service endpoint and detects a pattern like wfs, or it can analyse the content the URL points to.

6.3 Test Drivers

Each Test Driver implements interfaces from the service provider interface package, shown in Figure 8 for the TeamEngine test driver.

Figure 8: Test Driver implementation example (using TeamEngine as the test engine)

A Test Driver must implement or derive from the TestDriver, TestRunTaskFactory, TestSuiteDao, ExecutableTestSuite and RunnableTestTask classes and interfaces. The TestSuiteDao manages the ExecutableTestSuite objects, which are only used inside the Service and Domain layer. To provide other layers with the data of an ExecutableTestSuite, the ExecutableTestSuiteDto uses the DtoAssembler to transform the object into a data transfer object. A test driver must either create a complete Test Result object structure (explained in detail in the Domain model section, 8.1.4), which will be transformed to XML by the framework, or directly output a result as XML. The three test drivers for SoapUI, BaseX and TeamEngine persist results as follows:

- The SoapUI test driver creates the test result object structure, which is afterwards serialized to XML.
- The BaseX test driver outputs the results directly as XML.
- The TeamEngine test driver outputs the results as XML in the TeamEngine-specific output format, which is afterwards transformed to the INSPIRE test framework specific result format.
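The transformation step for engine-specific output can be illustrated with a verdict mapping; both verdict vocabularies below are assumptions for illustration only (the real transformation operates on complete XML result documents, not on single status strings).

```java
import java.util.Map;

public class Main {

    // Hypothetical mapping from an engine-specific verdict vocabulary to the
    // test framework's result statuses; both sides are illustrative.
    static final Map<String, String> VERDICT_MAP = Map.of(
            "PASS", "PASSED",
            "FAIL", "FAILED",
            "WARNING", "WARNING",
            "SKIPPED", "NOT_APPLICABLE");

    static String normalize(String engineVerdict) {
        String status = VERDICT_MAP.get(engineVerdict);
        if (status == null) {
            // Unknown verdicts indicate a driver problem, not a test failure.
            throw new IllegalArgumentException("Unknown verdict: " + engineVerdict);
        }
        return status;
    }

    public static void main(String[] args) {
        System.out.println(normalize("FAIL"));
    }
}
```

Normalising verdicts in the driver is what allows the framework to render harmonised Test Reports across engines.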

6.4 Web Application

The Web Application is part of the Presentation layer; it handles HTTP requests and returns HTML for the user or machine-readable responses. Figure 9 shows the pages, the controllers and important classes that are used from the sublayers.

Figure 9: Web Application - Controller classes

For each of the important domain model items (TestRun, TestResult, TestObject and ExecutableTestSuite) one controller exists. Queries from the web are processed and corresponding classes from the sublayers are called. For example, starting a Test Run will cause the TestRunController to call the TestDriverManager to start the Test Run in the matching driver. Property files for each language are used by the controllers to translate the web interface into the requested language. As a fallback, English is used if the requested translation does not exist.

6.5 Data Source

The data source is used to save test results and Test Objects, and for accessing local/remote Executable Test Suites and Abstract Test Suites.

Figure 10: Data Source - Data Access Object classes

For accessing data in the framework, the data access object pattern is used, which provides an abstract interface to the database or a remote repository. This is represented by the Dao interface, which can be used to create, delete, retrieve (one item with the get method or multiple with the getAll method) or update data. As return values and parameter types, classes of the type Dto (based on the Data Transfer Object pattern) are used. Classes that implement the Dto interface represent the same concept as a ModelItem from the Domain Model (discussed in section 8.1.1) but are used to reduce overhead and to exchange data between different architecture layers. Dto objects must only implement a getId method used to identify the object. For serializing and deserializing a Dto object, the DtoAssembler is used, which holds information on how to convert information into the different object representations.

Serialized objects (except ExecutableTestSuite and AbstractTestSuite Dtos) are persisted in an XML database, which means that files are created and index structures are built to speed up the querying of data. As the framework does not need to process queries with complex relations, the NoSQL approach simplifies the development and reduces dependencies on additional SQL libraries. Due to the design of the database interface of the framework, this could also be changed in future releases, if necessary. ExecutableTestSuite and AbstractTestSuite Dtos can only be loaded, either from a file or from a remote repository. A remote repository is configured through a configuration file.
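A minimal in-memory sketch of the Dao/Dto contract described above. The operation names (create, get, getAll, update, delete) and the getId requirement come from the text; everything else, including the Python rendering itself, is illustrative.

```python
from abc import ABC, abstractmethod

class Dto(ABC):
    """Data transfer object: must only expose an ID (getId in the text)."""
    @abstractmethod
    def get_id(self):
        ...

class TestObjectDto(Dto):
    def __init__(self, object_id, resource):
        self._id = object_id
        self.resource = resource
    def get_id(self):
        return self._id

class InMemoryDao:
    """In-memory stand-in for the XML-database-backed Dao implementation."""
    def __init__(self):
        self._store = {}
    def create(self, dto):
        self._store[dto.get_id()] = dto
    def get(self, object_id):
        return self._store.get(object_id)
    def get_all(self):
        return list(self._store.values())
    def update(self, dto):
        self._store[dto.get_id()] = dto
    def delete(self, object_id):
        self._store.pop(object_id, None)
```

A real implementation would serialize the Dto to XML via the DtoAssembler instead of keeping objects in a dictionary.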

6.6 Result Transformer

To transform the machine-generated test results into human-readable test reports or other result formats like EARL, Result Transformers are used, shown in Figure 11.

Figure 11: Result transformer classes

Each transformer possesses a name, methods to check whether it can transform a Test Result or a Statistical Report Table (explained in the Domain model section), and a method to perform the transformation. As described in the Data Source section, the Test Results and also the Statistical Report Table are persisted as XML. Therefore, the XslResultTransformer can use these data directly and transform them using an XSLT stylesheet.

7 RUNTIME VIEW

Some typical use cases exist for the INSPIRE Test Framework, for which the interaction between the higher-level structures is described as UML sequence diagrams in this section. Domain model classes used in the diagrams are explained in detail in section 8.1.

7.1 Define and retrieve Test Objects

SoapUI is used as an example of a test driver for web service tests in the sequence diagram in Figure 12.

Figure 12: Runtime View - Definition of a Test Object

On application start the TestSuiteController requests all supported TestObject types. When the user opens the view and enters an endpoint, the TestSuiteController checks whether the SoapUITestSuiteDao can detect the TestObject type and returns the suggested type to the user. The communication is done via REST in the background of the view. If the user rejects the suggested types, the TestSuiteController shows all known TestObject types, from which the user can select one. The created TestObject is persisted in the database via the BasexTestObjectDao and afterwards shown to the user in the TestObjectView. Figure 13 shows the sequence diagram for the case that a user opens a view showing the first 100 created TestObjects.

Figure 13: Runtime View - Retrieving multiple Test Objects

The TestObjectController requests the first 100 persisted TestObjects from the database. The database layer deserializes the XML and creates objects, which are then inserted into the view.

7.2 Starting a Test

The following sequence diagram shows the start of a test and the transfer of the Dtos through the layers.

Figure 14: Runtime View - Start a Test, part 1

The user opens the CreateTestRunView, selects the TestObject and an Executable Test Suite, sets the parameters, and afterwards repeats these steps to select a second TestObject, etc. The user then clicks the start button and is redirected to the monitor view.

Figure 15: Runtime View - Start a Test, part 2

The TestRunController starts the TestTask through the TaskPoolRegistry. The monitor view continuously polls the status and progress of the TestTask through the TaskPoolRegistry, which returns the TestTask with an associated TaskProgress object. The TaskProgress object is used to update the view and show the progress to the user. After the test finishes, the results are transformed and persisted via the TestResultDao. The view is updated by the TestRunController, which redirects the user to the ResultView showing a human-readable test report.

8 CONCEPTS

8.1 The INSPIRE Test Framework Model

The INSPIRE Test Framework Domain Model documents the detailed requirements regarding the information that is relevant for Abstract and Executable Test Suites, Test Objects, Test Runs, Test Results and Test Reports. This builds on the high-level requirements captured in Chapter 4 and adds detail based on the INSPIRE Technical Guidance documents, the international standards underpinning INSPIRE and the existing Abstract Test Suites. The model needs to be detailed enough to serve as the basis for the implementation of the INSPIRE Test Framework. At its core, the model is based on the terminology and concepts from the two main standards in the geo-community related to specifications and conformance:

- ISO [1]
- The OGC Specification Model [2]

It also draws from previous work of the INSPIRE MIG-T sub-group MIWP-5, from analyses of the ISA Global eBusiness Interoperability Test Bed (GITB) [3], as well as from previous experience of the authors.

8.1.1 Common domain model properties and hierarchy

Figure 16: Domain Model - Common properties and highest level hierarchy

The diagram in Figure 16 shows the abstract class ModelItem, from which the abstract ResultModelItem and the abstract RepositoryItem classes are derived. ModelItem specifies a common ID attribute which is used to identify all model items within the test framework and also for referencing items from outside the test framework, for instance as part of a URL. Each ModelItem can have arbitrary properties (key-value pairs), which can be used to provide additional metadata without the need for a revision of the model. These properties are stored in the properties property. The framework will make use of some predefined keys, which are listed in Appendix 1. If a derived object has one superordinate object, the parent association points to that superordinate object. The associations with the role name parent in the following sections are subtypes of this association but are shown here for clarity and readability.

The RepositoryItem class possesses the properties author, creationDate, lastUpdateDate, lastEditor, version and itemHash to provide information about the author, the date the item was created, the date it was last modified, the last editor of the item, a version number and a hash value which is used to check whether the item has changed. It also provides a label, a description and an optional reference property, which is used to reference the repository item outside of the framework as an HTTP URI, e.g. a link to a specification document.

The abstract SpecificationModelItem class is derived from the RepositoryItem class and is itself extended by other model classes related to specifications and their Abstract Test Suites (not shown in this diagram). These classes are used to describe and break down a specification into certain testable parts and to describe the abstract test approach and the test procedure. The inherited label property is used to identify the associated part in the specification documents, for example the number of a Requirement or the section number of an Abstract Test Case.

The TestModelItem class is also derived from the RepositoryItem class and represents all objects that are used to specify and plan a test run, like Test Objects, Executable Test Suites, Test Cases, etc.

The abstract ResultModelItem class is derived from the ModelItem class and represents hierarchical model items that carry test results from a test run. For each derived class there is exactly one corresponding TestModel sibling, which is linked by the association role resultedFrom. Each object derived from ResultModelItem carries a duration and a startTimestamp property indicating the time a test was started and how long that part of the test took (in milliseconds). The derived property resultStatus is an enumeration value which indicates the status of the ResultModel item, taking the results of its child results into account.

8.1.2 Specification model items

Figure 17 shows all classes which are derived from the abstract SpecificationModelItem class.

Figure 17: Domain Model - Specification model items

A Specification represents an INSPIRE Technical Guidance document with implementation requirements or a third-party specification that is normatively referenced, e.g. OGC WFS 2.0. The inherited reference property is an HTTP URL where the specification document is available. If both HTML and PDF representations are available, the HTML representation is preferred; otherwise, the link should point to a PDF document. The inherited version property carries the exact version of the specification. To provide a full label, e.g. OGC Web Feature Service 2.0 Interface Standard, the property fullLabel can be set.

The main normative statements in a Specification are the Requirements (recommendations are not supported in the first release of the INSPIRE Test Framework), where each Requirement is part of exactly one Specification and tested by Abstract Test Cases (see below). As the Specification and Requirement objects are created in the platform before the Abstract Test Suites and the Abstract Test Cases, the association between Specification and Abstract Test Suite objects and the association between Requirement and Abstract Test Case objects are optional and must be established when the information about the Abstract Test Suites is added. For specifications that do not identify requirements (e.g. older OGC standards or some INSPIRE Technical Guidance documents), there should be an entry for each section with requirements, and the identifier (property label) is the section number. The optional property dateOfEntryIntoForce applies only to INSPIRE requirements and will only be provided for legal requirements that are not yet in force.

Each Specification has exactly one Abstract Test Suite that specifies one or more Conformance Classes to which an implementation may conform. Each Conformance Class consists of one or more Abstract Test Cases. The information stored for each case is derived from the text included in the test cases of the existing Abstract Test Suites for INSPIRE. The mark-up, in particular the links, would not be included. The idea is to be able to include the text in the Test Reports in addition to the link to the description in the source repository (currently GitHub). One Conformance Class is used to test exactly one Test Object Type and its subtypes. The class TestObjectType is described in section 8.1.5. The Purpose section will be stored in the property description. The Prerequisites, Test method and Notes sections can be set as properties of the same name in camel-case notation (e.g. Test method can be set with testMethod).

The property references establishes the relationship between the Abstract Test Cases and the Requirements they test. It is derived from the References section and must be set manually by the Test Developer. The ISO standard [1] introduces the concept of Abstract Test Modules (class AbstractTestModule) to group Abstract Test Cases within a Conformance Class. The current INSPIRE Abstract Test Suites do not make use of this capability. It is an option to remove this extra grouping during the development phase.

8.1.3 Test model items

Figure 18 shows all classes which are derived from the abstract TestModelItem class.

Figure 18: Domain Model - Test model items

There are two main classes. A TestObject is an INSPIRE asset that implements one or more Conformance Classes specified in INSPIRE Technical Guidance documents. It adds the property resource, which is a resolvable URI (an http/https URI or a file URI) where the Test Object can be accessed. The types of access depend on the type of the Test Object; this will be discussed below.

The other main class is ExecutableTestSuite, which is the smallest unit of tests that can be executed in a Test Run against a Test Object. An Executable Test Suite will be executed by a Test Engine and can depend on other Executable Test Suites, which will be executed beforehand. The property TestDriverId identifies the Test Driver that is used to communicate with the Test Engine. The inherited reference property is the resolvable URI (an http/https URI or a file URI) where the file that represents the Executable Test Suite, in the format understood by the Test Engine, is located. The inherited properties property is used for specifying optional test parameters and, optionally, their default values for the Executable Test Suite.

To support error messages in multiple languages as well as automated processing of Test Results, MessageTemplate objects are associated with an Executable Test Suite. The MessageTemplate class represents the template for a message. It carries a code property, which is used to identify and look up the object, and a locale property that indicates the language of the message template. The combination of code and locale must be unique for each Executable Test Suite. The template property is a string with optional insertion points. This class is used in conjunction with the Test Assertions, discussed at the end of this section. An example could be the template "Feature {fid} of type {featuretype} has an invalid code list value {value} for attribute status. The value must be one of the values of the code list", where fid, featuretype and value are the insertion points. The Report Transformer will use the appropriate template, in the language of choice, to compile the human-readable message when generating the Test Report.

An Executable Test Suite consists of TestCase objects, which may be grouped by Test Modules (class TestModule). If the AbstractTestModule class is removed (see the discussion above), this class would be removed, too. A Test Case can depend on another Test Case, which will then be executed by the Test Engine first. The ParameterizableTestCase class is derived from the TestCase class and can be called from an instance of TestCase during test execution time with parameters. The accepted parameter keys are carried in the property parameterKeys. To test a set of Feature Types, for example, there would be one TestCase object that drives the test, determines information about the Feature Types and calls one or more ParameterizableTestCase objects with specific parameters (at least one parameter would be the Feature Type name in that case).
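The insertion-point mechanism of the MessageTemplate class described above might look as follows in Python. The class and method shapes here are assumptions for illustration; only the code, locale and template properties and the {name} placeholder convention come from the model description.

```python
import re

class MessageTemplate:
    """code and locale identify the template; {name} marks an insertion point."""
    def __init__(self, code, locale, template):
        self.code = code
        self.locale = locale
        self.template = template

    def render(self, **params):
        # Replace each {token} with the supplied value; unknown tokens are kept as-is.
        return re.sub(r"\{(\w+)\}",
                      lambda m: str(params.get(m.group(1), m.group(0))),
                      self.template)

template = MessageTemplate(
    "TR.invalidCodeListValue", "en",
    "Feature {fid} of type {featuretype} has an invalid code list value "
    "{value} for attribute status.")
message = template.render(fid="f42", featuretype="Address", value="bogus")
# message == "Feature f42 of type Address has an invalid code list value bogus for attribute status."
```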
An example of such a Parameterizable Test Case is a Test Case that is defined for every feature type or layer listed in a WFS / WMS capabilities document.

The execution of a Test Case may require several steps to be executed; for example, to perform a GetCapabilities request first to determine certain information about a service and then to execute additional requests based on that information and test the responses against assertions. Such steps are represented by instances of the TestStep class. The property statementForExecution carries the statement that is executed by the Test Engine within the test step, for example a GetCapabilities request. Each Test Step is of a specific type and textually described by one TestStepType object, which only has a label and a description property. The TestStepType objects are Test Engine specific and, therefore, provided by the Test Engine; however, one TestStepType with the label Manual is available globally in the INSPIRE test framework and indicates a test step that needs human interaction, such as checks that require expert knowledge or are too complex for writing a machine-executable test.

For each step there may be Test Assertions (class TestAssertion). An assertion is an atomic test on the result of a Test Step. The information about the different assertion types is included in a referenced TestAssertionType object, which is also provided by the Test Engine. The property expression includes the expression (in the language that is used by the Test Engine for the assertion type) that is used to analyse the asset. If the result value is the same as the property expectedResult, the assertion passes; otherwise it fails. The association translatedBy is used to translate a message into one or more languages. Multiple Executable Test Suites may be grouped into an ExecutableTestSuiteSet.

8.1.4 Result model items

Figure 19 shows all classes which are derived from the abstract ResultModelItem class.

Figure 19: Domain Model - Result model items

A Test Result (class TestResult) is the information about the result of executing an Executable Test Suite against a Test Object. It contains the results of the different Test Modules, Test Cases (including the generated ones), Test Steps and Test Assertions that are part of the Executable Test Suite. For Test Results whose Test Run is finished, the property completed is set to true. If available, information about the test execution environment, e.g. the machine on which the test is executed, is stored in the property executionEnvironment. Attachments, for example service responses obtained in a Test Step, may be attached to the Test Result and referenced by Test Step Results. At least the log file of the test run is attached to the Test Result. If supported by the Executable Test Suite, statistical reports may be generated, which may provide additional statistical information about the Test Object, e.g. the number of features per feature type in the Test Object, and attached to the test results. The class StatisticalReportTable is derived from Attachment. The styling must be supported by the ResultStylesheet, which can determine the attachment type from the property type.

The TestAssertionResult may generate multiple messages, expressed through the association messages. The associated class Message possesses a code property used to look up the corresponding MessageTranslation objects. A Message object may have multiple MessageTranslationParameters objects, which carry a token and a value property. The properties are inserted at the insertion points of the corresponding MessageTranslation object's template property. This translation mechanism, which can also be used to encode the location where an error occurred, is described with an example. Assume we have an assertion that tests whether an attribute value of a feature is registered in the INSPIRE code list register. For failed tests, an error should be reported that states a message like: "Address with gml:id ab123cd67 : Attribute status has value but the value is not one of the allowed values in the INSPIRE code list register at Only the registered values are allowed." For this, we need a MessageTemplate, where the property template could be:

{ftype} with gml:id {oid} : Attribute {attribute} has value {value}, but the value is not one of the allowed values in the INSPIRE code list register at {codelist}. Only the registered values are allowed.

The corresponding MessageTranslationParameters would be:

ftype=Address
oid=ab123cd67
attribute=status
value=
codelist=

Templates will typically consist of two parts. First, a locator helps the person tasked with correcting the error to find where in the Test Object the issue occurs. The information included in a locator will depend on the Test Object Type and on the Test Engine. Second, the message explains what the issue at the indicated location is and how to correct it. In some cases, we need to detect which part of the message is the locator and which is the error description. The INSPIRE Test Framework, therefore, consistently follows a pattern where the locator comes first, the error description second, and both are separated by a colon. Since a locator string may itself include colons, the locator is wrapped in double square brackets in the template string. For example:

[[{ftype} with gml:id {oid} ]]: Attribute {attribute} has value {value}, but the value is not one of the allowed values in the INSPIRE code list register at {codelist}. Only the registered values are allowed.

To support generating error messages in a test report in different languages, we also need to associate a language (property locale) with a template. In the above case, this would be en. The MessageTemplate with locale de could be:

[[{ftype} mit gml:id {oid} ]]: Attribut {attribute} hat den Wert {value}, der keiner der erlaubten Werte aus dem INSPIRE-Codelist-Register {codelist} ist. Nur die darin registrierten Werte sind zugelassen.

To be able to identify the different translations of the same error, the code property is used. The code is stored in the Test Result along with the parameter values, and the appropriate MessageTemplate for the code in the desired language is selected when styling the Test Report for the Test Result.

The derived resultStatus of a ResultModelItem object is an aggregated result from all contained TestAssertionResult objects. It is determined by the following rules, where the first rule that is met applies:

- FAILED, if at least one status value is FAILED,
- WARNING, if at least one status value is WARNING,
- INFO, if at least one status value is INFO,
- PASSED, if all status values are PASSED,
- PASSED_MANUAL, if at least one status value is PASSED_MANUAL and all other values are PASSED. A test case possesses the status PASSED_MANUAL if the test is not automated (only manual test steps) or not fully automated (at least one manual test step) and the user has to validate results manually based on instructions in the report,
- SKIPPED, if at least one status value is SKIPPED because a test case depends on another test case which has the status FAILED,
- NOT_APPLICABLE, if at least one status value is NOT_APPLICABLE, in case the test object does not provide the capabilities for executing the test,
- UNDEFINED, in all other cases.

The list of Test Result status codes needs to be a fixed enumeration, as the INSPIRE Test Framework code needs to implement behaviour for each value, e.g. when styling test reports. An instance of the class TestResultCollection represents the result of a Test Run, which is the execution of one or more Executable Test Suites on the same Test Object.

8.1.5 Overview of the model with all classes

Figure 20 shows the composed Domain Model and the associations of the most important classes from each model branch. All properties and some associations are omitted to make the model more legible.
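The status aggregation rules from section 8.1.4 translate almost directly into code; a Python sketch with an illustrative function name (the status values and their precedence come from the rules above):

```python
# Status precedence from section 8.1.4: the first rule that is met applies.
def aggregate_status(child_statuses):
    s = set(child_statuses)
    if "FAILED" in s:
        return "FAILED"
    if "WARNING" in s:
        return "WARNING"
    if "INFO" in s:
        return "INFO"
    if s == {"PASSED"}:
        return "PASSED"
    if "PASSED_MANUAL" in s and s <= {"PASSED", "PASSED_MANUAL"}:
        return "PASSED_MANUAL"
    if "SKIPPED" in s:
        return "SKIPPED"
    if "NOT_APPLICABLE" in s:
        return "NOT_APPLICABLE"
    return "UNDEFINED"
```

For example, a suite whose cases report PASSED and WARNING aggregates to WARNING, while PASSED plus PASSED_MANUAL aggregates to PASSED_MANUAL.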

Figure 20: Domain Model - Composed model branches

The classes TestRun, TestRunParameters and Attachment are not derived from the higher-level classes. Figure 21 shows these classes and the TestObjectType class, including their associations and all properties that are omitted in Figure 20.

Figure 21: Domain Model - TestRun and TestObjectType classes

A Test Run (class TestRun) is a collection of one or more TestTasks, which bundle the execution of one Executable Test Suite on one Test Object. It is associated with a TestResultCollection which holds the results of each TestTask. Depending on the Executable Test Suites, the initiator of the test run (linked by the association startedBy) may be asked to provide parameter values for each Test Task. These are recorded as TestTaskParameters (an association class) which represent name-value pairs. The class User does not represent a managed, persistent user account; the user may only enter a name and a locale when starting a test run. Each TestResult can be transformed by one ResultStylesheet into a specific output format.

As discussed in 8.1.3, each TestObject is of certain types (class TestObjectType), where the property type is a unique name for the Test Object Type. The hierarchy of types is shown in Figure 22.

Figure 22: Test Object type examples

For each Executable Test Suite the applicable Test Object Types are available via the property supportsType. This ensures that only those Executable Test Suites are executed in a Test Run on a Test Object that are capable of handling the Test Object.

8.2 Web Interface

This section shows some design concepts of the web interface, which assists the user in fulfilling their tasks. The acceptance of the software depends essentially on the understandability and operability of the system. Therefore, users with different levels of prior knowledge and different use cases are identified. From the scenarios, basic activities are derived, harmonized and connected with each other. On the basis of the scenarios and activities, wireframes can be derived.

8.2.1 User Scenarios

The interface design is explained on the basis of five personas and scenarios:

- User 1 is not familiar with the system. He simply wants to enter the URL of his WFS and test it for conformance to the conformance classes Pre-defined WFS and Direct WFS.
- User 2 is not familiar with the system. Her organization deployed an Executable Test Suite on a local INSPIRE Test Framework installation. The Executable Test Suite is named Validate Metadata (additional organization-internal tests) and, as it is developed by her organization, not associated with an official conformance class. She knows the name of the Executable Test Suite and simply wants to upload a ZIP file, which contains test data in XML files, for testing.
- User 3 is familiar with the concepts of the system and wants to create two Test Objects, one WFS and one WMS service, and reuse them in test runs later. The WFS is secured with HTTP basic authentication.

- User 4 is familiar with the concepts of the system and wants to test whether his WFS and WMS services are still conformant to the Pre-defined WFS and View WMS classes. The Test Object for one of the services already exists in the system.
- User 5 is not familiar with the system and wants to open the Test Report of a test run that has not yet finished.

8.2.2 Activities

For these scenarios, basic activities can be identified: the selection of a Test Object, the creation of one, the selection of an Executable Test Suite (selected implicitly by selecting a conformance class) and the opening of a Test Report. The activities are shown as activity diagrams in Figure 23, Figure 24 and Figure 25.

Figure 23: Select Test Object activity

The user may select an existing Test Object or create a new Test Object. If the Test Object is not to be used only once, she/he can mark the Test Object so that it is not deleted after the Test Run. At the end of the process, the user confirms the selected/created Test Object.

Figure 24: Create new Test Object activity

The user selects whether a service-based or a file-based Test Object should be created. For a service-based Test Object, the URL and, if applicable, the credentials are required. If a file-based Test Object should be tested, the user can choose between uploading one or multiple files (which only makes sense for small datasets, particularly in a public deployment, and will therefore be restricted to approx. 500 MB) or using files that the INSPIRE test framework already has access to. The latter could be an option in a local area network and would apply to a scenario where the INSPIRE test framework is deployed locally (User 2).

Figure 25: Select Executable Test Suite activity

The user can select one or multiple Executable Test Suites or Conformance Classes. By selecting a conformance class, all associated Executable Test Suites will be selected and used successively to validate the Test Object.

Figure 26: Configure Test Run activity

All activities are manifested as views in a main view which presents a flow through the views. At first the user selects the Test Object or creates a new one, then selects one or multiple Executable Test Suites, for instance through a conformance class. Afterwards she/he enters parameters for the Test Run. If other Test Objects should also be tested, the process is repeated until finally the Test Run is started. During the Test Run, a monitor view is presented which shows the progress of the whole Test Run. When the Test Run has finished, the Test Report will automatically be presented as an HTML file. If the user wants to find a report again, the Test Reports Overview can be opened and browsed for the report. This is shown in Figure 27.

Figure 27: Browse Test Reports activity

If the test run did not finish, the View Test Run Monitor button is shown (but no View Test Report button), which can be used to reopen the monitor view and wait for the test run to finish. Other entries in the menu are shown in Figure 28.

Figure 28: Select other task from menu activity

Some of the views shown before are reused in the menu views. For example, the View Test Objects entry is approximately equal to one of the views shown before.

8.3 REST API

The REST API enables a client component to start and control test runs and to retrieve test results. The REST API is defined in the OpenAPI JSON format and consists of an operation definition part and a domain model part, which basically reflects the domain model (some simplifications are necessary where the standard does not support certain constructs or to reduce the response size). The draft API is available on Swagger hub (for now marked as unpublished); it will be published on the source code repository of the INSPIRE test framework and is also included as part of the application, so that a client can request the definition directly. The API base path includes the actual version number, which will be set to v1 on the first release. If breaking API changes are necessary in future releases, clients should be able to use the previous version. The complete API definition is available on Swagger hub.

Operations:

Definitions:

A brief description of the operations is given in Table 9.

Table 9: API operation description

GET /capabilities/*
    Returns the Capabilities of the service, including information about the available Executable Test Suites, Abstract Test Suites, Conformance Classes, Test Object Types, Result Styles and Test Drivers. This information can also be requested separately, e.g. by calling /capabilities/ets.

GET /status
    Returns the status of the service, which can be used to check for problems. If a problem occurs, a message is returned, for example information about low disk space.

GET /testobjects
    Returns a collection of all created Test Objects. Supports optional parameters for pagination (limit, offset), sorting (sortby=<property name>) and ordering (sortorder: ASCENDING/DESCENDING).

POST /testobjects
    Creates a new Test Object.

GET /testobjects/{id}
    Returns a Test Object by its ID.

POST /testobjects/{id}/uploaddata
    Uploads data for the Test Object, e.g. GML files.

DELETE /testobjects/{id}
    Deletes the Test Object and the uploaded test data (if applicable).

GET /testruns
    Returns only running Test Runs, unless the parameter finished is set to true. Supports optional parameters for pagination (limit, offset), sorting (sortby=<property name>) and ordering (sortorder: ASCENDING/DESCENDING).

POST /testruns
    Creates a Test Run for one or more Executable Test Suites and returns a TestRun object with the Test Run ID. The available Executable Test Suites or Conformance Classes can be queried by calling /capabilities/ets and /capabilities/ccs.

GET /testruns/{id}
    Returns a TestRun object with simplified information about the validation progress of each Test Run Task, i.e. the percentage of completed test steps.

DELETE /testruns/{id}
    Deletes a Test Run by its ID.

POST /testruns/{id}/cancel
    Cancels a Test Run by its ID.

GET /testruns/{id}/progress
    Returns the progress of the Test Tasks together with additional log messages. The operation is only available for clients that implement the WebSocket protocol, which is supported by all modern web browsers and can be used in HTML web interfaces. To ensure backwards compatibility, the API also offers a progress interface which uses a logsequencenumber (see the next entry).

GET /testruns/{id}/progress/{testtaskindex}/{logsequencenumber}
    Returns the progress of the Test Tasks together with additional log messages. The client may poll continuously for new progress and test task log entry updates. The client starts the polling by sending the logsequencenumber 0. As long as no new log message exists, the server responds with a 304 HTTP message (Not Modified); otherwise the server responds with the delta log entries and the new sequence number. For example, the client starts with logsequencenumber 0; after some polls, the server responds with three log entries and sends the new logsequencenumber 2. The client uses the new logsequencenumber for the next polling request, until a new logsequencenumber is sent. This method reduces the overhead between client and server. To ensure that the client does not poll too often, the server informs the client about the polling rate limit with an X-Poll-Interval header. If the client ignores it and exceeds the limit, a 403 HTTP message (Forbidden) is sent.

GET /testobjecttypes
    Returns all Test Object Types which are supported by the registered Executable Test Suites.

GET /testresultcollection/
    Returns all Test Results as a collection of Test Results in a simplified view (only the Test Result IDs and the status of each Test Result are included). The results can be filtered with the testresultsstatus parameter, e.g. to show only failed Test Results. Supports optional parameters for pagination (limit, offset), sorting (sortby=<property name>) and ordering (sortorder: ASCENDING/DESCENDING).

GET /testresultcollection/{id}
    Gets a collection of Test Results by its ID with all details as JSON.

GET /testresultcollection/{id}.{outputFormat}
    Gets a collection of Test Results by its ID in a specific output format, e.g. HTML. The report style with the highest priority is used, unless a style is specified with the style parameter. The available styles can be queried by calling /capabilities/resultstyles.

DELETE /testresultcollection/{id}
    Deletes a collection of Test Results by its ID.

GET /testresults/{id}
    Gets a Test Result as a JSON object by its ID.

GET /testresults/{id}.{outputFormat}
    Gets a Test Result by its ID in a specific output format, e.g. HTML. The report style with the highest priority is used, unless a style is specified with the style parameter.

GET /testresults/{id}/attachments/{attachmentId}/
    Gets an attachment as a file for a specific Test Result.

The sequence of API calls for a client that wants to create and monitor a new Test Run is shown in Figure 29. The exchange of JSON messages is simplified (and not valid JSON) but reflects the most important parts of the interaction.

Figure 29: API - create/monitor Test Run example sequence
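The client side of the progress-polling protocol described for GET /testruns/{id}/progress/{testtaskindex}/{logsequencenumber} can be sketched as follows. This is a minimal, hypothetical illustration: the ProgressMonitor class and its transport callback are not part of the API definition; only the sequence-number handling follows the protocol described above (a 304 response is modelled as None, a successful response as the delta log entries plus the new logsequencenumber).

```python
# Hypothetical sketch of the client-side polling logic. The transport is
# abstracted as a callback so that the sequence-number handling can be
# shown without a live server; a real client would also honour the
# X-Poll-Interval header between calls.

from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

# A poll returns None (HTTP 304, nothing modified) or the delta log
# entries together with the new logsequencenumber.
PollResponse = Optional[Tuple[List[str], int]]

@dataclass
class ProgressMonitor:
    poll: Callable[[int], PollResponse]  # e.g. GET .../progress/{task}/{seq}
    sequence_number: int = 0             # the client starts with 0
    log: List[str] = field(default_factory=list)

    def poll_once(self) -> bool:
        """Poll once; return True if new log entries arrived."""
        response = self.poll(self.sequence_number)
        if response is None:             # server answered 304: no new entries
            return False
        entries, new_seq = response      # delta entries + new sequence number
        self.log.extend(entries)
        self.sequence_number = new_seq   # reused for the next polling request
        return True
```

Because the server only ever sends the delta since the last acknowledged logsequencenumber, the client never re-downloads log entries it has already seen, which is the overhead reduction mentioned above.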

First, the client requests a list of all supported Test Object Types, to verify that the required Test Object Types are supported. Then the client creates two Test Objects: the Test Object with ID 2121 is of type Web Feature Service and the Test Object with ID 4145 is of type INSPIRE GML. As Test Object 4145 represents a spatial data set, a ZIP archive of the GML documents is uploaded to the INSPIRE Test Framework. Afterwards, the client requests all available Executable Test Suites. Executable Test Suite 12 supports testing Web Feature Service based Test Objects and Executable Test Suite 34 supports the required INSPIRE GML Test Object Type. The client also learns that Executable Test Suite 12 can be parameterized with a maxfeatures parameter. The client starts a Test Run by creating a Test Run JSON object that bundles each TestObject, the Executable Test Suite that is used for validation and specific test parameters, if applicable. A TestRun object with an ID is returned. The ID 4444 is used to monitor the progress of the Test Run, and the ID 888 is finally used to query the test results as an HTML file. When the Test Run is completed, the client can retrieve the results of the Test Run by using the TestResultCollection ID.

8.4 GITB API

In this version of the INSPIRE Test Framework, the GITB API will not be implemented, but it is recognised as a potential future capability. The API requires communication based on the SOAP protocol. Although the implemented API is REST-based, the messages can easily be transformed to XML or into the required SOAP protocol. In addition, the implementation would require a more detailed analysis of the GITB interfaces and data structures that are part of the communication process.

8.5 Specifications for Abstract Test Suites

Abstract Test Suites are currently managed using GitHub with the following approach:

- For each INSPIRE specification document with conformance classes (and abstract test suites), a repository is created.
- For each version of the INSPIRE specification, a branch with the version identifier as the name is maintained. This branch will always contain the latest approved version of the Abstract Test Suite for this document version. The README.md document includes links to the conformance classes. Additional branches will be used during development and review phases.
- Each conformance class in an Abstract Test Suite is represented by a directory in the repository. The README.md file specifies:
  - the standardisation target of the conformance class;
  - dependencies on other conformance classes;
  - the list of documents referenced from test cases;
  - the list of test cases and their cross reference to the requirements identified in the INSPIRE specification;
  - for XML-based tests, the XML namespace abbreviations used.
- Each test case in a conformance class is a Markdown document in the conformance class directory.
- All directory and file names use only lower case letters, digits and hyphens.

In order to support parsing the test cases with software, all Markdown documents of test cases follow a standard template (text in curly brackets are placeholders)11:

11 It is planned to add version identifiers and error messages for failed assertions in future updates of the Abstract Test Suites.

# {Human readable name for this test}

**Purpose**: {Why is this test necessary?}

**Prerequisites**

{References to other test cases that must be passed before evaluating this one, if applicable. Encoded as Markdown links using the persistent URI (see below), not the GitHub URI.}

**Test method**

{Complete description of the test steps and assertions to test. Use bullets or any Markdown formatting as necessary, including links to the contextual XPath references.}

**Reference(s)**

{References to the requirements tested by the test case.}

**Test type**: {Either Automated, Automated/Manual or Manual.}

**Notes**

{Any additional notes (optional).}

## Contextual XPath references

The namespace prefixes are used as described in [README.md](README.md#namespaces).

Abbreviation | XPath expression
{XPath expressions as needed.}

An example:

# Metadata date

**Purpose**: As the Metadata Date is not supported by ISO WMS 1.3.0, an extension shall be used to map this to an element within an element. The date shall be expressed in conformity with the INS MD.

**Prerequisites**

* [Schema validation]( /schema-validation)

**Test method**

This test only applies to [scenario 2](#scenario-2). Otherwise the test case is skipped.

Check if there is a MetadataDate node in the ExtendedCapabilities section.

**Reference(s)**:

* [TG VS]( /README#ref_TG_VS), Chapter

**Test type**: Automated

**Notes**

## Contextual XPath references

The namespace prefixes are used as described in [README.md]( /README#namespaces).

Abbreviation | XPath expression

MetadataDate <a name="metadatadate"></a> | /wms:WMS_Capabilities/wms:Capability/inspire_vs:ExtendedCapabilities/inspire_common:MetadataDate

metadataurl <a name="metadataurl"></a> | /wms:WMS_Capabilities/wms:Capability/inspire_vs:ExtendedCapabilities/inspire_common:MetadataUrl

ExtendedCapabilities <a name="extendedcapabilities"></a> | /wms:WMS_Capabilities/wms:Capability/inspire_vs:ExtendedCapabilities

scenario 2 <a name="scenario-2"/> | /wms:WMS_Capabilities/wms:Capability/inspire_vs:ExtendedCapabilities[inspire_common:ResourceLocator or inspire_common:ResourceType or inspire_common:TemporalReference or inspire_common:Conformity or inspire_common:MetadataPointOfContact or inspire_common:MetadataDate or inspire_common:SpatialDataServiceType or inspire_common:MandatoryKeyword or inspire_common:Keyword]

The template can be parsed using the following regular expression (which supports some variations, too):

(?s)#\s?(.*)\s+\*\*Purpose\*\*[:]?\s*(.*)\s+\*\*Prerequisites\*\*\s*(\*\s\[(.+)\]\((.+)\)\s+)*\*\*Test method\*\*\s+(.*)\s+\*\*Reference\s?\(s\)\*\*[:]?\s*(.*)\s+\*\*Test type\*\*[:]?\s+(Automated|Manual|Automated/Manual)\s+\*\*Notes\*\*[:]?\s+(.*)\s+##\s*Contextual XPath references\s+(.*)

To reference these resources, persistent URIs are used. This supports moving the Abstract Test Suites to other locations than GitHub in the future. The URI scheme is as follows (text in curly brackets are placeholders):

INSPIRE specification document (Technical Guidance), latest version, latest Abstract Test Suite version
URI:
Example:
Media types: application/json returns JSON with information about the files and directories in the root of the repository. Everything else returns the HTML of the root of the repository.
For HTML, this redirects to the master branch in the GitHub repository for the Technical Guidance.

INSPIRE specification document (Technical Guidance), specific version, latest Abstract Test Suite version
URI:
Example:
Media types: application/json returns JSON with information about the files and directories in the root of the repository. Everything else returns the HTML of the root of the repository. For HTML, this redirects to the branch with the version id in the GitHub repository for the Technical Guidance.

Conformance class
URI:
Example:
Media types: application/json returns JSON with information about the files and directories in the root of the conformance class. Everything else returns the HTML of the conformance class.

For HTML, this redirects to the directory of the conformance class.

Abstract test case
URI:
Example:
Media types: application/json returns JSON with information about the file of the test case. text/plain returns the raw Markdown file of the test case. Everything else returns the HTML of the test case. For HTML, this redirects to the Markdown document for the test case.

To identify different versions of an Abstract Test Suite, additional branches may be created to identify Abstract Test Suite releases. For example, while the version branch will always link to the latest version of the Abstract Test Suite for this version of the technical guidance on view services, additional branches could be created to identify previous releases of the Abstract Test Suite. If, how and when such versioning will be applied is a decision of the governing structure of the Abstract Test Suites, i.e. the MIG or the MIWP-5 sub-group.

8.6 Specifications for Executable Test Suite Languages

8.6.1 Requirements by Test Object Type

This section specifies known requirements for writing Executable Test Suites for the different types of Test Objects that occur in INSPIRE. These requirements make it possible to identify suitable Test Engines for the different Test Object Types. Executable Test Suites will be developed in the native languages of the Test Engines, i.e. the requirements are requirements on the Test Engines used for the identified types of Test Objects.

8.6.2 General requirements

General requirements for all Test Engines are:

- Multi-lingual support for messages; note that a Test Engine may not be able to support this directly, in which case it has to be determined if and how this can be addressed
- Well-defined mapping to the domain model, in particular to the Test Model Item classes (8.1.3) and the Test Result Item classes (8.1.4)

8.6.3 Web services

For INSPIRE network services, the following additional requirements exist besides the ones listed in 8.6.2:

- Support for XML assertions based on XPath and/or XQuery
- Multiple test steps per test case
- Processing of service responses / scripting
- Generating dynamic test cases based on a parameterizable test case
- Invocation of Executable Test Suites for GML data, GMD metadata, GeoTIFF data

8.6.4 OGC web services

For INSPIRE network services based on OGC web services, the following additional requirements exist besides the ones listed in 8.6.3:

- XML Schema validation
- Support for assertions against images (for WMS and WMTS)
- Invocation of Executable Test Suites for the OGC standard (OGC CITE)

8.6.5 ATOM feeds and OpenSearch endpoints

For ATOM-based INSPIRE download services, the following additional requirements exist besides the ones listed in 8.6.3:

- RelaxNG validation

8.6.6 XML documents

For XML documents representing spatial data sets or metadata records, the following additional requirements exist besides the ones listed in 8.6.2:

- Support for XML assertions based on XQuery
- XML Schema validation
- Validation against code lists in the INSPIRE registry
- Access to referenced XML documents and using those in test steps and test assertions

8.6.7 GML data

For interoperable spatial data sets using the default encoding GML, the following additional requirements exist besides the ones listed in 8.6.6:

- Support for large XML document sets (100s of GBs)
- Support for geometry validation and geometry predicates, including transformation between selected coordinate reference systems
- Invocation of Executable Test Suites for GML data and GMD metadata
- Invocation of Executable Test Suites for the OGC standard (OGC CITE; note that this may not be useful for very large XML document sets for practical reasons)

8.6.8 Other data (e.g. binary data)

To be identified in the future, when the Abstract Test Suites for the relevant data themes are completed. The Annex I data themes currently only use GML data.

8.6.9 GMD metadata

For metadata records using the GMD encoding, currently no additional requirements are known besides the ones listed in the sections above.

8.7 Specifications by Test Engine

8.7.1 Overview

This section identifies the Test Engines that will be used in the first release of the INSPIRE Test Framework and provides an overview of how Executable Test Suites will be developed for each Test Engine. Each Test Engine uses its own language to specify how tests will be executed; i.e., each Executable Test Suite will be written in the language that the associated Test Engine supports.
The documentation for developing Executable Test Suites will include detailed guidance on how to represent the Executable Test Suite information in the domain model in the language of the Test Engine. This may require additional conventions, if the language is not able to natively represent all metadata about Executable Test Suites specified in the domain model in 8.1.

It is important to understand that, in general, the information in the domain model may not be sufficient to execute the tests. That information needs to be available within the INSPIRE Test Framework in order to provide useful Test Results and Test Reports, not as a complete specification of an Executable Test Suite. The ISA GITB specifies the generic Test Description Language (TDL) for this purpose, independent of the Test Engine used. The GITB INSPIRE Report [3] contains an analysis of the mapping between the domain model and TDL. If TDL covered the test-related information in the domain model, it could be a candidate for representing this information in XML within the INSPIRE Test Framework. However, it currently seems unlikely that the information can be sufficient to automatically derive Executable Test Suites as it seems to be done in the GITB.

8.7.2 SoapUI

The Open Source version of SoapUI12 is used for testing web services. The SoapUI GUI is typically used for developing tests. The following table shows how SoapUI model items are mapped to INSPIRE Test Model Items, and how Test Result Model Items are generated from SoapUI results.

Table 10: Mapping between SoapUI and Test (Result) Model Items

Test project -> Executable Test Suite
    Executable Test Suite items may possess additional metadata (see the Repository Item class). Additional information may be stored in property files that are persisted with the test project, or the information is managed and obtained from the underlying repository connector (e.g. a Git repository will provide information about the last editor, etc.). Dependencies on other Executable Test Suites are expressed through the property etf.dependson.

Test suite -> Test Module
    All properties can be mapped.

Test case -> Test Case
    Dependencies on other Test Cases are set through the property etf.dependson.

Test step -> Test Step
    A manual Test Step is expressed through a Properties test step which only possesses the three properties etf.teststep.manual.instructions (instructions for a human user), etf.teststep.expectedresult and etf.teststep.expression (see the Test Step model item class).

Test assertion -> Test Assertion
    As the SoapUI model does not provide internal IDs for assertions, the INSPIRE Test Framework ID is generated once from the SoapUI test assertion label, including the labels of all parent items (to get a unique identifier). As properties cannot be set in SoapUI assertions, the properties need to be set in the parent test step. An assertion property key is set with the following syntax: etf.testassertion.<label of the assertion>.<property>. At least one property etf.testassertion.<label>.id will exist.

    If the assertion returns an error message as a String (e.g. Groovy script assertions), the assertion must return the message in the following form:

    { <Translation Template Name>; <Message Token Key 1>; <Message Token Value 1>; ...; <Message Token Key n>; <Message Token Value n> }

    For example:

    { FEATURE_HAS_INVALID_PROPERTY_VALUE; fid; ID123456; featuretype; property; LandCoverValue LandCoverUnit; invalidvalue; }

    XQuery assertions can return the message inside an XML node in the form:

    <messageargs>FEATURE_HAS_INVALID_PROPERTY_VALUE; fid; ID123456; featuretype; property; LandCoverValue LandCoverUnit; invalidvalue;</messageargs>

For the use in ETF, several Groovy extensions have been developed that simplify the development of Executable Test Suites in SoapUI. These include:

- Project-wide authentication handling
- Additional assertions like schema validation or exception report parsing
- Analysis of capabilities and schema to generate GetFeature requests including property usage
- Functionality to improve reporting of failed assertions
- Other helper functions to simplify development

8.7.3 BaseX

BaseX is an XML database13. The BaseX GUI is typically used for developing tests. Each Executable Test Suite is an XQuery that generates an XML document representing a TestResult. In the existing Executable Test Suites using BaseX, an XML document representing an ExecutableTestSuite (see 8.1.3) is specified that includes the XQuery expression for each Test Assertion.
This document is processed by a generic XQuery script that executes all assertions in the test suite and generates the TestResult.
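The control flow of such a generic driver can be illustrated with a small sketch. The snippet below is not the actual BaseX driver (which is an XQuery script running inside BaseX); it is a hypothetical Python illustration of the flow just described: walk the Assertion elements of an ExecutableTestSuite document, hand each expression to a pluggable evaluator, and collect a status per assertion ID. Element names follow the preliminary XML structure used in the example that follows.

```python
# Hypothetical illustration of the generic driver: iterate the Assertion
# elements, evaluate each <expression> and record PASSED/FAILED/SKIPPED.
# In the real framework the evaluation step executes the XQuery expression
# against the features of the Test Object.

import xml.etree.ElementTree as ET
from typing import Callable, Dict

def run_assertions(ets_xml: str,
                   evaluate: Callable[[str], bool]) -> Dict[str, str]:
    """Return a mapping of assertion ID to PASSED/FAILED/SKIPPED."""
    root = ET.fromstring(ets_xml)
    results: Dict[str, str] = {}
    for assertion in root.iter("Assertion"):
        if assertion.get("enabled", "true") != "true":
            results[assertion.get("id")] = "SKIPPED"
            continue
        expression = assertion.findtext("expression", default="")
        ok = evaluate(expression)  # in BaseX: execute the XQuery expression
        results[assertion.get("id")] = "PASSED" if ok else "FAILED"
    return results
```

The status names and the SKIPPED handling for disabled assertions are assumptions for illustration; the actual result statuses are defined by the Result model items of the domain model.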

This approach is also foreseen for the INSPIRE Executable Test Suites for XML documents. For example, an abstract test case to verify that valid code list values for hydrography network features are used could be implemented in the Executable Test Suite through the XML snippets shown below. This XML is processed by an XQuery that executes the XQuery expressions in the <expression> element. $features is pre-defined as the sequence of all feature nodes for the conformance class. To convert the query results to the ETF result format, predefined functions are used. Note that the example uses a preliminary XML structure, which will change once the domain model has been implemented.

<TestCase id="hy-n-as.values">
  <label>Code list values</label>
  <description>Verify whether all attributes whose value type is a code list take the values set out therein: When an attribute has a code list as its type, compare the values of each instance with those provided in the INSPIRE code list register. To pass this test, any instance of an attribute
    <li>shall take only values explicitly specified in the code list when the code list's extensibility is 'none'.</li>
    <li>shall take only a value explicitly specified in the code list or shall take a value that is narrower (i.e. more specific) than those explicitly specified in the application schema when the code list's extensibility is 'narrower'.</li>
  </description>
  <requirement ref="hy_ir_a4_3"/>
  <requirement ref="hy_ir_a6_1"/>
  <version>0.1</version>
  <author>interactive instruments GmbH</author>
  <creationDate> </creationDate>
  <lastEditor>interactive instruments GmbH</lastEditor>
  <lastUpdateDate> </lastUpdateDate>
</TestCase>

<Assertion enabled="true" id="hy-n-as.values.flowdirection" mode="global" severity="error">
  <label>Code list values for WatercourseLink property flowDirection are from the INSPIRE code list</label>
  <description>Verify that only values from the INSPIRE registry are used, as no extensions are allowed for this property: Inspect the code list valued property elements hy-n:flowDirection of WatercourseLink features. If a reference (@xlink:href) has a value that is not one of the allowed values, report disallowedCodeListValue.</description>
  <requirement ref="hy_ir_a4_3"/>
  <requirement ref="hy_ir_a6_1"/>
  <expression>let $featuresToInspect := $features[self::hy-n:WatercourseLink]
let $values := ( ... )
let $featuresWithErrors := $featuresToInspect[hy-n:flowDirection/@xlink:href and not(hy-n:flowDirection/@xlink:href = $values)][position() le $limitErrors]
return (local:statistics-info($featuresWithErrors),
  for $feature in $featuresWithErrors
  order by $feature/@gml:id
  return local:object-message($feature, concat('The property ''hy-n:flowDirection'' has a value ''', $feature/hy-n:flowDirection/@xlink:href, ''' that is not one of the allowed values listed at ... ')))</expression>
</Assertion>

<Assertion enabled="true" id="hy-n-as.values.hydronodecategory" mode="global" severity="error">
  <label>Code list values for HydroNode property hydroNodeCategory are from the INSPIRE code list</label>
  <shortdescription>Verify that only values from the INSPIRE registry are used, as no extensions are allowed for this property.</shortdescription>
  <description>Inspect the code list valued property elements hy-n:hydroNodeCategory of HydroNode features. If a reference (@xlink:href) has a value that is not one of the allowed values, report disallowedCodeListValue.</description>
  <requirement ref="hy_ir_a4_3"/>
  <requirement ref="hy_ir_a6_1"/>
  <expression>let $featuresToInspect := $features[self::hy-n:HydroNode]
let $values := ( ... )
let $featuresWithErrors := $featuresToInspect[hy-n:hydroNodeCategory/@xlink:href and not(hy-n:hydroNodeCategory/@xlink:href = $values)][position() le $limitErrors]
return (local:statistics-info($featuresWithErrors),
  for $feature in $featuresWithErrors
  order by $feature/@gml:id
  return local:object-message($feature, concat('The property ''hy-n:hydroNodeCategory'' has a value ''', $feature/hy-n:hydroNodeCategory/@xlink:href, ''' that is not one of the allowed values listed at ... ')))</expression>
</Assertion>

In this initial version, the report structure is mapped and written to disk from an executed BaseX XQuery. As this approach is low level (e.g. every change in the model has to be reflected in the XQuery), in future versions of the INSPIRE Test Framework an adapter module is planned to be loaded by BaseX, which provides a TestResultListener interface. This interface will be used for reporting the progress of the test run, reporting test results and logging, as well as for storing attachments. To make the migration from the initial low-level approach to the TestResultListener straightforward, it is planned to use the same pre-defined functions and parameters in the initial XQuerys as in the TestResultListener interface. Migration to the TestResultListener approach then mainly requires changing the namespace of the functions in the Executable Test Suites.

Figure 30: TestResultListener interface

The start of a test is reported by calling the start method, which expects the ID of the Test Model Item that is executed next. The end method is called just after a test has been run and expects the ID of the Test Model Item and the status of the test result. Messages are reported with the addMessage methods. At least the name of the message that should be translated is expected. The addAttachment method can be used to persist a labelled attachment. The debug, error and log methods are used to write more technical information to the test run log.


Intelligence Community and Department of Defense Content Discovery & Retrieval Integrated Project Team (CDR IPT) Intelligence Community and Department of Defense Content Discovery & Retrieval Integrated Project Team (CDR IPT) IC/DoD REST Interface Encoding Specification for CDR Search, v1.1 12 May 2011 REVISION/HISTORY

More information

Extension of INSPIRE Download Services TG for Observation Data

Extension of INSPIRE Download Services TG for Observation Data Extension of INSPIRE Download Services TG for Observation Data Simon Jirka (52 North) 14 th June 2014, MIG Workshop on WCS-based INSPIRE Download Services Agenda Motivation Sensor Web Proposed Update for

More information

Automatic Test Markup Language <ATML/> Sept 28, 2004

Automatic Test Markup Language <ATML/> Sept 28, 2004 Automatic Test Markup Language Sept 28, 2004 ATML Document Page 1 of 16 Contents Automatic Test Markup Language...1 ...1 1 Introduction...3 1.1 Mission Statement...3 1.2...3 1.3...3 1.4

More information

GeoDCAT-AP Representing geographic metadata by using the "DCAT application profile for data portals in Europe"

GeoDCAT-AP Representing geographic metadata by using the DCAT application profile for data portals in Europe GeoDCAT-AP Representing geographic metadata by using the "DCAT application profile for data portals in Europe" Andrea Perego, Vlado Cetl, Anders Friis-Christensen, Michael Lutz, Lorena Hernandez Joint

More information

This document is a preview generated by EVS

This document is a preview generated by EVS TECHNICAL REPORT RAPPORT TECHNIQUE TECHNISCHER BERICHT CEN/TR 15449-5 April 2015 ICS 07.040; 35.240.70 English Version Geographic information - Spatial data infrastructures - Part 5: Validation and testing

More information

INSPIRE status report

INSPIRE status report INSPIRE Team INSPIRE Status report 29/10/2010 Page 1 of 7 INSPIRE status report Table of contents 1 INTRODUCTION... 1 2 INSPIRE STATUS... 2 2.1 BACKGROUND AND RATIONAL... 2 2.2 STAKEHOLDER PARTICIPATION...

More information

Proposed update of Technical Guidance for INSPIRE Download services based on SOS

Proposed update of Technical Guidance for INSPIRE Download services based on SOS Proposed update of Technical Guidance for INSPIRE Download services based on SOS Organised by: Simon Jirka, Alexander Kotsev, Michael Lutz Dr. Simon Jirka (jirka@52north.org) 52 North GmbH Workshop - The

More information

INSPIRE Data Specifications What s new? What s next?

INSPIRE Data Specifications What s new? What s next? INSPIRE Data Specifications What s new? What s next? Michael Lutz INSPIRE Conference 25 th June 2013, Firenze www.jrc.ec.europa.eu Serving society Stimulating innovation Supporting legislation What s new?

More information

IR on metadata Change proposal(s) on the Resource Locator element

IR on metadata Change proposal(s) on the Resource Locator element INSPIRE Infrastructure for Spatial Information in Europe IR on metadata Change proposal(s) on the Resource Locator element Type Creator Document for information and discussion CZ, DE, DK, FR, NL, ENV Date/status/version

More information

Web Coverage Services (WCS)

Web Coverage Services (WCS) Web Coverage Services (WCS) www.jrc.ec.europa.eu Thematic Cluster #3 Jordi Escriu Facilitator Thematic Cluster #3 Serving society Stimulating innovation Supporting legislation Coverages in INSPIRE Coverage:

More information

ELF Data Specifications

ELF Data Specifications ELF Data Specifications Presentation to: Author: Date: INSPIRE conference Anja Hopfstock (WP2), Antti Jakobsson (ELF project director) 16 th June 2014 Why extending INSPIRE? INSPIRE too much too little

More information

Vocabulary-Driven Enterprise Architecture Development Guidelines for DoDAF AV-2: Design and Development of the Integrated Dictionary

Vocabulary-Driven Enterprise Architecture Development Guidelines for DoDAF AV-2: Design and Development of the Integrated Dictionary Vocabulary-Driven Enterprise Architecture Development Guidelines for DoDAF AV-2: Design and Development of the Integrated Dictionary December 17, 2009 Version History Version Publication Date Author Description

More information

Name type specification definitions part 1 basic name

Name type specification definitions part 1 basic name Open Geospatial Consortium Inc. Date: 2010-03-31 Reference number of this document: OGC 09-048r3 OGC Name of this document: http://www.opengis.net/doc/pol-nts/def-1/1.1 Version: 1.1 Category: OpenGIS Policy

More information

Dictionary Driven Exchange Content Assembly Blueprints

Dictionary Driven Exchange Content Assembly Blueprints Dictionary Driven Exchange Content Assembly Blueprints Concepts, Procedures and Techniques (CAM Content Assembly Mechanism Specification) Author: David RR Webber Chair OASIS CAM TC January, 2010 http://www.oasis-open.org/committees/cam

More information

Open Geospatial Consortium

Open Geospatial Consortium Open Geospatial Consortium Date: 28-March-2011 Reference number of this document: 10-195 Editors: OGC Aviation Domain Working Group Requirements for Aviation Metadata Copyright 2011 Open Geospatial Consortium.

More information

Rolling work programme for INSPIRE maintenance and implementation

Rolling work programme for INSPIRE maintenance and implementation INSPIRE Maintenance and Implementation Group (MIG) Rolling work programme for INSPIRE maintenance and implementation Creator Michael Lutz Date of last update 2013-12-16 Subject Publisher Type Description

More information

ISO. International Organization for Standardization. ISO/IEC JTC 1/SC 32 Data Management and Interchange WG4 SQL/MM. Secretariat: USA (ANSI)

ISO. International Organization for Standardization. ISO/IEC JTC 1/SC 32 Data Management and Interchange WG4 SQL/MM. Secretariat: USA (ANSI) ISO/IEC JTC 1/SC 32 N 0736 ISO/IEC JTC 1/SC 32/WG 4 SQL/MM:VIE-006 January, 2002 ISO International Organization for Standardization ISO/IEC JTC 1/SC 32 Data Management and Interchange WG4 SQL/MM Secretariat:

More information

GUIDELINE NUMBER E-NAVIGATION TECHNICAL SERVICES DOCUMENTATION GUIDELINE

GUIDELINE NUMBER E-NAVIGATION TECHNICAL SERVICES DOCUMENTATION GUIDELINE ENAV20-9.23 IALA GUIDELINE GUIDELINE NUMBER E-NAVIGATION TECHNICAL SERVICES DOCUMENTATION GUIDELINE Edition x.x Date (of approval by Council) Revokes Guideline [number] DOCUMENT REVISION Revisions to this

More information

Service metadata validation in Spatineo Monitor

Service metadata validation in Spatineo Monitor Service metadata validation in Spatineo Monitor Ilkka Rinne Spatineo Inc. INSPIRE MIG validation workshop JRC/Ispra, 15th & 16th May 2014 Spatineo Linnankoskenkatu 16 A 17 FI-00250 Helsinki +358 20 703

More information

D WSMO Data Grounding Component

D WSMO Data Grounding Component Project Number: 215219 Project Acronym: SOA4All Project Title: Instrument: Thematic Priority: Service Oriented Architectures for All Integrated Project Information and Communication Technologies Activity

More information

INSPIRE & Environment Data in the EU

INSPIRE & Environment Data in the EU INSPIRE & Environment Data in the EU Andrea Perego Research Data infrastructures for Environmental related Societal Challenges Workshop @ pre-rda P6 Workshops, Paris 22 September 2015 INSPIRE in a nutshell

More information

1. CONCEPTUAL MODEL 1.1 DOMAIN MODEL 1.2 UML DIAGRAM

1. CONCEPTUAL MODEL 1.1 DOMAIN MODEL 1.2 UML DIAGRAM 1 1. CONCEPTUAL MODEL 1.1 DOMAIN MODEL In the context of federation of repositories of Semantic Interoperability s, a number of entities are relevant. The primary entities to be described by ADMS are the

More information

Framework specification, logical architecture, physical architecture, requirements, use cases.

Framework specification, logical architecture, physical architecture, requirements, use cases. Title: A5.2-D3 3.3.1 Alignment Editor Specification Editor(s)/Organisation(s): Thorsten Reitz (Fraunhofer IGD) Contributing Authors: Thorsten Reitz (Fraunhofer IGD), Marian de Vries (TUD) References: A1.8-D4

More information

Metadata of geographic information

Metadata of geographic information Metadata of geographic information Kai Koistinen Management of environmental data and information 4.10.2017 Topics Metadata of geographic information What is metadata? Metadata standards and recommendations

More information

Standards, standardisation & INSPIRE Status, issues, opportunities

Standards, standardisation & INSPIRE Status, issues, opportunities Standards, standardisation & INSPIRE Status, issues, opportunities INSPIRE Coordination Team 6 th MIG meeting, 13-14 June 2017 Joint Research Centre The European Commission's science and knowledge service

More information

TestCases for the SCA Assembly Model Version 1.1

TestCases for the SCA Assembly Model Version 1.1 TestCases for the SCA Assembly Model Version 1.1 Committee Specification Draft 04 / Public Review Draft 03 21 June 2011 Specification URIs This version: http://docs.oasis-open.org/opencsa/sca-assembly/sca-assembly-1.1-testcases-csprd03.pdf

More information

Distributed Multitiered Application

Distributed Multitiered Application Distributed Multitiered Application Java EE platform uses a distributed multitiered application model for enterprise applications. Logic is divided into components https://docs.oracle.com/javaee/7/tutorial/overview004.htm

More information

ETSI TS V ( )

ETSI TS V ( ) TECHNICAL SPECIFICATION Universal Mobile Telecommunications System (UMTS); LTE; Presentation layer for 3GPP services () 1 Reference RTS/TSGS-0426307vf00 Keywords LTE,UMTS 650 Route des Lucioles F-06921

More information

Guidelines for the encoding of spatial data

Guidelines for the encoding of spatial data INSPIRE Infrastructure for Spatial Information in Europe Guidelines for the encoding of spatial data Title Status Creator Date 2012-06-15 Subject Publisher Type Description Contributor Format Source Rights

More information

IVI. Interchangeable Virtual Instruments. IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification. Page 1

IVI. Interchangeable Virtual Instruments. IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification. Page 1 IVI Interchangeable Virtual Instruments IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification March, 2008 Edition Revision 1.0.1 Page 1 Important Information The IVI Measurement and Stimulus

More information

Solution Architecture Template (SAT) Design Guidelines

Solution Architecture Template (SAT) Design Guidelines Solution Architecture Template (SAT) Design Guidelines Change control Modification Details Version 2.0.0 Alignment with EIRA v2.0.0 Version 1.0.0 Initial version ISA² Action - European Interoperability

More information

Study and guidelines on Geospatial Linked Data as part of ISA Action 1.17 Resource Description Framework

Study and guidelines on Geospatial Linked Data as part of ISA Action 1.17 Resource Description Framework DG Joint Research Center Study and guidelines on Geospatial Linked Data as part of ISA Action 1.17 Resource Description Framework 6 th of May 2014 Danny Vandenbroucke Diederik Tirry Agenda 1 Introduction

More information

Guidelines for Interface Publication Issue 3

Guidelines for Interface Publication Issue 3 Editorial Note : These Guidelines are based on the OFTEL Guidelines Issue 2, which provided guidance on interface publication under the R&TTE Directive. They have now been amended to reflect the terminology

More information

Action : Streamlining the monitoring and reporting for 2019

Action : Streamlining the monitoring and reporting for 2019 INSPIRE MIWP/2016.2/REVIEW/1.0 Infrastructure for Spatial Information in Europe Action 2016.2: Streamlining the monitoring and reporting for 2019 Review of the Monitoring & Reporting Decision (2009/442/EC)

More information

INSPIRE Coverage Types

INSPIRE Coverage Types INSPIRE Infrastructure for Spatial Information in Europe INSPIRE Coverage Types Title Status Creator Date 2012-06-15 Subject Publisher Type Description Contributor Format Source Rights Identifier Language

More information

Integration of INSPIRE & SDMX data infrastructures for the 2021 population and housing census

Integration of INSPIRE & SDMX data infrastructures for the 2021 population and housing census Integration of INSPIRE & SDMX data infrastructures for the 2021 population and housing census Nadezhda VLAHOVA, Fabian BACH, Ekkehard PETRI *, Vlado CETL, Hannes REUTER European Commission (*ekkehard.petri@ec.europa.eu

More information

For each use case, the business need, usage scenario and derived requirements are stated. 1.1 USE CASE 1: EXPLORE AND SEARCH FOR SEMANTIC ASSESTS

For each use case, the business need, usage scenario and derived requirements are stated. 1.1 USE CASE 1: EXPLORE AND SEARCH FOR SEMANTIC ASSESTS 1 1. USE CASES For each use case, the business need, usage scenario and derived requirements are stated. 1.1 USE CASE 1: EXPLORE AND SEARCH FOR SEMANTIC ASSESTS Business need: Users need to be able to

More information

Testing - an essential aspect of establishing an SDI

Testing - an essential aspect of establishing an SDI Testing - an essential aspect of establishing an SDI Clemens Portele, Anders Östman, Michael Koutroumpas, Xin He, Janne Kovanen, Markus Schneider, Andriani Skopeliti INSPIRE Conference 2011 30 June 2011

More information

DCMI Abstract Model - DRAFT Update

DCMI Abstract Model - DRAFT Update 1 of 7 9/19/2006 7:02 PM Architecture Working Group > AMDraftUpdate User UserPreferences Site Page Actions Search Title: Text: AttachFile DeletePage LikePages LocalSiteMap SpellCheck DCMI Abstract Model

More information

The UK Marine Environmental Data and Information Network MEDIN

The UK Marine Environmental Data and Information Network MEDIN The UK Marine Environmental Data and Information Network MEDIN M. Charlesworth, R. Lowry, H. Freeman, J. Rapaport, B Seeley Content MEDIN - a brief overview for context Discovery Metadata Standard and

More information

Proposed Revisions to ebxml Technical. Architecture Specification v1.04

Proposed Revisions to ebxml Technical. Architecture Specification v1.04 Proposed Revisions to ebxml Technical Architecture Specification v1.04 Business Process Team 11 May 2001 (This document is the non-normative version formatted for printing, July 2001) Copyright UN/CEFACT

More information

INSPIRE roadmap and architecture: lessons learned INSPIRE 2017

INSPIRE roadmap and architecture: lessons learned INSPIRE 2017 INSPIRE roadmap and architecture: lessons learned INSPIRE 2017 Stijn Goedertier GIM Thierry Meessen GIM Jeff Konnen ACT Luxembourg Patrick Weber ACT Luxembourg 1 Administration du cadastre et de la topographie

More information

Test Assertions for the SCA Web Service Binding Version 1.1 Specification

Test Assertions for the SCA Web Service Binding Version 1.1 Specification Test Assertions for the SCA Web Service Binding Version 1.1 Specification Working Draft 02 7 October 2009 Specification URIs: This Version: http://docs.oasis-open.org/sca-bindings/sca-wsbinding-1.1-test-assertions-cd01.html

More information

A Standards-Based Registry/Repository Using UK MOD Requirements as a Basis. Version 0.3 (draft) Paul Spencer and others

A Standards-Based Registry/Repository Using UK MOD Requirements as a Basis. Version 0.3 (draft) Paul Spencer and others A Standards-Based Registry/Repository Using UK MOD Requirements as a Basis Version 0.3 (draft) Paul Spencer and others CONTENTS 1 Introduction... 3 1.1 Some Terminology... 3 2 Current Situation (Paul)...4

More information

Test Assertions for the SCA Assembly Model Version 1.1 Specification

Test Assertions for the SCA Assembly Model Version 1.1 Specification Test Assertions for the SCA Assembly Model Version 1.1 Specification Committee Draft 03 10 August 2010 Specification URIs: This Version: http://docs.oasis-open.org/opencsa/sca-assembly/sca-assembly-1.1-test-assertions-cd03.html

More information

Proposed Revisions to ebxml Technical Architecture Specification v ebxml Business Process Project Team

Proposed Revisions to ebxml Technical Architecture Specification v ebxml Business Process Project Team 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 Proposed Revisions to ebxml Technical Architecture Specification v1.0.4 ebxml Business Process Project Team 11

More information

European Commission. Immigration Portal Development Case. Date: 08/06/2007 Version: 1.0 Authors: Revised by: Approved by: Public: Reference Number:

European Commission. Immigration Portal Development Case. Date: 08/06/2007 Version: 1.0 Authors: Revised by: Approved by: Public: Reference Number: EUROPEAN COMMISSION DIRECTORATE-GENERAL INFORMATICS Information systems Directorate European Commission Immigration Portal Development Case Date: 08/06/2007 Version: 1.0 Authors: Revised by: Approved by:

More information

A tutorial report for SENG Agent Based Software Engineering. Course Instructor: Dr. Behrouz H. Far. XML Tutorial.

A tutorial report for SENG Agent Based Software Engineering. Course Instructor: Dr. Behrouz H. Far. XML Tutorial. A tutorial report for SENG 609.22 Agent Based Software Engineering Course Instructor: Dr. Behrouz H. Far XML Tutorial Yanan Zhang Department of Electrical and Computer Engineering University of Calgary

More information

Draft ETSI EN V1.0.0 ( )

Draft ETSI EN V1.0.0 ( ) Draft EN 319 522-4-3 V1.0.0 (2018-05) Electronic Signatures and Infrastructures (ESI); Electronic Registered Delivery Services; Part 4: Bindings; Sub-part 3: Capability/requirements bindings 2 Draft EN

More information

U.S. Department of Defense. High Level Architecture Interface Specification. Version 1.3

U.S. Department of Defense. High Level Architecture Interface Specification. Version 1.3 U.S. Department of Defense High Level Architecture Interface Specification Version 1.3 2 April 1998 Contents 1. Overview... 1 1.1 Scope...1 1.2 Purpose...1 1.3 Background...1 1.3.1 HLA federation object

More information

ORCA-Registry v2.4.1 Documentation

ORCA-Registry v2.4.1 Documentation ORCA-Registry v2.4.1 Documentation Document History James Blanden 26 May 2008 Version 1.0 Initial document. James Blanden 19 June 2008 Version 1.1 Updates for ORCA-Registry v2.0. James Blanden 8 January

More information

Intelligence Community and Department of Defense Content Discovery & Retrieval Integrated Project Team (CDR IPT)

Intelligence Community and Department of Defense Content Discovery & Retrieval Integrated Project Team (CDR IPT) Intelligence Community and Department of Defense Content Discovery & Retrieval Integrated Project Team (CDR IPT) IC/DoD REST Encoding Specification for CDR Brokered Search v1.1 12 May 2011 REVISION/HISTORY

More information

SCA JMS Binding v1.1 TestCases Version 1.0

SCA JMS Binding v1.1 TestCases Version 1.0 SCA JMS Binding v1.1 TestCases Version 1.0 Committee Specification Draft 01 / Public Review Draft 01 8 November 2010 Specification URIs: This Version: http://docs.oasis-open.org/opencsa/sca-bindings/sca-jmsbinding-1.1-testcases-1.0-csprd01.html

More information

AMWA Specification. AMWA Specification Policy Application Specification UL Guidelines May 24, 2016 (rev 1.1) Executive Summary

AMWA Specification. AMWA Specification Policy Application Specification UL Guidelines May 24, 2016 (rev 1.1) Executive Summary AMWA Specification AMWA Specification Policy Application Specification UL Guidelines May 24, 2016 (rev 1.1) Executive Summary This document describes requirements and recommended practices for creating

More information

Intel Authoring Tools for UPnP* Technologies

Intel Authoring Tools for UPnP* Technologies Intel Authoring Tools for UPnP* Technologies (Version 1.00, 05-07-2003) INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS. NO LICENSE, EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE,

More information

Conformance Testing. Service Offering Description

Conformance Testing. Service Offering Description EUROPEAN COMMISSION DIGIT Connecting Europe Facility Conformance Testing Service Offering Description [Subject] Version [3.0] Status [Final] Date: 12/06/2017 Document Approver(s): Approver Name CORNEAU

More information

Apache Wink Developer Guide. Draft Version. (This document is still under construction)

Apache Wink Developer Guide. Draft Version. (This document is still under construction) Apache Wink Developer Guide Software Version: 1.0 Draft Version (This document is still under construction) Document Release Date: [August 2009] Software Release Date: [August 2009] Apache Wink Developer

More information

Title: Author(s)/Organisation(s): Working Group: References: Quality Assurance: A5.2-D3 [3.7] Information Grounding Service Component Specification

Title: Author(s)/Organisation(s): Working Group: References: Quality Assurance: A5.2-D3 [3.7] Information Grounding Service Component Specification Title: A5.2-D3 [3.7] Information Grounding Service Component Specification Author(s)/Organisation(s): Ana Belén Antón/ETRA Working Group: Architecture Team/WP05 References: A1.8-D5 User Involvement Document,

More information

Basic Principles of MedWIS - WISE interoperability

Basic Principles of MedWIS - WISE interoperability Co-ordination committee seminar of the national focal points Basic Principles of MedWIS - WISE interoperability Eduardo García ADASA Sistemas Nice - France Agenda WISE vs MedWIS WISE WISE DS WISE vs WISE

More information

Beginning To Define ebxml Initial Draft

Beginning To Define ebxml Initial Draft Beginning To Define ebxml Initial Draft File Name Version BeginningToDefineebXML 1 Abstract This document provides a visual representation of how the ebxml Architecture could work. As ebxml evolves, this

More information

Experiences with. data for use in apps

Experiences with. data for use in apps Experiences with publishing INSPIRE data for use in apps Presentation to: Author: Date: INSPIRE Conference 2014 Clemens Portele 2014 06 18 From INSPIRE Conference 2013: "How to use INSPIRE data?" INSPIRE

More information

European Platform on Rare Diseases Registration

European Platform on Rare Diseases Registration The European Commission s science and knowledge service Joint Research Centre European Platform on Rare Diseases Registration Simona Martin Agnieszka Kinsner-Ovaskainen Monica Lanzoni Andri Papadopoulou

More information

ISO 2146 INTERNATIONAL STANDARD. Information and documentation Registry services for libraries and related organizations

ISO 2146 INTERNATIONAL STANDARD. Information and documentation Registry services for libraries and related organizations INTERNATIONAL STANDARD ISO 2146 Third edition 2010-04-15 Information and documentation Registry services for libraries and related organizations Information et documentation Services de registre pour les

More information

Information Technology Document Schema Definition Languages (DSDL) Part 1: Overview

Information Technology Document Schema Definition Languages (DSDL) Part 1: Overview ISO/IEC JTC 1/SC 34 Date: 2008-09-17 ISO/IEC FCD 19757-1 ISO/IEC JTC 1/SC 34/WG 1 Secretariat: Japanese Industrial Standards Committee Information Technology Document Schema Definition Languages (DSDL)

More information

Detailed analysis + Integration plan

Detailed analysis + Integration plan Outline Integration methodology Detailed analysis + Integration plan Conclusions 2 Outline Integration methodology Detailed analysis + Integration plan Conclusions 3 EULF-ISA Integration: methodology Phase

More information

(Geo)DCAT-AP Status, Usage, Implementation Guidelines, Extensions

(Geo)DCAT-AP Status, Usage, Implementation Guidelines, Extensions (Geo)DCAT-AP Status, Usage, Implementation Guidelines, Extensions HMA-AWG Meeting ESRIN (Room D) 20. May 2016 Uwe Voges (con terra GmbH) GeoDCAT-AP European Data Portal European Data Portal (EDP): central

More information

Test Assertions Part 1 - Test Assertions Model Version 1.0

Test Assertions Part 1 - Test Assertions Model Version 1.0 Test Assertions Part 1 - Test Assertions Model Version 1.0 Draft 1.0.3 20 January 2010 Specification URIs: This Version: Previous Version: [N/A] Latest Version: http://docs.oasis-open.org/tag/model/v1.0/testassertionsmodel-1.0.html

More information

Addressing the needs of INSPIRE: The Challenges of improving Interoperability within the European Union

Addressing the needs of INSPIRE: The Challenges of improving Interoperability within the European Union Addressing the needs of INSPIRE: The Challenges of improving Interoperability within the European Union Andrew Coote Facilitator, Addresses Thematic Working Group andrew.coote@consultingwhere.com Disclaimer

More information

INSPIRE Infrastructure for Spatial Information in Europe

INSPIRE Infrastructure for Spatial Information in Europe INSPIRE Infrastructure for Spatial Information in Europe INSPIRE Domain Model Title Creator INSPIRE Domain Model IOC Services Team Date 23-03-2010 Subject Status Publisher Type Description Format Source

More information

Extending INSPIRE Code Lists and Application Schemas

Extending INSPIRE Code Lists and Application Schemas Extending INSPIRE Code Lists and Application Schemas Astrid Feichtner 1 Roland Wanninger 2 Markus Seifert 1 1 Secretariat for Spatial Data Infrastructure in Bavaria 2 Bavarian State Office for the Conservation

More information

PRINCIPLES AND FUNCTIONAL REQUIREMENTS

PRINCIPLES AND FUNCTIONAL REQUIREMENTS INTERNATIONAL COUNCIL ON ARCHIVES PRINCIPLES AND FUNCTIONAL REQUIREMENTS FOR RECORDS IN ELECTRONIC OFFICE ENVIRONMENTS RECORDKEEPING REQUIREMENTS FOR BUSINESS SYSTEMS THAT DO NOT MANAGE RECORDS OCTOBER

More information

Toward Horizon 2020: INSPIRE, PSI and other EU policies on data sharing and standardization

Toward Horizon 2020: INSPIRE, PSI and other EU policies on data sharing and standardization Toward Horizon 2020: INSPIRE, PSI and other EU policies on data sharing and standardization www.jrc.ec.europa.eu Serving society Stimulating innovation Supporting legislation The Mission of the Joint Research

More information

Simile Tools Workshop Summary MacKenzie Smith, MIT Libraries

Simile Tools Workshop Summary MacKenzie Smith, MIT Libraries Simile Tools Workshop Summary MacKenzie Smith, MIT Libraries Intro On June 10 th and 11 th, 2010 a group of Simile Exhibit users, software developers and architects met in Washington D.C. to discuss the

More information

Framework for building information modelling (BIM) guidance

Framework for building information modelling (BIM) guidance TECHNICAL SPECIFICATION ISO/TS 12911 First edition 2012-09-01 Framework for building information modelling (BIM) guidance Cadre pour les directives de modélisation des données du bâtiment Reference number

More information

European Conference on Quality and Methodology in Official Statistics (Q2008), 8-11, July, 2008, Rome - Italy

European Conference on Quality and Methodology in Official Statistics (Q2008), 8-11, July, 2008, Rome - Italy European Conference on Quality and Methodology in Official Statistics (Q2008), 8-11, July, 2008, Rome - Italy Metadata Life Cycle Statistics Portugal Isabel Morgado Methodology and Information Systems

More information

INSPIRE 2013, Florence

INSPIRE 2013, Florence Joining up INSPIRE XML and Core Location RDF schemas to interconnect Belgian address data INSPIRE 2013, Florence 25 June 2013 Stijn.Goedertier@pwc.be Andrea Perego Michael Lutz Nikolaos Loutas Vassilios

More information

CWIC Data Partner s Guide (OpenSearch) Approval Date:

CWIC Data Partner s Guide (OpenSearch) Approval Date: CEOS CWIC Project CWIC Data Partner s Guide (OpenSearch) Approval Date: 2017-05-09 Publication Date: 2017-05-10 Reference number of this Document: CWIC-DOC-14-001r010 Document version: V1.0 Category: CWIC

More information

MIG-T - Discussion #2402 MIWP-7b: Extension of Download Service Technical Guidelines for Web Coverage Services (WCS)

MIG-T - Discussion #2402 MIWP-7b: Extension of Download Service Technical Guidelines for Web Coverage Services (WCS) MIG-T - Discussion #2402 MIWP-7b: Extension of Download Service Technical Guidelines for Web Coverage Services (WCS) 30 Mar 2015 11:10 am - Michael Lutz Status: Priority: Assignee: INSPIRE Theme: Description

More information

Java J Course Outline

Java J Course Outline JAVA EE - J2SE - CORE JAVA After all having a lot number of programming languages. Why JAVA; yet another language!!! AND NOW WHY ONLY JAVA??? CHAPTER 1: INTRODUCTION What is Java? History Versioning The

More information

Using the OGC SOS as INSPIRE Download Service for Observation Data

Using the OGC SOS as INSPIRE Download Service for Observation Data Using the OGC SOS as INSPIRE Download Service for Observation Data Simon Jirka (52 North) Alexander Kotsev (JRC) Michael Lutz (JRC) Matthes Rieke (52 North) Robin Smith (JRC) Paul Smits (JRC) 18 th June

More information

FP7-INFRASTRUCTURES Grant Agreement no Scoping Study for a pan-european Geological Data Infrastructure D 4.4

FP7-INFRASTRUCTURES Grant Agreement no Scoping Study for a pan-european Geological Data Infrastructure D 4.4 FP7-INFRASTRUCTURES-2012-1 Grant Agreement no. 312845 Scoping Study for a pan-european Geological Data Infrastructure D 4.4 Report on recommendations for implementation of the EGDI Deliverable number D4.4

More information

How to Create a European INSPIRE Compliant Data Specification. Anja Hopfstock, BKG (Germany) Morten Borrebæk, SK (Norway)

How to Create a European INSPIRE Compliant Data Specification. Anja Hopfstock, BKG (Germany) Morten Borrebæk, SK (Norway) How to Create a European INSPIRE Compliant Data Specification Anja Hopfstock, BKG (Germany) Morten Borrebæk, SK (Norway) ESDIN Key Goals Further the ambition of the European Commission to create a European

More information