A5.2-D3 [3.6] HUMBOLDT Processing Components General Model and Implementations


Title: A5.2-D3 [3.6] HUMBOLDT Processing Components General Model and Implementations

Author(s)/Organisation(s): Daniel Fitzner (FhG), Jan Jezek (HSRS), Jan Kolar (INGR), Thorsten Reitz (FhG)

Working Group: Architecture Team / WP5

References:
A5.2-D3 [3.0] A Lightweight Introduction to the HUMBOLDT Framework V3
A5.2-D3 [3.1] Specification Introduction and Overview V3
A5.2-D3 [3.2] Mediator Service Component Specification
A5.2-D3 [3.3] Conceptual Schema Specification and Mapping
A5.2-D3 [3.3.1] The HUMBOLDT Alignment Editor
A5.2-D3 [3.4] Context Service Specification
A5.2-D3 [3.5] Workflow Design and Construction Service Specification
A5.2-D3 [3.6] Processing Components General Model and Implementations
A5.2-D3 [3.7] Information Grounding Service Component Specification
A5.3-D3 Humboldt Commons Specification / Framework Common Data Model V3

Quality Assurance:
Review WP Leader: Thorsten Reitz (FhG)
Review dependent WP leaders: Moses Gone (FhG)
Review Executive Board:
Review others:

Delivery Date:

Short Description: This document gives information on processing components, an essential group of software components within the HUMBOLDT Data Harmonisation Framework. The document is intended to give an overview of the functionality and scope of the individual HUMBOLDT harmonisation processing components. Further, it gives information for developers on how to extend the HUMBOLDT framework with additional processing components.

Keywords: Harmonisation, geoprocessing, web services, transformers, processing components

History:
Version 001, Jan Jezek (New): Initial version
Version 002, Daniel Fitzner / Thorsten Reitz / Jan Jezek: updated interface structure
Version 003, Thorsten Reitz: Updated according to RM-ODP
Version 004, Daniel Fitzner (final): Finalized

Table of contents

1 Introduction
   1.1 Purpose of this document
   1.2 Abbreviations used in this document
   1.3 Definitions valid in this document
   1.4 Standards used in this document
       OGC Web Feature Service (WFS)
       OGC Geographic Markup Language (GML)
       OGC Web Processing Service (WPS)
       ISO 19115:2003
2 Enterprise Viewpoint
   2.1 Requirements
       2.1.1 Metadata Requirements Cluster
       2.1.2 Language Translation Requirements Cluster
       2.1.3 Edge Matching Requirements Cluster
       2.1.4 Conceptual Schema Translation Requirements Cluster
       2.1.5 Coordinate Transformation Requirements Cluster
       2.1.6 Scale Transformation Requirements Cluster
       2.1.7 General Requirements
   2.2 Business Processes
       2.2.1 Simple data integration process
       2.2.2 Complex data integration process
3 Computational and Service Viewpoint
   3.1 Types of processing components in HUMBOLDT
   3.2 Guidelines for WDCS Process Registration
       3.2.1 Execution Relevant Metadata held by the WDCS
       3.2.2 Composition relevant Metadata for non-harmonisation processing components
       3.2.3 Additional metadata required for harmonisation processing components
       3.2.4 Harmonisation Categories and Generic Interfaces
4 HUMBOLDT Processing Components
   4.1 Conceptual Schema Transformer
       4.1.1 Scenario Integration Example
       4.1.2 Requirements
           4.1.2.1 INSPIRE requirements
           4.1.2.2 Requirements on processing capabilities
       4.1.3 Automatic Workflow handling
       4.1.4 Interface Control Document for WPS interface
       4.1.5 References
   4.2 Language Transformation Specification
       4.2.1 What can be transformed and how?
           4.2.1.1 Multilingual User Interfaces

           4.2.1.2 Data in multiple languages
           4.2.1.3 Queries
       4.2.2 Scenario Integration
       4.2.3 Architecture
       4.2.4 The HUMBOLDT Language Transforming Processor
       4.2.5 Automated Workflow Handling
       4.2.6 The HUMBOLDT Language Transformer
           4.2.6.1 Language Transformer operations
           4.2.6.2 Implementation
       4.2.7 Interface Control Document for WPS Interface
   4.3 Multiple Representation Merger
       4.3.1 Scenario Integration
       4.3.2 Requirements for the Component
       4.3.3 Automated Workflow Handling
       4.3.4 Implementation
       4.3.5 Interface Control Document for WPS interface
   4.4 Coordinate Transformation Service
       4.4.1 Scenario Integration
       4.4.2 Requirements
       4.4.3 Relationship to the INSPIRE requirements for coordinate transformation
       4.4.4 Automatic Workflow handling
       4.4.5 Implementation
       4.4.6 Interface Control Document for WPS interface
   4.5 The HUMBOLDT Edge Matching Service
       4.5.1 Scenario Integration
       4.5.2 Requirements for the Component
       4.5.3 Automatic Workflow handling
       4.5.4 Interface Control Document
5 Technology Viewpoint
   5.1 Guidelines for Process Implementation
       5.1.1 Direct implementation on the Mediator platform
       5.1.2 Registration to the Mediator Service
       Libraries for OGC WPS implementation
       Example Implementation on the Mediator platform
       Registration to the WDCS
6 Interface Control Documents for WPS interfaces
   Schema Transformation
   Coordinate Transformation
   Edge Matching (Data Set Cleaning)
   Edge Matching (CoverageAligner)

List of Figures

Figure 1: BPMN diagram of a simple data integration process
Figure 2: The general setting
Figure 3: The relation of concrete Model Descriptions to the Transformation Definition
Figure 4: Language Translation Architecture
Figure 5: Language Transformer operations
Figure 6: Processing chain constructed to fuse two data sets in Protected Areas
Figure 7: Administrative Boundaries Italy 1:25.000, Ligure 1:5.000 and Rivers 1:
Figure 8: Administrative Boundaries Ligure 1:5.000 and Special Protected Areas Zoning
Figure 9: Administrative Boundaries Italy 1: from two sources, with a visible translation because of different reference system transformations
Figure 10: The Transformer interface structure

List of Tables

Table 1: Language Transformer operations
Table 2: Parameters of a CRS transformation
Table 3: Semantic Annotation of inputs (parameter mapping)

1 Introduction

This document describes processing components, an essential group of software components within the HUMBOLDT Data Harmonisation Framework. It is intended to give an overview of the functionality and scope of the individual HUMBOLDT harmonisation processing components and services. Further, it gives information for developers on how to extend the HUMBOLDT framework with additional components / services.

1.1 Purpose of this document

This document describes processing components, an essential group of software components within the HUMBOLDT Data Harmonisation Framework. It also establishes common architectural elements for processing services to be used in the HUMBOLDT Framework, especially for harmonisation processing services, to make them usable by other components such as the Mediator Service (MS) and the Workflow Design and Construction Service (WDCS). The terms processing component and Transformer are used synonymously in the following.

Specifically, this document contains information on the following processing services:

Conceptual Schema Transformer (CST): The CST is a service that is able to apply a schema transformation to a geodata set (expressed in a specific source Application Schema A) in order to provide a geodata set (expressed in a specific target Application Schema B). A schema mapping between schema A and schema B has to be defined beforehand in order to accomplish the transformation.

Language Transformer (LT): The LT is capable of transforming attribute values expressed in one natural language to another, as long as the words are recognized, i.e. available in one of the thesauri that the LT accesses.

Multiple Representation Merger (MRM): The MRM is used to combine geographic data sets both horizontally (e.g. across an administrative boundary) and vertically (e.g. across different scales) and contains rules on how to fuse features occurring multiple times.

Coordinate Transformation Service (CTS): The CTS is a simple reprojection service that handles some of the more specific reference systems used in the HUMBOLDT scenarios.

Edge Matching Service (EMS): The EMS ensures geometric consistency, especially for merged data sets, by aligning edges and points that should have an identical shape.

In addition to the functional descriptions of these processing service components, this document also contains information on the implementation of such services and on the requirements that they have to fulfil so that automated workflow assembly and execution become possible. From the perspective of users who are working with geoinformation, the processing components are made accessible through Web Processing Service (WPS) clients, which are becoming more and more common. If an end user directly wishes to use one of these services, this is therefore relatively easy to accomplish using off-the-shelf software.

Please note that this document, in contrast to other specifications, does not contain an Information Viewpoint (it references structures specified elsewhere, such as the goml defined in the Conceptual Schema Mapping document and the Lineage Model from the Mediator Service), but does contain a Technology Viewpoint.

1.2 Abbreviations used in this document

This section summarizes the abbreviations used specifically for this service component. It does not repeat information found in the introduction and specification overview document [3.0].

1.3 Definitions valid in this document

There are no specific definitions used within the scope of this document. Any definitions that are given as part of the Specification Introduction and Overview document [3.0] are also valid in this document.

1.4 Standards used in this document

The Mediator Service Component makes use of some of the core standards in geoinformation for the provision of maps and other products, as well as raw data. Most notably, these are:

OGC Web Feature Service (WFS): The Web Feature Service is used for the provisioning of data to the processing service components.

OGC Geographic Markup Language (GML): GML 3.1 is used as the primary geodata exchange model and encoding for the processing service components.

OGC Web Processing Service (WPS): The WPS is used as the basic interface for the Transformer specification, which is used as the interface from the Mediator Service to processing capabilities.

ISO 19115:2003: Many of the metadata elements described in ISO 19115 are also used in HUMBOLDT metadata management. In fact, the HUMBOLDT metadata model is a profile of ISO 19115.
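Because the Transformer specification builds on the OGC WPS interface, a client typically discovers a processing component through the standard WPS GetCapabilities and DescribeProcess operations. The following minimal Java sketch shows how such key-value-pair (HTTP GET) requests are assembled; the endpoint URL and process identifier are placeholders for illustration only, not actual HUMBOLDT service addresses.

```java
// Minimal sketch of OGC WPS 1.0.0 KVP (HTTP GET) discovery requests.
// The endpoint and identifier below are placeholders, not real HUMBOLDT endpoints.
public class WpsDiscoveryRequests {

    public static void main(String[] args) {
        String endpoint = "http://example.org/wps";            // hypothetical WPS endpoint
        String processId = "ConceptualSchemaTransformer";      // hypothetical process identifier

        String getCapabilities = endpoint
                + "?service=WPS&version=1.0.0&request=GetCapabilities";
        String describeProcess = endpoint
                + "?service=WPS&version=1.0.0&request=DescribeProcess&identifier=" + processId;

        System.out.println(getCapabilities);
        System.out.println(describeProcess);
    }
}
```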

2 Enterprise Viewpoint

The Enterprise Viewpoint describes the functional purpose of the component and its integration with business processes. This chapter first provides the directly relevant requirements and then makes use of BPMN. UML 2.0 Use Cases are given for the individual transformers as described in chapter 4.

2.1 Requirements

Many of the requirements that were collected from the HUMBOLDT scenarios directly indicate the need to perform different types of data processing, both for data harmonisation and for domain-specific purposes. Quite a few of these are specific to the business process of the scenario, but there were also requirements that commonly occurred. These requirements were consolidated into several clusters, from which the directly relevant ones are cited in the next sections.

2.1.1 Metadata Requirements Cluster

METADATA01: The SYSTEM shall be able to transform Metadata from existing metadata schemas employed in the scenarios to the HUMBOLDT, ISO, and INSPIRE schemas (OS_001, BSS_080).

2.1.2 Language Translation Requirements Cluster

LANGUAGE01: The SYSTEM shall provide a service for the translation of type names, attribute names, geographical names and attribute values from controlled vocabularies (OS_005, BSS_054, HAR_001, HAR_002, HAR_003, HAR_004).

2.1.3 Edge Matching Requirements Cluster

EMS01: The SYSTEM shall provide a service that can connect broken features such as river or transport networks (ES_003).
EMS02: The SYSTEM shall provide a service that can clean a set of area geometries so that no overlaps or gaps remain within the hull of the set of geometries (ES_003).
EMS03: The SYSTEM shall provide a service that can match polyline features and polygon features to a reference polyline or polygon data set, even if there are only small shared geometric parts (BSS_071, ES_003).
EMS04: In all edge matching processes, the SYSTEM shall guarantee that the topology of dependent data sets in relation to a data set that is modified by the service is maintained, if these dependent data sets are also part of the same harmonisation request (UPS_008, FS_008).
EMS05: The SYSTEM shall ensure that a USER can define whether he will accept automatic edge matching or not (HS-Ocean new requirement).
EMS06: Before any edge matching is performed, the SYSTEM shall perform the required schema translation, coordinate transformation and generalization.

2.1.4 Conceptual Schema Translation Requirements Cluster

CST01: The SYSTEM shall be able to perform transformations of data structure to transform geographic information to a harmonised schema as defined in a scenario (PAS_001, ES_004, OS_004, HAR_013).
CST02: The SYSTEM shall be able to reclassify attributes according to different classifications (ES_015).
CST03: The SYSTEM shall guide the USER and ask him/her to check/validate the transcoding case by case (PAS_002).
CST04: The SYSTEM shall be able to select attributes of individual features or of sets of features out of a data set by applying Filters.
CST05: The SYSTEM shall be able to perform renaming of types and of attributes (see 2. in the CST definition).
CST06: The SYSTEM shall be able to perform reclassification (see 3. in the CST definition).
CST07: The SYSTEM shall be able to perform merging and splitting of Features (see 4. in the CST definition).
CST08: The SYSTEM shall be able to perform type conversions, specifically spatial conversions, UoM conversions and alphanumeric conversions (see 6. in the CST definition).
CST09: The SYSTEM shall be able to assign derived spatial properties such as centroids or bounding boxes.
CST10: The SYSTEM shall be able to assign default and null values where required.

2.1.5 Coordinate Transformation Requirements Cluster

CT01: The SYSTEM shall provide a service for geographical coordinate transformation. Each scenario should define its required accuracy and, if needed, provide specific algorithms.
CT02: The SYSTEM shall document the relative accuracy of performed transformations. Absolute accuracy can often not be defined since it is not known in advance.

2.1.6 Scale Transformation Requirements Cluster

SCALE01: The SYSTEM shall be able to perform automatic generalization to a smaller scale.
SCALE02: The SYSTEM shall be able to resample coverages to match coverages/grids with a different resolution.

2.1.7 General Requirements

PCG01: When performing any transformation, the SYSTEM shall add information on all modifications performed to the data set's metadata. Adding metadata to individual features is not required (ES_005, PA and others).

2.2 Business Processes

Processing components support a wide range of very different business processes, differing within each scenario. Nonetheless, at least for the harmonisation processing components which are the focus of this document, there are common business processes or at least common activities.

2.2.1 Simple data integration process

In this process, a single processing component is used to resolve a specific heterogeneity. This process occurs in most scenarios, especially if there is a need for off-line harmonisation (before the data is served via a download service). Often, multiple simple processes are executed one after the other, but on a single data set, to make it useable in a certain environment, such as via an INSPIRE download service. In this simple process, components as described in this document are invoked directly.

Figure 1: BPMN diagram of a simple data integration process

In the simple integration process as shown in Figure 1, it is assumed that there is a concrete business process that puts requirements on a given data set that this data set does not fulfil. As an example, consider a data set that needs to be made available using an INSPIRE-defined data model, but is currently expressed in a proprietary source model.

This specific heterogeneity needs to be identified (first step). On the basis of the identified heterogeneity, a processing component can be chosen and configured (second step). If no suitable component is available, different means of satisfying the requirement on the data have to be found, such as manual modification using custom tools or individually defined transformations, e.g. based on XSLT. With the processing component configured, the data can now be transformed and used in the concrete business process.

2.2.2 Complex data integration process

In this process, multiple processing components are used to resolve multiple data heterogeneities. This type of process can be used to integrate multiple data sets or to perform on-line harmonisation (i.e. direct provision of the transformed data). In a complex integration process, processing components are invoked in an indirect fashion, via a predefined or automatically created workflow that is executed by the Mediator Service. Therefore, the business process for the complex data integration process is identical to that of the Mediator Service.

3 Computational and Service Viewpoint

This chapter provides information on how processing components are used in the context of the HUMBOLDT data harmonisation framework.

Figure 2: The general setting

The general setting is shown in Figure 2. Individual processing services are either implemented on the platform on which the HUMBOLDT Mediator Service is implemented and can be invoked via direct method invocation (in the case of the reference implementation, via Java method invocation), or they are wrapped as a WPS. The HUMBOLDT Workflow Design and Construction Service holds metadata on the individual processing services and delivers the information necessary for execution to the HUMBOLDT Mediator Service. Section 3.2 gives details on how specific processing components must be registered such that they are accessible to the HUMBOLDT framework components, namely the HUMBOLDT Workflow Design and Construction Service.

3.1 Types of processing components in HUMBOLDT

There are two types of HUMBOLDT processing components:

Harmonisation processing components (or harmonisation transformers) perform some sort of task / harmonisation related to the HUMBOLDT constraint model as specified in the HUMBOLDT Commons document. For example, a process performing spatial reference system transformation is, according to the HUMBOLDT characterisation, a harmonisation processing component.

Non-harmonisation processing components are all others, which perform tasks that do not directly relate to the HUMBOLDT constraint model, e.g. buffer or overlay.

To enable the framework to handle individual, user-defined or user-implemented processing components, several rules exist that these components have to fulfil. These include rules for implementation (please refer to chapter 5 for details) as well as rules for the metadata of the components.

3.2 Guidelines for WDCS Process Registration

The HUMBOLDT framework has decoupled workflow design (WDCS) and execution (MS). Therefore, running or executable instances of processes must be accessible to the MS, while the metadata required for composition must be accessible to the WDCS.

3.2.1 Execution Relevant Metadata held by the WDCS

Execution relevant metadata is equal for harmonisation and non-harmonisation processing components / Transformers. What is passed from the WDCS to the MS is a set of Transformer descriptions, each containing the information on how to access the processing component. Hence, when providing (registering / implementing) a new processing component to HUMBOLDT, the information on how to access the processing component must be provided to the WDCS. This information differs depending on whether the process is implemented in Java or encapsulated in a WPS.

Java Transformer: In case a Transformer is implemented in Java, the only information needed is the class name.

WPS Transformer: In case the process is offered via WPS, the WPS URL and the ProcessIdentifier (as it appears in the GetCapabilities response of the WPS) must be provided.

3.2.2 Composition relevant Metadata for non-harmonisation processing components

Non-harmonisation processing components can be composed by users using the workflow GUI. Therefore, the WDCS must know the names and types of the inputs required by a specific processing component, such that a user can specify e.g. how to instantiate a certain input named bufferdistance. Further, additional metadata on the inputs and outputs is required. This is mainly metadata related to the HUMBOLDT constraint model, e.g. whether the process changes the spatial reference system of the input, whether the output schema is the same as the input schema etc. Details can be found in the WDCS specification.

In case a process is implemented within a WPS and registered to the WDCS by providing the WPS URL and ProcessIdentifier, the information that already appears in the DescribeProcess response of that process (e.g. the names of the inputs, whether they are mandatory or optional etc.) is parsed automatically. However, the additional metadata related to the HUMBOLDT constraint model must be provided manually.

In the case of a Java Transformer, it is important to mention that the names and types of the parameters as expected in the implementation of the execute() operation (see chapter 5) and stored in parameters must correspond to the names and types of the parameters as registered to the WDCS. This is the responsibility of the user implementing and registering a processing component and cannot be checked automatically. A violation would result in a runtime exception within the execute() operation of that individual process, e.g. if a parameter with a certain name is expected but, due to wrong process metadata registered to the WDCS, the parameter is set via a different name.
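To summarise the access information described in sections 3.2.1 and 3.2.2, the following Java sketch models the two registration variants. This is an illustration only: the class and method names are invented here and do not correspond to the actual WDCS registration API, which is defined in the WDCS specification.

```java
// Hypothetical sketch of the access information the WDCS needs per Transformer.
// The real registration mechanism is described in the WDCS specification.
public class TransformerAccessInfo {

    public enum Kind { JAVA, WPS }

    private final Kind kind;
    private final String javaClassName;     // for JAVA transformers: fully qualified class name
    private final String wpsUrl;            // for WPS transformers: service endpoint
    private final String processIdentifier; // for WPS transformers: as in the GetCapabilities response

    private TransformerAccessInfo(Kind kind, String javaClassName, String wpsUrl, String processIdentifier) {
        this.kind = kind;
        this.javaClassName = javaClassName;
        this.wpsUrl = wpsUrl;
        this.processIdentifier = processIdentifier;
    }

    /** A Transformer implemented directly on the Mediator platform (only the class name is needed). */
    public static TransformerAccessInfo forJava(String className) {
        return new TransformerAccessInfo(Kind.JAVA, className, null, null);
    }

    /** A Transformer wrapped as an OGC WPS process (URL plus ProcessIdentifier are needed). */
    public static TransformerAccessInfo forWps(String url, String processIdentifier) {
        return new TransformerAccessInfo(Kind.WPS, null, url, processIdentifier);
    }

    // Usage (illustrative, with invented names):
    //   TransformerAccessInfo cst = TransformerAccessInfo.forJava("eu.humboldt.example.MyTransformer");
    //   TransformerAccessInfo cts = TransformerAccessInfo.forWps("http://example.org/wps", "CoordinateTransformation");
}
```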

3.2.3 Additional metadata required for harmonisation processing components

For harmonisation processing components, the registration process involves all steps required for non-harmonisation processing components, but some additional information needs to be provided. This is due to the fact that, concerning harmonisation processing components, the HUMBOLDT framework aims at dynamic binding. This means that, in case of a constraint violation in the presence of a user request, the WDCS automatically recognizes suitable processing components that can solve the problem and delivers this information to the HUMBOLDT Mediator Service. Hence, in the case of harmonisation processing components, it must be specified which harmonisation category the process belongs to (i.e. the semantics of the process) and what the individual inputs refer to (i.e. the meaning / semantics of the inputs).

3.2.4 Harmonisation Categories and Generic Interfaces

The HUMBOLDT framework comes with a set of harmonisation categories such as SchemaTransformation, SpatialReferenceSystemTransformation etc. Each category comes with a generic signature that allows the WDCS to handle individual harmonisation processes belonging to that category in an automated way. Therefore, in case a harmonisation processing component is registered to the system, it must be specified which harmonisation category it belongs to and how the individual inputs relate to the inputs of the generic signature. See chapter 5 for details.
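Purely as an illustration, the following sketch shows the kind of additional registration information a harmonisation processing component has to provide: its harmonisation category plus a mapping from the generic signature's input names to the names the concrete process actually uses. The category constant and all parameter names used here are examples, not the framework's real identifiers.

```java
import java.util.Map;

// Illustrative only: harmonisation category plus a generic-to-concrete parameter mapping.
// The real category names and generic signatures are defined by the HUMBOLDT framework.
public class HarmonisationRegistrationExample {

    enum HarmonisationCategory { SCHEMA_TRANSFORMATION, SPATIAL_REFERENCE_SYSTEM_TRANSFORMATION }

    public static void main(String[] args) {
        HarmonisationCategory category = HarmonisationCategory.SPATIAL_REFERENCE_SYSTEM_TRANSFORMATION;

        // generic input name (as defined by the category) -> concrete process input name
        Map<String, String> parameterMapping = Map.of(
                "InputData", "data",
                "SourceCRS", "source_srs",
                "TargetCRS", "target_srs");

        System.out.println(category + " " + parameterMapping);
    }
}
```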

4 HUMBOLDT Processing Components

This chapter gives a description of each of the harmonisation processing components specified and implemented as part of the HUMBOLDT harmonisation framework.

4.1 Conceptual Schema Transformer

The Conceptual Schema Transformer is capable of transforming geodata and metadata that is expressed in one conceptual schema to another conceptual schema. A lot of organisations have developed their own geospatial data structures, designed to support their specific business processes. Many organisations, especially public administrations, have different objectives to achieve but similar geospatial data needs, so the basic data sources are very similar but they are organised and structured in different ways. The OGC WFS standard provides a common interface to geospatial data structures through a common XML encoding language, GML, that can be used as transport format. In fact, GML is used to express geographic information in a manner that can be readily shared. Moreover, an application schema (XML schema) can be designed as an extension of the GML core schema in order to describe domain-specific objects.

The partners active in the HUMBOLDT project have employed a wide variety of methods for describing models. These include more implicit descriptions, such as when handling a simple shapefile with attached tabular data, but also very explicit descriptions such as UML models. In order for a model transformation specification to be of practical use in a wide range of scenarios, a loose coupling between the transformation definitions and the model description has to be ensured. In doing so, it is possible to map models described in different ways, such as:

Relational database model descriptions, such as SQL/DDL/IDL, INTERLIS 1
XML model descriptions, such as XSD, Schematron
Object-oriented model descriptions, such as UML, INTERLIS 2
Specific variants of the above, such as GML Application Schema
Ontology languages, such as DAML+OIL, OWL, OWL-DL
Application-specific model description languages, such as shapefile tables, EA models

Figure 3: The relation of concrete Model Descriptions to the Transformation Definition

To define the exact mappings required to transform data from one conceptual schema to another, the CST requires a mapping definition expressed in the Ontology Mapping Language.

This language stems from semantic web research and can be loosely coupled to almost any conceptual schema language, such as UML, OWL, or GML Application Schemas. Together with the fact that the language is very expressive and allows many typical mapping mismatches to be overcome, this made it a very suitable candidate. For further details on the OML and the goml profile that is used by the CST (and HALE), please refer to the Conceptual Schema Specification and Mapping document.

4.1.1 Scenario Integration Example

The following example is taken from the ERiskA scenario. Albert needs to make an updated hydrography dataset that he received available through an INSPIRE download service. However, the data set has not yet been collected to directly match the INSPIRE hydrography conceptual schema. Instead, it follows the Bavarian interpretation of the ALKIS model, which contains only very few Spatial Object Types and makes use of a lot of classification attributes. Albert knows this and has prepared a conceptual schema mapping between the internal schema and the INSPIRE hydrography one. However, he also knows that the data is otherwise of high quality and that no other transformations need to be applied.

Together with the new data set, he sets this mapping file as the input values for the CST installation. For this, he makes use of the WPS client in his GIS application. This generic WPS client has created a user interface based on the DescribeProcess response of the CST installation. After checking that the inputs are correct, Albert starts the request. After a few minutes, the transformed data set is returned. Albert saves it and replaces the old data set offered by the hydrography download service with the updated one.

4.1.2 Requirements

This section gives the requirements for this processing component, starting with the INSPIRE requirements for schema transformation.

4.1.2.1 INSPIRE requirements

Schema Transformation is one of the harmonisation categories foreseen in INSPIRE. Within a document called Implementing Rule for Transformation Services, INSPIRE gives an abstract service model for Transformation Services in general. The HUMBOLDT Conceptual Schema Transformer corresponds to this abstract service model by applying the following operations mapping (INSPIRE Transformation Service operation, with notes with respect to the CST):

GET SERVICE METADATA: Returns information about the capabilities of a given instance of an INSPIRE Model Transformation Service, including information on which optional operations are supported and, if required, which transformation formats are supported. Each instance of a Model Transformation Service will have to declare in its GET SERVICE METADATA response which formats for transformation definitions it will accept when it implements the relevant operations and parameters.

TRANSFORM: Performs the actual transformation of a given data set. Within the context of model transformation, this data set can be either a Query (e.g. WFS Filter Encoding or CSW CQL), Metadata, a Vector Data Set or a Coverage Data Set.

IS TRANSFORMABLE: This operation will check whether the input parameters are complete and consistent, i.e. whether the INPUT DATA actually conforms to a provided SOURCE MODEL. When a MODEL MAPPING is provided, it is verified that this mapping can work with the INPUT DATA in the same manner.

GET TRANSFORMATION: This operation can be used to return a complete, machine-processable definition of a named transformation.

PUT TRANSFORMATION: This operation can be used to upload a complete, machine-processable definition for a named transformation.

Table 2: Notes on the INSPIRE Transformation Service's operations in the context of the CST

Further, within INSPIRE a Technical Guidance document for Schema Transformation is foreseen but not yet available. The HUMBOLDT Conceptual Schema Transformer will be adapted to the requirements on schema transformation imposed in that document as soon as it is available.

4.1.2.2 Requirements on processing capabilities

The requirements analysis made in the HUMBOLDT project suggests that transformation functionality can be categorised based on a classification from Letho (2007). The following table describes several of the key operations which need to be considered in implementing a transformation:

Filter: This operation allows selecting a subset of Features in a data set based on the values of the Features' Properties. Example: select all Features of type A where a given attribute B value is equal to a certain enumeration value.

Rename: Renaming indicates that two spatial object types or property types are equal and that they only carry different labels. Example: both in the target model and in the source model there is a spatial object type called River, respectively Watercourse.

Merge: Generally, merging refers to the reduction of multiple Features or Properties to one Feature or one Property. Examples: concatenation of attribute values such as strings; creation of one Feature with a MultiPolygon geometry out of many Features with a Polygon geometry.

Split: Splitting, as the opposite of merging, refers to the creation of multiple Features or property values from one source Feature or Property. Examples:

splitting of strings, e.g. using a regular expression; creation of multiple Features with a LineString geometry from one Feature with a MultiLineString geometry.

Type Conversion: Examples: spatial type conversions, including from Polygons to Lines and vice versa, from Polygons/Lines to Points and others; alphanumeric type conversions, such as String to Numeric or vice versa.

Value Conversion: Examples: converting from one Unit of Measurement to another; spatial conversions, such as simplifications to conform to constraints of the target model.

Augmentation: Augmentation refers to the derivation of attribute values in the target model on the basis of property values in the source model, or the assignment of default values where this is not possible. Examples: adding derived spatial properties, e.g. a centroid; deriving values for target schema properties when there is no direct mapping available, e.g. by reasoning on the basis of other values (example: a canal is usually a man-made object); filling in default and null property values.

Table 1: Some of the Transformation Types foreseen for the Model Transformation metamodel

4.1.3 Automatic Workflow handling

A need for using conceptual schema transformation is identified when the schema that a given source data set uses is not equal to (one of) the schema(s) indicated in the ThematicConstraint as specified in the context (see the HUMBOLDT Commons Specification for more information). Sometimes, the execution of the MRM and the EMS requires the previous execution of the CST. Further, it is recommended to execute the CST as one of the first steps, with only the CTS possibly before it.

4.1.4 Interface Control Document for WPS interface

The OGC WPS DescribeProcess document for this service can be found in chapter 6.
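To make the operation categories of Table 1 a little more concrete, the following toy Java sketch applies a Filter, a Rename and a reclassification (value conversion) to a single feature represented as a plain attribute map. The attribute names and code values are invented for illustration and do not come from any HUMBOLDT or INSPIRE schema; in the real CST such operations are driven by a goml mapping rather than hard-coded logic.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of Filter, Rename and reclassification on a single feature.
// All attribute names and code values below are invented for this example.
public class SchemaTransformationSketch {

    public static void main(String[] args) {
        Map<String, Object> source = new HashMap<>();
        source.put("GEWAESSER_NAME", "Isar");   // hypothetical source attribute
        source.put("GEW_KLASSE", "1");          // hypothetical classification code

        // Filter: only features of a certain class are transformed at all.
        if (!"1".equals(source.get("GEW_KLASSE"))) {
            return;
        }

        Map<String, Object> target = new HashMap<>();
        // Rename: the source attribute maps 1:1 to a differently named target attribute.
        target.put("geographicalName", source.get("GEWAESSER_NAME"));
        // Reclassification / value conversion: code list value -> target enumeration value.
        target.put("origin", "natural");

        System.out.println(target);
    }
}
```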

4.1.5 References

[1] Schema Translation in a Web Service Based SDI.
[2] OpenGIS Geography Markup Language (GML) Encoding Standard.
[3] Serving North American Geologic Map Information using Open Geospatial Web Services.
[4] Streaming Transformations for XML (STX).
[5] mdwfs: A Concept of Web-enabling Semantic Transformation.

4.2 Language Transformation Specification

This section gives information on language transformation in HUMBOLDT. First, a clarification is given on what can be transformed and how. Then, the HUMBOLDT approach to the language transformation of queries and data sets is given in the form of specifications of the components that handle such transformations.

4.2.1 What can be transformed and how?

Within HUMBOLDT, the need for multilingualism / language transformation occurs in several different places. These are shortly described in the following.

4.2.1.1 Multilingual User Interfaces

First, the user interfaces of the HUMBOLDT components are required to be available in different languages. We assume that the translation of terms appearing in user interfaces (e.g. names of buttons etc.) is done manually. In this process, the Language Transformer might be used as a dictionary, but this is not its main purpose.

4.2.1.2 Data in multiple languages

Second, data / features are always expressed in a certain language. This involves the schema (i.e. the Feature Types, attribute names etc.) as well as attribute values. Fully automated language translation of schemas / Feature Types does not seem to be feasible (the terms appearing in schemas are often cryptic, e.g. based on the merging of different single terms according to the grammar of the source language), and language transformation of schema elements (Feature Type names, attribute names) is handled in the process of schema translation anyway. We therefore assume that the Language Transformer as described in this chapter will be used for the language transformation of attribute values only. However, only keywords / single terms are translated and not complete sentences.

4.2.1.3 Queries

As described in the HUMBOLDT Commons Specification, users are enabled to specify a set of keywords describing the data they are interested in, e.g. soil or water. The Language Transformer might be used for translating such keywords, enabling users to discover semantically relevant data sources even if the terms appearing in the metadata are in a different language.

4.2.2 Scenario Integration

The document Humboldt Scenario: HS Border Security System Specification states the following information on the harmonization aspect of language translation:

Multi-linguality: Yes, the BS agencies in Slovakia and Hungary use their systems in different languages.

Following this harmonization aspect of the Border Security scenario, the use case called UC03_01 Data exchange needs the language translation component so that the administrator is able to prepare the harmonized data to be used in case of intrusion.

4.2.3 Architecture

Figure 4 gives the component architecture for language transformation in HUMBOLDT.

Figure 4: Language Translation Architecture

The Language Transformer is the HUMBOLDT component responsible for transforming single terms, e.g. soil. Further, it offers administration functionality (e.g. the creation / deletion of translations). The component only transforms single terms and hence, in case multiple terms shall be translated at the same time, it must be invoked multiple times. The HUMBOLDT Language Transforming Processor is responsible for transforming data sets.

4.2.4 The HUMBOLDT Language Transforming Processor

As described above, the HUMBOLDT Language Transforming Processor is responsible for translating sets of terms or even data sets. It accomplishes this by calling (for each term to be translated) the HUMBOLDT Language Transformer. The Language Transforming Processor offers a WPS interface.

4.2.5 Automated Workflow Handling

To the HUMBOLDT Workflow Design and Construction Service, only the Language Transforming Processor is visible. It is applied in case a data set does not correspond to the language specified within the LanguageConstraint as part of the user context.

4.2.6 The HUMBOLDT Language Transformer

The HUMBOLDT Language Transformer offers the functionality of transforming terms from one natural language to another. The translation is based on two databases called LanguageTransformerDB and GemetDB.

LanguageTransformerDB is the main database of the application; users may perform CRUD operations (i.e. create, read, update, delete) on it. It was decided to use MS Access as the database management system, but it is possible to use any management system that is supported by NHibernate for the main database; it was verified that the application works correctly with MySQL and Oracle.

GemetDB is used only as a backup for retrieving translations: if a translation or domain is not found in the first database, it is searched for in the second. Only read capabilities are provided. Its structure and data are based on GEMET, and the GEMET data is placed in an MS Access database. In the first release, it is not possible to plug in other backup databases.

Further, the application is case sensitive, i.e. Term and term are two different terms. However, if a case-insensitive database system is used, they are treated as equal terms.

4.2.6.1 Language Transformer operations

Figure 5 shows the operations offered by the Language Transformer. An explanation is given in Table 1 below.

Figure 5: Language Transformer operations

DeleteTranslation: The method deletes the matching main term with all its children and every matching child.

GetAllTranslations: The method retrieves a collection of translations for all domains. At the moment it returns only the first term.

GetBestTranslation: The method works similarly to GetTranslation, but the domain name does not matter.

GetTranslation: The method retrieves the best translation in a given domain (the higher the number, the higher the ranking). The best term is chosen based on the following algorithm: firstly, a matching main term is searched for; if it is found it is returned, if not, matching child terms are searched for; if more than one is found, the best is returned based on their ranking; if nothing is found, the backup translation database is queried.

PutTranslation: The method works as both a create and an update method. If a term does not exist, a new record is created. It is important to state that the relations between terms are directional, i.e. if a word term (main term) and its Polish translation termin (child term) are already in the database, and the user puts the word termin (main term) with its English translation (child term), 4 records are in the database after the operation. If a term already exists in the database, only its ranking is updated.

Table 1: Language Transformer operations

4.2.6.2 Implementation

The Language Transformer additionally offers all operations via the OGC WPS interface. The Intergraph Geomedia SDI Pro Framework is used to provide the WPS functionality. The framework uses plugins to provide specific WPS operations; one such plugin is WPSPipeLanguageTransformerPlugin.dll. The core Language Transformer functionality is implemented in the LanguageTransformer.dll assembly. The WPSPipeLanguageTransformerPlugin.dll assembly provides the interface for the Geomedia SDI Pro Framework to use LanguageTransformer.dll. You can use WPSPipeLanguageTransformerPlugin.dll to directly test the LT functionality. For each operation, it supports HTTP GET as defined in the OGC WPS specification.

4.2.7 Interface Control Document for WPS Interface

The interface control document (WPS DescribeProcess response) can be found in chapter 6.
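As an illustration of the WPS binding, the following sketch assembles an OGC WPS 1.0.0 KVP Execute request for the GetTranslation operation. The endpoint and the input parameter names (term, sourceLanguage, targetLanguage, domain) are assumptions made for this example; the authoritative parameter names are those given in the DescribeProcess response reproduced in chapter 6.

```java
// Illustrative only: invoking GetTranslation via the OGC WPS 1.0.0 KVP (HTTP GET) binding.
// Endpoint and input parameter names are assumptions, not the documented interface.
public class LanguageTransformerRequestSketch {

    public static void main(String[] args) {
        String endpoint = "http://example.org/wps";  // hypothetical WPS endpoint
        String request = endpoint
                + "?service=WPS&version=1.0.0&request=Execute"
                + "&identifier=GetTranslation"
                + "&datainputs=term=soil;sourceLanguage=en;targetLanguage=de;domain=environment";
        System.out.println(request);
    }
}
```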

4.3 Multiple Representation Merger

The Multiple Representation Merger is capable of fusing features of data sets with a spatial overlap, such as along a common border where water bodies are part of both data sets. It contains an algorithm that uses the (vector) geometry of features to identify multiple representations of the same real-world feature. Based on this identification, the Features are merged, with one of the geometries being selected (the one with greater detail, i.e. point density) and all other attributes being copied. If no duplicates are found for a feature, it is always returned. It is important to know that the service assumes both data sets to use the same application schema, otherwise it won't work. This means that the execution of a schema transformation (see the section on the CST) on one (or both) of the inputs might be required beforehand. This is also depicted in the example for the Protected Areas scenario below.

4.3.1 Scenario Integration

As an example for the usage of the MRM, consider this example from the Protected Areas scenario. It fuses data sets which are thematically very close (Special Protected Areas and Protected Areas), as shown in Figure 6.

Figure 6: Processing chain constructed to fuse two data sets in Protected Areas

4.3.2 Requirements for the Component

1. Functional Requirements
   1. The SYSTEM must be able to work with Features and all their attributes. All input data sets must use the same application schema.
   2. The SYSTEM must be able to work with 2D Points, LineStrings, Polygons (no holes) and Surfaces (holes).
      1. The SYSTEM could work with 3D Features.
      2. The SYSTEM could work with MultiLineStrings and MultiPolygons.
   3. The SYSTEM must be able to merge Features from different data sets based on their geometric properties.
2. Interface Requirements
   1. The SYSTEM must provide threshold values to express the required likeness to the CLIENT.
      1. The SYSTEM should offer geometric overlap as a threshold value for identifying multiple representations.
      2. The SYSTEM should offer edge histogram overlap as a threshold value for identifying multiple representations.
   2. If there is more than one occurrence of an attribute (geometric or non-geometric) in the candidate and reference features, the attribute value with the highest confidence (quality closest to the specified requirements) should be copied into the output Feature's attribute.
      1. If there is no confidence information available on a Feature's attribute quality, the attribute in the reference data set must be selected.
      2. If there is no confidence information available and the reference Feature does not contain the attribute value, values should be combined. In the case of numerical values, the arithmetic average should be used; for textual values, the values should be concatenated together with information on where each part came from.
   1. The SYSTEM must be implemented as an OGC WPS.

   2. The SYSTEM must provide means to use one reference data set [1] as well as one to many candidate data sets [1..n].
   3. The SYSTEM must provide simplified means to configure the identification algorithms used, such as a single tolerance value.
   4. The SYSTEM must document which modifications it did to each Feature in the resulting data set.
3. I/O Requirements
   1. The SYSTEM must use the ISO extension for the Lineage defined in the Mediator Service Specification.
   2. The documentation should contain unique identifiers of the Features merged into the result Feature, and should also allow to identify which attribute came from which source.
   1. The SYSTEM must support GML as input and output schema.
      1. The SYSTEM should support GML 3.1 as input and output schema.
      2. The SYSTEM must support GML as input and output schema.
   2. The SYSTEM could support KML as input and output schema.
   3. The SYSTEM should support GMLPacket as input and output schema.
   4. The SYSTEM could support Shapefiles as input and output schema.
4. Technical Requirements
   1. The SYSTEM should use the Java Conflation Suite (JCS) for identification where possible.

4.3.3 Automated Workflow Handling

The MRM is applied in case data covering a certain bounding box is requested and only a spatial union of multiple data sets covers the area requested. In such cases, the MRM is applied if:

a.) All data sets are delivered in the same schema.
b.) All data sets are thematically equal / close (i.e. represent the same real-world type of object, e.g. protected areas).

a.) is a precondition of the MRM. b.) is obviously determined by a.): if two data sets have the same application schema, they are thematically equal / close. In case the data sets do not have the same schema, it is determined whether there is the possibility to translate them via the HUMBOLDT CST.

4.3.4 Implementation

The identification algorithms used are taken from the Java Conflation Suite (JCS).

4.3.5 Interface Control Document for WPS interface

The OGC WPS DescribeProcess document for this service can be found in chapter 6.
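The following sketch illustrates the kind of geometric-overlap test on which the identification of multiple representations (section 4.3.2) is based, here written against the JTS Topology Suite; the actual MRM uses the identification algorithms of the Java Conflation Suite, and the threshold value shown is an arbitrary example.

```java
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.io.ParseException;
import org.locationtech.jts.io.WKTReader;

// Illustrative geometric-overlap test for duplicate identification (not the MRM/JCS algorithm).
public class OverlapIdentificationSketch {

    public static void main(String[] args) throws ParseException {
        WKTReader reader = new WKTReader();
        Geometry reference = reader.read("POLYGON ((0 0, 10 0, 10 10, 0 10, 0 0))");
        Geometry candidate = reader.read("POLYGON ((0.2 0, 10.1 0.1, 10 10, 0 9.8, 0.2 0))");

        // overlap ratio: shared area relative to the smaller of the two geometries
        double shared = reference.intersection(candidate).getArea();
        double ratio = shared / Math.min(reference.getArea(), candidate.getArea());

        boolean sameRealWorldFeature = ratio > 0.95;  // example threshold
        System.out.println("overlap ratio = " + ratio + ", duplicate: " + sameRealWorldFeature);
    }
}
```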

4.4 Coordinate Transformation Service

The Coordinate Transformation Service is a WPS implementation of a service that allows coordinates to be transformed between various geographic reference systems, i.e. geoids and projections. The CTS has three core usage scenarios:

Input of a file that doesn't come from a service which provides CRS selection capability.
Requiring usage of a specific, non-standard algorithm with high precision, which cannot be reached by generic methods.
Usage of an exotic CRS that is not supported (or not well enough) by generic solutions.

4.4.1 Scenario Integration

In the Regione Ligure use case for protected areas management and in the tourism valorisation use case, the CTS-WPS is used to transform data from the local EPSG:3003 (a GK system) coordinates to the global EPSG:4326 (WGS84 geographic coordinates) system. For the Portuguese variants of these use cases, a transformation is required from EPSG:27492 to EPSG:

4.4.2 Requirements

1. Functional Requirements
   1. The SYSTEM must be able to transform any point-based geometry from one CRS to another CRS.
      1. The SYSTEM must support the following CRSes as valid input CRSes:
         1. For ERiskA: EPSG:31288, EPSG:31468, EPSG:31467, EPSG:
         2. For BorderSecurity: EPSG:2493, EPSG:2494, EPSG:28403, EPSG:28404, EPSG:32633, EPSG:
         3. For ProtectedAreas: EPSG:3003, EPSG:
      2. The SYSTEM must be able to transform from any of the input CRSes to EPSG:
      3. The SYSTEM must support transforming the following geometry types: Point, MultiPoint, LineString, MultiLineString, Polygon, MultiPolygon, all as defined in the Simple Features Profile of GML.
      4. The SYSTEM must provide facilities to the USER to select either standard processing methods or a specific algorithm.
         1. This could also be implemented by defining a transformation precision attribute.
   2. The SYSTEM could also perform z value transformations from one height reference network to another.
   3. It must be possible for a USER to extend the SYSTEM with additional transformation algorithms easily.

2. Interface Requirements
   1. The SYSTEM should support additional algorithms as JAR plugins.
   1. The SYSTEM must be implemented as an OGC WPS.
3. I/O Requirements
   1. The SYSTEM's capabilities must explicitly list the supported input and output CRSes.
   2. The SYSTEM should be able to handle single requests with points (which equals the size of the largest data sets in ProtectedAreas).
   3. The SYSTEM must be able to work with references to WFS queries.
   4. The SYSTEM must be able to work with references to WPS stored results.
   5. The SYSTEM must offer an optional parameter in the interface that allows the selection of a specific algorithm for transformation.
   1. The SYSTEM must support GML as input and output schema.
      1. The SYSTEM should support GML 3.1 as input and output schema.
      2. The SYSTEM must support GML as input and output schema.
   2. The SYSTEM could support KML as input and output schema.
   3. The SYSTEM could support Shapefiles as input and output schema.

4.4.3 Relationship to the INSPIRE requirements for coordinate transformation

The INSPIRE Technical Guidance on Coordinate Transformation specifies how a coordinate transformation service should be implemented in order to be INSPIRE compatible. Since the INSPIRE TG on CRS Transformation was not available at the time the HUMBOLDT Coordinate Transformation Service was developed, this HUMBOLDT service differs from these requirements. This section gives some more details on these differences.

The main purpose of the INSPIRE TG on coordinate transformation is to give a so-called WPS Application Profile for coordinate transformation. The profile consists of an OGC URN that uniquely identifies the process (i.e. urn:ogc:wps:1.0.0:inspire:transformcoordinates:1.0) as well as a reference DescribeProcess response. Each individual coordinate reference system transformation process aiming at being INSPIRE compatible must reference the URN and implement / conform to the reference DescribeProcess response. What is fixed within that INSPIRE document / within the process description are in principle the following names of the inputs and outputs:

Names of input parameters: SourceCRS, TargetCRS, InputData
Name of the output parameter: TransformedData

Every service conforming to the INSPIRE TG on coordinate transformation must conform to these names for the input/output parameters. This enables generic clients that "know" the profile to dynamically bind to a specific INSPIRE-compatible CRS Transformation Service by e.g. specifying: instantiate the input named "SourceCRS" with the value "EPSG:31467". However, as described above, the HUMBOLDT CRS Transformation Service was developed before the INSPIRE TG on CRS Transformation was available and hence does not conform to what is specified there. For HUMBOLDT, this is not a problem since we have our own process descriptions anyway.

However, it is an easy task to adapt the service to the INSPIRE requirements mentioned above, if necessary. Further, the relevant document specifies that the transformation operation should have an additional boolean parameter called TestTransformation, indicating whether the transformation should actually be executed (false) or only tested (true). In the case of testing, the user just receives a boolean yes/no answer on whether the actual transformation with the same instantiation of parameters would execute successfully. Since within HUMBOLDT there has been no requirement for this testing functionality, it has not been implemented.

4.4.4 Automatic Workflow handling

A need for using this WPS is identified by comparing the target SRS (as part of the SpatialConstraint of the given context) to the SRS used in the given source data. The source SRS can be identified either directly from the data or from its schema. The CTS must be executed before the Multiple Representation Merger and the Edge Matching Service.

4.4.5 Implementation

The transformation is based on the Geoid and Projection Parameters database of the EPSG, and on the generic transformation algorithm provided by GeoTools 2.4. Any more specific algorithms can be plugged in on demand. As a side note: we chose not to implement the OGC Web Coordinate Transformation Service (WCTS) specification so that we would only have to deal with one type of interface in both the Workflow and the Mediator Service. Of course, a WCTS can easily be "hidden" behind a WPS facade.

4.4.6 Interface Control Document for WPS interface

The OGC WPS DescribeProcess document for this service can be found in chapter 6.
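For illustration, the following sketch reprojects a single point geometry from EPSG:3003 to EPSG:4326 using the generic GeoTools transformation path the CTS builds on. Package names follow recent GeoTools releases and differ from the GeoTools 2.4 used at the time of writing; the coordinate values are arbitrary, and the EPSG database (e.g. the gt-epsg-hsql module) must be on the classpath.

```java
import org.geotools.geometry.jts.JTS;
import org.geotools.referencing.CRS;
import org.locationtech.jts.geom.Coordinate;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.geom.GeometryFactory;
import org.opengis.referencing.crs.CoordinateReferenceSystem;
import org.opengis.referencing.operation.MathTransform;

// Illustrative reprojection of a point from EPSG:3003 to EPSG:4326 with GeoTools.
public class CoordinateTransformationSketch {

    public static void main(String[] args) throws Exception {
        CoordinateReferenceSystem source = CRS.decode("EPSG:3003");
        CoordinateReferenceSystem target = CRS.decode("EPSG:4326");

        // lenient = true tolerates missing datum shift parameters
        MathTransform transform = CRS.findMathTransform(source, target, true);

        Geometry point = new GeometryFactory().createPoint(new Coordinate(1489000, 4920000));
        Geometry reprojected = JTS.transform(point, transform);
        System.out.println(reprojected);
    }
}
```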

4.5 The HUMBOLDT Edge Matching Service

The Edge Matching Service (EMS) is a WPS implementation of a service that aligns edges and points of vector geometries so that they will be gapless. It has two modes of operation:

Align-to-Reference: In this case, all candidate data sets will be transformed so that points are moved, up to a maximum distance also provided as input. As an example, consider a data set with high-quality country borders that a set of regional borders has to use as reference. Another example would be river networks of different scale, where the higher-scale data can be used as reference.

Distribute-Errors: In this case, there is no reference data set that can be used as "ground truth", therefore all geometries will be transformed. Again, no point or edge will be moved further than a client-specified amount. This approach is useful for data sets of even quality and legal bindingness.

For each of the two modes of operation, the EMS has three core usage scenarios:

A data set should be without gaps, but there are small gaps in between the individual features that need to be filled.
Two data sets should have identical geometry over a shared feature (such as a common administrative border), but the geometry varies.
Two data sets have identical geometry for a shared feature, but the geometry is translated from the position where it should be located.

4.5.1 Scenario Integration

In the Protected Areas scenario, different data sets describing administrative boundaries and protected areas zoning are available that often share borders. An example for this is the border of the Ligure area, which should be matched to the geometry of a river and also to the geometry of a protected site border. However, since all these data sets have been collected in different ways and have been generalized to different scales, they don't match up, as can be seen in the pictures below.

Figure 7: Administrative Boundaries Italy 1:25.000, Ligure 1:5.000 and Rivers 1:

Figure 8: Administrative Boundaries Ligure 1:5.000 and Special Protected Areas Zoning

Figure 9: Administrative Boundaries Italy 1: from two sources, with a visible translation because of different reference system transformations

4.5.2 Requirements for the Component

1. Functional Requirements
   1. The SYSTEM must be able to work with 2D LineStrings, Polygons (no holes) and Surfaces (holes).
   2. A provided reference geometry must not be modified by the SYSTEM.
   3. The SYSTEM must be capable of modifying a candidate data set with Polygons in such a way that the result represents a complete coverage, without gaps or overlaps (usage scenario 1).
   4. The SYSTEM must be capable of modifying a geometry in such a way that parts it shares with a reference geometry are fully matched (usage scenario 2).
      1. The SYSTEM must be able to identify parts of a candidate geometry that should be shared with parts of a reference geometry.
         1. It should be possible for the CLIENT to provide a maximum tolerance value for the matching distance.
   5. The SYSTEM must be capable of identifying and rectifying a relative translation of a complete geometry (usage scenario 3).
2. Interface Requirements
   1. The SYSTEM must be implemented as an OGC WPS.
   2. The SYSTEM must provide means to use an optional reference data set [0..1] as well as one to many candidate data sets [1..n].
      1. If no reference data set is defined and exactly 1 candidate data set is provided, the SYSTEM should eliminate any gaps in that data set (usage scenario 1).
   3. It must be possible to specify the maximum distance that a point or a segment may be moved from its original position.
   4. The SYSTEM must document which modifications it did to each Feature in the resulting data set.

3. I/O Requirements
   1. The SYSTEM must use the ISO extension for the Lineage defined in the Mediator Service Specification.
   2. The documentation must contain the maximum distance a point was moved as well as the average distance the points of the geometry were moved.
   1. The SYSTEM must support GML as input and output schema.
      1. The SYSTEM should support GML 3.1 as input and output schema.
      2. The SYSTEM must support GML as input and output schema.
   2. The SYSTEM could support KML as input and output schema.
   3. The SYSTEM should support GMLPacket as input and output schema.
   4. The SYSTEM could support Shapefiles as input and output schema.
4. Technical Requirements
   1. The SYSTEM should use the Java Conflation Suite (JCS) for MREP identification where possible.

4.5.3 Automatic Workflow handling

A need for using this WPS is identified in the following scenarios:

Two (or more) data sets have been merged using the Multiple Representation Merger (MRM) (not expressed in a Constraint, but known to the WDCS). In this case, the second usage scenario applies, and possibly also the third one.
One data set needs to be gapless for a subsequent step in processing (this could be expressed as a QualityConstraint stating that no gaps and overlaps are allowed). In this case, the first usage scenario described above applies.

The EMS must be executed after the Coordinate Transformation Service and the Multiple Representation Merger (MRM). Since a prerequisite to MRM execution is Conceptual Schema Translation, that step must also have happened before.

4.5.4 Interface Control Document

The OGC WPS DescribeProcess document for this service can be found in chapter 6.
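The following sketch illustrates the basic Align-to-Reference idea in isolation: each vertex of a candidate geometry is snapped to the nearest reference vertex if it lies within a client-specified maximum distance. This is a simplification for illustration only; the actual EMS matching (JCS based) also handles shared edges, topology of dependent data sets and error distribution.

```java
import org.locationtech.jts.geom.Coordinate;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.geom.GeometryFactory;
import org.locationtech.jts.io.WKTReader;

// Simplified Align-to-Reference vertex snapping (illustration only, not the EMS algorithm).
public class EdgeMatchingSketch {

    // Snap every candidate vertex to the nearest reference vertex within maxDistance.
    static Coordinate[] snapToReference(Coordinate[] candidate, Coordinate[] reference, double maxDistance) {
        Coordinate[] snapped = new Coordinate[candidate.length];
        for (int i = 0; i < candidate.length; i++) {
            Coordinate best = candidate[i];
            double bestDistance = maxDistance;
            for (Coordinate r : reference) {
                double d = candidate[i].distance(r);
                if (d <= bestDistance) {
                    bestDistance = d;
                    best = r;
                }
            }
            snapped[i] = new Coordinate(best);   // copy, so the reference geometry stays untouched
        }
        return snapped;
    }

    public static void main(String[] args) throws Exception {
        WKTReader reader = new WKTReader();
        Geometry reference = reader.read("LINESTRING (0 0, 5 0, 10 0)");
        Geometry candidate = reader.read("LINESTRING (0 0.3, 5 -0.2, 10 0.1)");

        Coordinate[] aligned = snapToReference(candidate.getCoordinates(), reference.getCoordinates(), 0.5);
        System.out.println(new GeometryFactory().createLineString(aligned));
    }
}
```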

5 Technology Viewpoint

This chapter gives information on how new processing components need to be implemented in order to be handled by the framework.

5.1 Guidelines for Process Implementation

There are three main approaches to provide a particular processing functionality to the HUMBOLDT framework such that it can be handled and executed by the framework. They are:

- Direct implementation of the Transformer interface on the platform for which the Mediator Service to be used has been implemented. In the case of the reference implementation, this is the Java platform. This approach is especially suitable for specific tasks where it is for some reason not convenient to transfer data through a web service.
- WPS: The OGC Web Processing Service is gaining traction as an interface for the description and execution of transformation capabilities. It provides Web Service advantages like loose coupling and is considered the preferred way to implement particular processes in HUMBOLDT.
- WS-*/SOAP: Processing components might also offer their functionality via a WSDL and SOAP binding. However, we focus on the OGC WPS interface in this document.

Direct implementation on the Mediator platform

The direct implementation of a processing component on the Mediator platform requires the implementation of the generic ExecutableTransformer interface.

Figure 10: The Transformer interface structure

Figure 10 shows the generic Transformer and ExecutableTransformer interfaces. Since most of the operations to be implemented are generic and not specific to the individual processing component, an abstract implementation of ExecutableTransformer is provided as part of the framework, i.e. AbstractExecutableTransformer. Therefore, implementing a new transformation in HUMBOLDT only requires extending AbstractExecutableTransformer and implementing the execute() operation. The execute() operation then holds the actual transformation algorithm. The execute() operation does not take any arguments; its inputs are stored in parameters (a mapping of parameter names to values) and can be read from there inside execute(). A reference to the result shall be stored in result after the execution has completed successfully. All other operations and parameters appearing in the interfaces and the abstract class AbstractExecutableTransformer are only relevant for execution by the framework.

Registration to the Mediator Service

Once the new Transformer is implemented and compiled, it needs to be registered with the HUMBOLDT Mediator Service (MS). The MS provides a plugin mechanism that only requires the Transformer to be packaged in a JAR and that JAR to be placed in the plugins directory. The MS will then scan this directory at …
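Putting the implementation and registration steps together, a direct implementation could look like the minimal sketch below. This is a hypothetical sketch, not framework code: the stub AbstractExecutableTransformer only mirrors the prose description above (a parameters map and a result field), the real class's signatures may differ, and the transformer name and parameter key are invented for illustration.

import java.util.HashMap;
import java.util.Map;

// Stand-in for the framework class described above (assumption: it exposes a
// 'parameters' map filled by the framework and a 'result' field read by it).
abstract class AbstractExecutableTransformer {
    protected Map<String, Object> parameters = new HashMap<>(); // parameter name -> value
    protected Object result;                                    // reference to the transformation result
    public abstract void execute();
}

// Hypothetical transformer: reads its inputs from 'parameters', runs the
// transformation algorithm in execute(), and stores a reference to the
// output in 'result'.
public class UppercaseNameTransformer extends AbstractExecutableTransformer {

    @Override
    public void execute() {
        // Read the inputs the framework has placed into 'parameters'.
        String featureName = (String) parameters.get("featureName"); // invented parameter key

        // The actual transformation algorithm goes here.
        String transformed = (featureName == null) ? null : featureName.toUpperCase();

        // Store a reference to the result for the framework to pick up.
        result = transformed;
    }
}

Compiled and packaged into a JAR, such a class would then be dropped into the plugin directory described above so that the Mediator Service can pick it up when it scans that directory.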
