SDSFIE Quality (SDSFIE-Q)


Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE)

SDSFIE Quality (SDSFIE-Q)

12 December 2016

Prepared By: The Installation Geospatial Information and Services Governance Group (IGG)

For: The Assistant Secretary of Defense (Energy, Installations & Environment)

2016

Executive Summary

The Assistant Secretary of Defense for Energy, Installations, and Environment (ASD(EI&E)) has issued this Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Quality (SDSFIE-Q) standard as part of the governance of installation geospatial information and services (IGI&S) under the authority granted in DoDI . The IGI&S Governance Group (IGG) developed this standard in order to foster coordinated and integrated approaches for IGI&S across the Department. SDSFIE-Q specifies over-arching guidance for how DoD will implement a tiered approach to quality, including data quality processes, measures, and metrics for vector, raster, and geospatial services developed and used by the IGI&S community. This standard does not include a required schema or data model, and therefore is not registered in the DoD IT Standards Registry (DISR) as a traditional IT standard would be. Instead, SDSFIE-Q is modeled after the ISO conceptual model of quality for geographic data (ISO 19157 is a normative reference for several DISR-mandated geospatial standards which are related to the SDSFIE family of standards). SDSFIE-Q defines an IGI&S data quality framework with enterprise-level and Component-level elements. When implemented together, this model for data quality will ensure IGI&S data and services align with IGI&S mission requirements, conform to DISR mandates, and are understandable, trusted, and interoperable in accordance with DoDI . SDSFIE-Q includes a formal process for defining measures to assess the quality and completeness of data, as well as the reporting of metrics associated with these data quality measures using the SDSFIE Metadata (SDSFIE-M) standard and other applicable means. Adherence to the data quality guidance and processes in this document is required any time IGI&S data or services are created, updated, maintained, or shared.
SDSFIE-Q also includes certain specific data quality processes, measures, and metrics which apply individually to IGI&S vector data (SDSFIE-V), raster data (SDSFIE-R), or services (SDSFIE-S), as the case may be.

Revision History

Description | Date | Version
Initial Draft | 5 Mar 2015 | DRAFT
Draft | 14 Oct 2015 | DRAFT
Draft | 17 Dec 2015 | DRAFT
Draft | 19 Jan 2016 | DRAFT
Draft | 21 March 2016 | DRAFT
Section 1.7: Added Terms and Definitions. Updated Table 1 to read Metadata Codes. Section 3.1: Updated ADS Manager role definition. Section 3.5.2: Updated methods for determining discoverability metrics. Section 5.1: Updated DCS General Guidance section. Section 5.2: Moved DCS Minimum Content section to DCS General Guidance. Appendix B: Updated IGI&S Metrics table. Appendix C: Added Metadata Examples for repository and aggregate. | 16 May 2016 | DRAFT
Section 1.4: Added references to the DoD CIO Memo and DoDI . Section 3.2.3: Removed services from DCS requirement. Section : Added section for SDSFIE-Q Metrics Registry. Appendix B: Added descriptions for each field in the metrics table. | 22 Jun 2016 | FINAL DRAFT (DQWG)
Section 3.2.3: Updated to reference other terminology for DCSs. Section : Expanded SDSFIE Quality Contract Language section and included reference for Annex C: SDSFIE Quality Contract Language Examples (TBD). Section 3.4.2: Updated to include language to refer to Component guidance regarding use of automated tools. | 16 Aug 2016 | FINAL DRAFT (IGG)

Table of Contents

Executive Summary
Part 1: Overview of SDSFIE-Q
1 Introduction
    Purpose
    Authority
    Scope
    References
    Acronyms
    Document Maintenance
    Terms and Definitions
2 ISO Components of a Quality Framework
    ISO Data Quality Evaluation Process
        Data Quality Units
        Measures
        Evaluation Methods
        Results
        Metaquality
    ISO Data Quality Reporting
3 IGI&S Quality Management Framework
    Defined Roles
    Quality Framework Elements: Quality Management Tools
        Authoritative Data Source (ADS)
        Quality Management Plans
        Data Content Specification (DCS)
        SDSFIE Quality Contract Language
    Quality Framework Elements: Data Generation
        Create and Acquire Data
        Maintain Data
        Create Metadata
    Quality Framework Elements: Data Evaluation
        Quality Metrics
        Evaluate Data Quality
        Report Quality Results - Metadata
    Quality Framework Elements: End Use
        Releasability
        Discoverability
        Feedback
Part 2: SDSFIE-V Data Quality Guidance
    SDSFIE-V Introduction
        IGI&S Vector Datasets
        Common Installation Picture (CIP) Dataset
    SDSFIE-V Data Content Specifications (DCS)
        DCS General Guidance
        DCS Validation
    SDSFIE-V Quality Metrics
    SDSFIE-V Data Quality Reporting: SDSFIE Metadata
        CIP Metadata Requirements
Part 3: SDSFIE-R Data Quality Guidance
    SDSFIE-R Introduction
    SDSFIE-R Data Quality Documentation: SDSFIE Metadata
Part 4: SDSFIE-S Data Quality Guidance
    SDSFIE-S Introduction
    SDSFIE-S Data Quality Documentation: SDSFIE Metadata
Appendix A: ISO Data Quality Standard Measures
Appendix B: IGG Data Quality Metrics
Appendix C: Metadata Examples
Annex A: Data Content Specification (DCS) General Guidance
Annex B: Data Content Specification (DCS) Example
Annex C: SDSFIE Quality Contract Language Examples

Table of Figures

Figure 1: ISO Components of a Quality Framework
Figure 2: ISO Data Evaluation Process
Figure 3: Data Quality Units
Figure 4: IGI&S Quality Management Framework
Figure 5: Quality Measures vs. Quality Metrics

Table of Tables

Table 1: Metadata Codes for Data Quality Scope (from ISO 19115)
Table 2: Data Quality RACI Chart
Table 3: Data Quality Scope Codes for IGI&S Data
Table 4: ISO Data Quality Standard Measures
Table 5: IGG Data Quality Metrics
Table 6: Metadata Example - Aggregate Scope
Table 7: Metadata Example - Repository Scope

Part 1: Overview of SDSFIE-Q

1 Introduction

The Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) are a family of IT standards (models, specifications) which define a Department of Defense (DoD)-wide set of semantics intended to maximize interoperability of geospatial information and services for installation, environment, and civil works missions. The SDSFIE family consists of seven parts, defined in the SDSFIE Governance Plan v1.0, 13 May 2014. The SDSFIE Quality standard (SDSFIE-Q) is one of the seven parts of SDSFIE. It specifies over-arching guidance for how DoD will implement a tiered approach to quality, including data quality processes, measures, and metrics for vector, raster, and geospatial services developed and used by the installation geospatial information and services (IGI&S) community as defined in DoDI . SDSFIE-Q does not include a required schema or data model, and therefore is not registered in the DoD IT Standards Registry (DISR) as a traditional IT standard would be. Instead, SDSFIE-Q is modeled after the ISO conceptual model of quality for geographic data (ISO 19157 is a normative reference for several DISR-mandated geospatial standards which are related to the SDSFIE family of standards 1). SDSFIE-Q defines an IGI&S data quality framework with enterprise-level and Component-level elements. When implemented together, this model for data quality will ensure IGI&S data and services align with IGI&S mission requirements, conform to DISR mandates, and are understandable, trusted, and interoperable in accordance with DoDI . SDSFIE-Q includes a formal process for defining measures to assess the quality and completeness of data, as well as the reporting of metrics associated with these data quality measures using the SDSFIE Metadata (SDSFIE-M) standard and other applicable means.
Adherence to the data quality guidance and processes in this document is required any time IGI&S data or services are created, updated, maintained, or shared. SDSFIE-Q also includes data quality processes, measures, and metrics which specifically apply to IGI&S vector data (SDSFIE-V), raster data (SDSFIE-R), and services (SDSFIE-S). SDSFIE-Q is designed to align with all data standards established or accepted for use by the IGI&S community.

This document is organized into four parts:

Part 1: Quality guidance for all IGI&S data and services. Part 1 consists of two main concepts:
    An overview of ISO foundational concepts that shape SDSFIE-Q (section 2)
    The SDSFIE-Q IGI&S Quality Management Framework (section 3)
Part 2: Quality guidance specific to vector data - SDSFIE-V (sections 4-7)
Part 3: Quality guidance specific to raster data - SDSFIE-R (sections 8-9)
Part 4: Quality guidance specific to services - SDSFIE-S (sections 10-12)

Purpose

The purpose of this document is to establish over-arching guidance for how DoD will implement a tiered approach to quality, including data quality processes, measures, and metrics for vector, raster, and geospatial services. This guidance is designed to ensure the IGI&S community provides trusted,

1 The NMF/NMIS 2.x form the basis of SDSFIE-M, and the quality model used by these standards is an implementation of the concepts in ISO 19157. Thus the ISO conceptual model is used as a basis for SDSFIE-Q in order to align with these concepts and schemas in NMF/NMIS and SDSFIE-M.

authoritative, and accurate geospatial data and services to the DoD for decision making. Implementing SDSFIE-Q is necessary to meet the requirements of DoDI , which states that data will be made visible, accessible, understandable, trusted, and interoperable throughout their lifecycle for all authorized users.

Authority

In accordance with DoDI , this standard applies to the management of DoD installations and environment to support military readiness in the Active, Guard, and Reserve Components with regard to facility construction, sustainment, and modernization, including the operation and sustainment of military test and training ranges, as well as the US Army Corps of Engineers Civil Works community. Its applicability for other interested organizations is suggested but not mandatory. DoDI grants authority to the ASD(EI&E) to develop, manage, and publish IGI&S standards and requires that these standards be coordinated through the Geospatial Intelligence Standards Working Group (GWG). SDSFIE-Q is conformant with ISO 19157, the current spatial data quality standard referenced by other geospatial data standards mandated for DoD use by the GWG and the Joint Enterprise Standards Committee (JESC). In accordance with the procedures in DoDI , the ASD(EI&E) develops new or revised standards based upon input from the IGI&S Governance Group (IGG), the official standards consensus body for IGI&S. While the IGG is the de facto author of SDSFIE-Q, it becomes mandatory for the IGI&S community once it is formally issued by the ASD(EI&E).

Scope

This standard is intended to be enterprise-level guidance, consisting of a quality management framework which includes elements to be implemented at the Department (enterprise) level, the DoD Component level, and organizational levels below these as directed by the next highest level. This framework includes a set of processes, quality measures, and corresponding metrics (e.g., minimum standards) for spatial data quality. These are intended to guide and constrain the spatial data quality management policy, guidance, and plans issued by the Components' IGI&S Programs, but not necessarily specify the means by which to achieve them. This plan describes compliance with documented standards, database referential integrity, and data quality reporting elements. It outlines management processes to define and document clear metrics and guidance for data quality. This document describes processes needed to ensure quality data management or data collection and maintenance at a Department-wide level. It identifies the important elements that should be covered in IGI&S Program-level quality management plans (or guidance). These elements can also be used as measures to evaluate the completeness of the Components' guidance and conformance to the framework prescribed herein.

References

ASD(EI&E), Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Metadata (SDSFIE-M): Implementation Guidance, Version 1.0, 8 September 2015
DoD Chief Information Officer Memorandum, Department of Defense Information Enterprise Architecture, Version 2.0, August 10, 2012
DoD Instruction , Installation Geospatial Information and Services (IGI&S), 9 April 2015
DoD Instruction , Sharing Data, Information, and Information Technology (IT) Services in the Department of Defense, 5 August 2013
DoD Instruction , Unique Identification (UID) Standards for Support of DoD Net-Centric Operations, 4 November 2015
DoD Directive , National Geospatial-Intelligence Agency (NGA), 29 July 2009
ISO , Geographic information -- Metadata -- Part 1: Fundamentals
ISO 19157:2013, Geographic information -- Data quality

DUSD(I&E), Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Governance Plan, Version 1.0, 13 May 2014
DUSD(I&E), Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Metadata (SDSFIE-M): Conceptual Schema, Version 1.0.2, 28 August 2014

Acronyms

ADS - Authoritative Data Source
ASD(EI&E) - Assistant Secretary of Defense for Energy, Installations, and Environment
CIP - Common Installation Picture
CIO - Chief Information Officer
DCG - Data Content Guidance
DCS - Data Content Specification
DISDI - Defense Installations Spatial Data Infrastructure
DISR - DoD IT Standards Registry
DoD - Department of Defense
DoDM - DoD Manual
DLS - Data Layer Specification
DoDD - Department of Defense Directive
DoDI - Department of Defense Instruction
DUSD(I&E) - Deputy Under Secretary of Defense for Installations and Environment
EI&E - Energy, Installations, and Environment
GEOINT - Geospatial Intelligence
GIO - Geospatial Information Officer
GWG - Geospatial Intelligence Standards Working Group
HQ - Headquarters
I&E - Installations and Environment
IGG - IGI&S Governance Group
IGI&S - Installation Geospatial Information and Services
ISO - International Organization for Standardization
JESC - Joint Enterprise Standards Committee
NGA - National Geospatial-Intelligence Agency
NMF - National System for Geospatial-Intelligence (NSG) Metadata Foundation
NMIS - NSG Metadata Implementation Specification
NSG - National System for Geospatial-Intelligence
OSD - Office of the Secretary of Defense
QAP - Quality Assurance Plan
RACI - Responsible, Approval, Consulted, Informed
RAM - Responsibility Assignment Matrix
RPAD - Real Property Asset Database
RPIR - Real Property Inventory Requirements
SDSFIE - Spatial Data Standards for Facilities, Infrastructure, and Environment
SDSFIE-M - Spatial Data Standards for Facilities, Infrastructure, and Environment - Metadata
SDSFIE-Q - Spatial Data Standards for Facilities, Infrastructure, and Environment - Quality
SDSFIE-R - Spatial Data Standards for Facilities, Infrastructure, and Environment - Raster
SDSFIE-S - Spatial Data Standards for Facilities, Infrastructure, and Environment - Services

SDSFIE-V - Spatial Data Standards for Facilities, Infrastructure, and Environment - Vector
SMIS - SDSFIE Metadata Implementation Specification

Document Maintenance

This document will be reviewed and updated as needed, with each action documented in a revision history log. When changes occur, the version number will be updated to the next increment, and the date, owner making the change, and change description will be recorded in the revision history log of the document.

Terms and Definitions

Accessible - Data and services can be accessed via the global information grid (GIG) by users and applications in the enterprise. Data and services are made available to any user or application except where limited by law, policy, security classification, or operational necessity. [DoD CIO Memorandum]

Accuracy - Closeness of agreement between a test result or measurement result and the true value. [ISO :2006]

Aggregate - A data quality scope used to evaluate metrics applied to a dataset that is aggregated from other datasets. Note: Data quality for OSD and Component aggregated CIP datasets will be evaluated using the aggregate scope.

Authoritative Data Source - A recognized or official data source with a designated mission statement, source, or product to publish reliable and accurate data for subsequent use by customers. An authoritative data source may be the functional combination of multiple separate data sources. [DoDD ]

Common Installation Picture - The distinct minimum set of geospatial features and imagery necessary to provide a foundational map depicting DoD installations and sites. The purpose of the CIP is to provide a readily available, standardized map background to serve as the basis for planning and execution of EI&E responsibilities and functions. [DoDI ]

Conformance - Fulfillment of specified requirements. [ISO 19105]

Conformance Quality Level - Threshold value or set of threshold values for data quality results used to determine how well a dataset meets the criteria set forth in its DCS or user requirements. [ISO 19157]

Correctness - Correspondence with the universe of discourse. [ISO 19157]

Data Content Specification - Detailed description of a dataset or dataset series together with additional information that will enable it to be created, supplied to and used by another party. [ISO 19131] Note: The above definition is for a data product specification. In SDSFIE-Q, data product specifications are called data content specifications.

Data Quality Basic Measure - Generic data quality measure used as a basis for the creation of specific data quality measures. [ISO 19157] Note: Data quality basic measures are abstract data types. They cannot be used directly when reporting data quality.

Data Quality Element - A component describing a certain aspect of the quality of geographic data.

Data Quality Result - The output of a data quality evaluation.

Data Quality Scope - Specifies the extent, spatial and/or temporal, and/or common characteristic(s) that identify the data on which data quality is to be evaluated.

Data Quality Unit - The combination of a scope and one or more data quality elements.

Dataset - Identifiable collection of data. [ISO 19115]

Data Steward - An organization within an authoritative source that is charged with the collection and maintenance of authoritative data.

Direct Evaluation Method - Method of evaluating the quality of a dataset based on inspection of the items within the dataset. [ISO 19157]

Feature - Abstraction of real world phenomena. [ISO 19101]

Feature Attribute - Characteristic of a feature. [ISO 19101]

Feature Type - Class of features having common characteristics. [ISO 19156]

IGI&S - The subset of GI&S activities that apply to the management of DoD installations and environment to support military readiness in the Active, Guard, and Reserve Components with regard to facility construction, sustainment, and modernization, including the operation and sustainment of military test and training ranges, and which support DoD business enterprise priorities as defined in the DoD business enterprise architecture (BEA). IGI&S supports and is enabled by geospatial engineering and general engineering as defined in Joint Publication . [DoDI ]

Interoperability - The ability of systems, units, or forces to provide data, information, materiel, and services to and accept the same from other systems, units, or forces and to use the data, information, materiel, and services so exchanged to enable them to operate effectively together. IT and National Security Systems interoperability includes both the technical exchange of information and the end-to-end operational effectiveness of that exchange of information as required for mission accomplishment. More than just

information exchange, it includes systems, processes, procedures, organizations, and missions over the lifecycle and must be balanced with information assurance. [DoDI ]

Measure - A description of the type of evaluation being assessed to determine a data quality result.

Metric - A specific, measurable indicator that addresses data quality compliance. Note: Metrics are created when measures are assigned a scope.

Metadata - Information describing the characteristics of data; data or information about data; or descriptive information about an entity's data, data activities, systems, and holdings. For example, discovery metadata allows data assets to be found using enterprise search capabilities. Metadata can be structural (specifying the format structure), semantic (specifying the meaning), or descriptive (providing amplifying or interpretive information) for data, information, or IT services. [DoDI ]

Metaquality - Information describing the quality of data quality. [ISO 19157]

Quality - Degree to which a set of inherent characteristics fulfills requirements. [ISO 19157]

Repository - A data quality scope used to evaluate metrics applied to a data repository. Note: Data quality evaluations for an ADS as a whole will use the repository scope.

Standalone Quality Report - Free text document providing fully detailed information about data quality evaluations, results, and measures used. [ISO 19157]

Trusted - Users and applications can determine and assess the suitability of the source because the pedigree, security level, and access control level of each data asset or service is known and available. [DoD CIO Memorandum]

Understandable - Users and applications can comprehend the data, both structurally and semantically, and readily determine how the data may be used for their specific needs. [DoD CIO Memorandum]

Visible - The property of being discoverable. All data assets (intelligence, non-intelligence, raw, and processed) are advertised or made visible by providing metadata, which describes the asset. [DoD CIO Memorandum]

2 ISO Components of a Quality Framework

The data quality concepts defined in ISO 19157:2013, Geographic Information: Data Quality, provide the foundation for the structure of SDSFIE-Q. SDSFIE-Q conforms to the ISO conceptual model of quality for geographic data. This model defines components for measuring, evaluating, and reporting data quality based on the ISO data quality elements. Figure 1 below, from ISO 19157 (Figure 1), provides an overview of these components. This section describes each ISO quality framework component, providing the background necessary to understand the IGI&S Quality Management Framework introduced in section 3.

Figure 1: ISO Components of a Quality Framework

ISO Data Quality Evaluation Process

As shown in Figure 2 (ISO Figure 12) below, the data quality evaluation process consists of establishing data quality units, defining measures, determining the proper evaluation method, and reporting the results of the evaluation in metadata. Data quality should be evaluated when data is created or updated, to determine compliance with a data content specification or to determine compliance with user requirements.
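The evaluation flow described above (establish units, define measures, evaluate, report) can be sketched as a simple pipeline. This is an illustrative sketch only: ISO 19157 defines concepts, not an API, so every function and field name below is a hypothetical placeholder, and the "rate of missing items" measure is one of the example measures listed later in this document.

```python
# Hypothetical sketch of the ISO 19157 evaluation flow: apply one measure to
# one data quality unit, then report the result. Names are illustrative only.

def evaluate_data_quality(records, unit, measure, evaluate, report):
    """Evaluate a single measure for a data quality unit and report the result."""
    result = evaluate(records, measure)   # perform the evaluation (Step 4)
    return report(unit, measure, result)  # record the result for metadata (Step 5)

def missing_rate(recs, attr):
    # Share of records whose mandatory attribute is absent (an omission-style measure).
    return sum(r[attr] is None for r in recs) / len(recs)

# Toy dataset: one of three records is missing its mandatory attribute.
records = [{"name": "Bldg 1"}, {"name": None}, {"name": "Bldg 3"}]

result = evaluate_data_quality(
    records,
    unit={"scope": "dataset", "element": "completeness/omission"},
    measure="rate of missing items",
    evaluate=lambda recs, m: missing_rate(recs, "name"),
    report=lambda u, m, r: {"unit": u, "measure": m, "value": r},
)
print(round(result["value"], 3))  # 0.333
```

In practice the evaluation step would be a full inspection, a sampling procedure, or an aggregation of prior results, as described under Evaluation Methods below in this part.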

Figure 2: ISO Data Evaluation Process (ISO Figure 12)

Data Quality Units

In accordance with ISO 19157, there are two aspects of data quality: a scope and a defined data quality element. The scope and data quality element pair is referred to as the data quality unit. A data quality unit must be defined before determining measures and evaluation methods (see Figure 2, Step 1). Figure 3 below provides examples of data quality units.

Figure 3: Data Quality Units
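The scope-plus-element pairing can be modeled as a small data structure. The classes below are an illustrative sketch, not types from ISO 19157 or SDSFIE-Q; the scope level and element strings echo the scope codes and element categories discussed in the following sections.

```python
from dataclasses import dataclass

# Hypothetical types illustrating the ISO 19157 pairing of a scope with a
# data quality element; these class and field names are not from the standard.

@dataclass(frozen=True)
class DataQualityScope:
    level: str             # e.g. "dataset", "featureType", "repository", "aggregate"
    description: str = ""  # optional extent or subset characteristics

@dataclass(frozen=True)
class DataQualityUnit:
    scope: DataQualityScope
    element: str           # e.g. "completeness/omission", "logical consistency/domain"

# Example unit: evaluate omission for a single feature type.
unit = DataQualityUnit(
    scope=DataQualityScope(level="featureType", description="Building features only"),
    element="completeness/omission",
)
print(unit.element)  # completeness/omission
```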

Data Quality Scope

The data quality scope identifies the extent (spatial, temporal, subset with defined characteristics, etc.) of the data that will be evaluated. Scope is constrained in metadata by the MD_ScopeCode element in ISO 19115:2003, which is the ISO standard for metadata, and shall be reported in the DQ_Scope element. Table 1 below contains the full list of ISO scope codes. Refer to section for the subset of scope code values that will be used for IGI&S data.

Table 1: Metadata Codes for Data Quality Scope (from ISO 19115)

MD_ScopeCode | Definition
attribute | information applies to the attribute value
attributeType | information applies to the characteristic of a feature
collectionHardware | information applies to the collection hardware class
collectionSession | information applies to the collection session
dataset | information applies to the dataset
series | information applies to the series
nonGeographicDataset | information applies to non-geographic data
dimensionGroup | information applies to a dimension group
feature | information applies to a feature
featureType | information applies to a feature type
propertyType | information applies to a property type
fieldSession | information applies to a field session
software | information applies to a computer program or routine
service | information applies to a capability which a service provider entity makes available to a service user entity through a set of interfaces that define a behaviour, such as a use case
model | information applies to a copy or imitation of an existing or hypothetical object
tile | information applies to a tile, a spatial subset of geographic data
metadata | information applies to metadata
initiative | information applies to an initiative
sample | information applies to a sample
document | information applies to a document
repository | information applies to a repository
aggregate | information applies to an aggregate resource
product | metadata describing an ISO 19131 data product specification
collection | information applies to an unstructured set
coverage | information applies to a coverage
application | information resource hosted on a specific set of hardware and accessible over a network

ISO Data Quality Elements

Once a scope is defined, a data quality element must be specified. The data quality element describes the specific aspect of data quality that is being evaluated. ISO 19157 divides data quality elements into six categories: completeness, logical consistency, positional accuracy, thematic accuracy, temporal accuracy, and usability. The following sections define these data quality elements in accordance with ISO 19157.

Completeness

Completeness is defined as the presence and absence of features, their attributes, and relationships. Completeness comprises the following elements:

Commission - data that is excess or additional beyond what is required.
Omission - data that is absent or missing from the data layer.

Logical Consistency

Logical consistency is defined as the adherence to the ruleset for the data structure, attribution, and relationships. Logical consistency consists of:

Conceptual consistency - adherence to rules of the conceptual schema.
Domain consistency - adherence of values to the value domains.
Format consistency - degree to which data is stored in accordance with the physical structure of the dataset.
Topological consistency - correctness of the explicitly encoded topological characteristics of a dataset.

Positional Accuracy

Positional accuracy is defined as the accuracy of the position of features within a designated spatial reference system. Positional accuracy includes both absolute and relative accuracy. Positional accuracy consists of the following data quality elements:

Absolute or external accuracy - the closeness of reported coordinate values to values accepted as true.
Relative or internal accuracy - the closeness of the relative positions of features in a dataset to their respective relative positions accepted as true.

Temporal Quality

Temporal quality is defined as the quality of the temporal aspect of the data layer. Temporal quality addresses the following elements:

Accuracy of a time measurement - closeness of reported time measurements to true values.
Temporal consistency - correctness of the order of events.
Temporal validity - validity of data with respect to time.

Thematic Accuracy

Thematic accuracy is defined as the accuracy or correctness of attributes. Thematic accuracy consists of:

Classification correctness - correctness of the classes of features or attributes.
Non-quantitative attribute correctness - correctness of an attribute.
Quantitative attribute accuracy - accuracy of an attribute compared to a known value.

Usability

The usability element is based on user requirements and is therefore less structured than the other elements. Usability can be used to describe specific quality information about a dataset's suitability for a particular application or conformance to a set of requirements. For SDSFIE-Q, usability is defined as the adherence to a data content specification.

Measures

A measure is a description of the type of evaluation being assessed to determine a data quality result. A measure will be evaluated for a specific data quality unit. Data quality measures can be derived from a basic measure (see Step 2 in Figure 2). A basic measure is a generic data quality measure used as a

basis for the creation of specific data quality measures. Basic measures are used for count-related and general statistical measures that share commonalities. Appendix A contains a summarized list of measures and their corresponding basic measures, where applicable, as identified in ISO 19157 Annex G. Examples of measures include:

Rate of missing items
Number of duplicate feature instances
Conceptual schema compliance
Number of invalid self-intersect errors
Number of incorrectly classified features

When creating measures that are not included in ISO 19157 Annex G, the user should structure the new measures using the standard basic measures. When measures are associated with a specific data quality unit (scope and element), they are referred to as metrics within this document (see section 3.4.1).

Evaluation Methods

An evaluation method is the procedure used to evaluate the defined measure for a data quality unit (see Step 3 in Figure 2). As described in ISO 19157, data may be evaluated using direct, indirect, or aggregation-based methods.

1. Direct evaluation requires a full inspection or sampling of the data, either with or without external reference data. The results of a direct evaluation will normally be quantitative. External direct evaluation indicates that external reference data is used for the evaluation. Internal direct evaluation indicates no external data was referenced.
2. Indirect evaluation involves using inherent knowledge along with existing information, such as the source and date, to determine the quality of the data. Indirect evaluation is not recommended for SDSFIE-Q since the results may be subjective, inconsistent, and not repeatable.
3. Data quality results may be aggregated or derived from the results of existing evaluations. These results can be aggregated from different scopes and data quality elements to provide an aggregated result.

Results

The output of a data quality evaluation is a result (see Step 5 in Figure 2). Each data quality element evaluated can have multiple results for a dataset. ISO 19157 identifies the following types of results:

Quantitative Result - A single value or multiple values based on the measure being evaluated.
Conformance Result - Used when the results of a data quality evaluation are compared with a known conformance quality level. The conformance quality level should be specified in a data content specification or user requirements.
Descriptive Result - A textual statement describing the subjective data quality evaluation result. Descriptive results are not recommended for SDSFIE-Q since the results are based on a subjective, inconsistent, and not repeatable evaluation method.
Coverage Result - The result of a data quality evaluation, organized as a coverage.

Metaquality

Metaquality is used to evaluate the results of a data quality evaluation, or to compare results from multiple data quality evaluations. Metaquality can evaluate confidence, representativity, or homogeneity of a dataset or multiple datasets.
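The relationship between a quantitative result and a conformance result can be illustrated with a small sketch: a direct evaluation produces a quantitative value, which is then compared against a conformance quality level. The 98% threshold and the function below are invented for illustration; actual conformance quality levels come from a DCS or user requirements, not from SDSFIE-Q itself.

```python
# Hypothetical sketch: derive a conformance result by comparing a
# quantitative result against a conformance quality level (threshold).
# The 0.98 threshold is an invented example, not a value from any DCS.

def conformance_result(value, threshold, higher_is_better=True):
    """Return a pass/fail conformance result for one data quality evaluation."""
    passed = value >= threshold if higher_is_better else value <= threshold
    return {"value": value, "threshold": threshold, "pass": passed}

# Quantitative result: 490 of 500 expected features are present (completeness).
rate_present = 490 / 500
result = conformance_result(rate_present, 0.98)
print(result["pass"])  # True
```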

ISO 19157 Data Quality Reporting

Reporting the results of a data quality evaluation provides users with information to determine the overall quality of a dataset, including compliance with data content specifications or satisfaction of other user requirements. The results of a data quality evaluation will be reported in metadata using the DQ_DataQuality element. A stand-alone data quality report may also be required if the data quality results are aggregated or derived. The stand-alone report should be used to provide additional details that cannot be expressed in the metadata.
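As a rough illustration of the information a DQ_DataQuality report carries, the nested structure below uses the element names from this section (DQ_Scope, a data quality element report, and a conformance result). It is a simplified sketch, not a conformant ISO 19157 or SDSFIE-M encoding; the nesting, field names, and values are assumptions for illustration only.

```python
# Simplified, non-conformant sketch of the content of a DQ_DataQuality
# metadata report. Element names follow ISO 19157 usage in this document;
# the dictionary nesting and the sample values are illustrative only.

dq_data_quality = {
    "DQ_Scope": {"level": "featureType"},        # extent of the data evaluated
    "report": [{
        "element": "completeness omission",       # data quality element evaluated
        "measure": "rate of missing items",
        "DQ_ConformanceResult": {
            "specification": "hypothetical data content specification",
            "pass": True,                         # met the conformance level
        },
    }],
}

# Aggregated or derived results would additionally warrant a stand-alone
# report carrying detail that cannot be expressed in these elements.
print(dq_data_quality["DQ_Scope"]["level"])  # featureType
```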

3 IGI&S Quality Management Framework

The SDSFIE quality management framework for IGI&S data and services is built on the components of data quality management established in ISO 19157 and discussed in the previous section. This framework consists of defined IGI&S roles (clear boxes in Figure 4) corresponding to various objectives (elements) described in SDSFIE-Q. The elements of the quality management framework, as shown in Figure 4, are grouped into four categories:

- Quality Management Tools (section 3.2)
- Data Generation (section 3.3)
- Data Evaluation (section 3.4)
- End Use (section 3.5)

Figure 4: IGI&S Quality Management Framework

Most SDSFIE-Q roles and responsibilities flow from the authorities described in section 1.2, specifically from DoDI . These roles and responsibilities are outlined in Table 2 through the use of a responsibility assignment matrix (RAM), also known as a RACI matrix. A RACI matrix describes the participation of various roles in completing tasks or deliverables for a project or business process.2 RACI is an acronym derived from the four key responsibilities used in this document:3

- Responsible (R): Roles that have a significant responsibility in performing the objective, as well as government roles that have responsibility over an objective.
- Approval (A): Roles that have review and approval authority. There may be several levels of review in the same objective.
- Consulted (C): Roles that collaborate and consult at some point during the process but do not have approval authority.
- Informed (I): Roles that are socialized concerning the activities or artifacts but do not have approval, review, or consulting responsibilities.

2 Margaria, Tiziana (2010). Leveraging Applications of Formal Methods, Verification, and Validation: 4th International Symposium on Leveraging Applications, ISoLA 2010, Heraklion, Crete, Greece, October 18-21, 2010, Proceedings, Part 1. Springer.
3 Note that the set selected for this document is the one most commonly used for decision-making. The A is often used to mean Accountable, but in our case Approval is a better choice.

Table 2: Data Quality RACI Chart

Codes for each objective are listed in role order: IGI&S Governance Group (IGG) | OASD(EI&E) Geospatial Information Officer (GIO) | Component IGI&S Programs | Data Quality WG | ADS Manager | Policy Proponents | Data Requirements Proponents | Project-Based Data Generator | IGI&S Program Analysts | Users. The applicable SDSFIE parts (V, R, S) are shown in parentheses after each objective.

Quality Management Tools
- Develop Quality Management Plans (V,R,S): A | A | R,A | C | C | C | C | I | I | I
- Establish and Maintain Authoritative Data Source (V,R,S): I | I | A | I | R | C | C | I | I | I
- *Develop Data Content Specifications (V,S): A | R | R | C | C | C | C | I | I,C | I
- Specify SDSFIE Quality Contract Language (V,R,S): I | I | C | I | C | C | C | A,R | I | I

Data Generation
- *Create and Acquire Data (V,R,S): A | R | R | C | R | I | R | R | R | I
- *Maintain Data (V,R,S): I | R | R | C | I | I | R | R | R | I
- *Create Metadata (V,R,S): A | R | R | C | I | C | R | R | R | I

Data Evaluation
- *Establish Metrics (V,R,S): A | R | R,A | R,C | R | C | C | I | I | I
- Evaluate Data Quality (V,R,S): I | I | R,A | I | R | I | I | R | R | I
- *Report Quality Results (V,R,S): A | R | R | C | R | R | I

End Use
- *Determine Releasability (V,R,S): I | R | R | I | I | R | C | R | R | I
- *Make Data Discoverable (V,R,S): I | R | R | I | R | I | R | R | R | I
- Provide Feedback (V,R,S): C | C | R,A | C | C | R | R | R | R | R

Note: Responsibilities with an asterisk (*) denote objectives that occur at both the OSD level and the Component level, and thus may have different responsibilities for a given role at each level.
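A RACI chart lends itself to a simple machine-readable form. The sketch below encodes the "Evaluate Data Quality" row of Table 2 and queries it; the role-to-code matching assumes the codes appear in the same order as the role columns, and the data structure itself is an illustration, not part of the standard.

```python
# Illustrative encoding of one RACI row ("Evaluate Data Quality") so that
# responsibility assignments can be queried. Assignments follow Table 2,
# assuming codes are listed in role-column order; this representation is
# not defined by SDSFIE-Q.

RACI = {
    "Evaluate Data Quality": {
        "IGG": {"I"},
        "GIO": {"I"},
        "Component IGI&S Programs": {"R", "A"},
        "Data Quality WG": {"I"},
        "ADS Manager": {"R"},
        "Policy Proponents": {"I"},
        "Data Requirements Proponents": {"I"},
        "Project-Based Data Generator": {"R"},
        "IGI&S Program Analysts": {"R"},
        "Users": {"I"},
    },
}

def roles_with(objective: str, code: str) -> list:
    """Return the roles holding a given RACI code for an objective."""
    return sorted(role for role, codes in RACI[objective].items() if code in codes)

print(roles_with("Evaluate Data Quality", "A"))  # ['Component IGI&S Programs']
```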

Defined Roles

The roles identified in data quality management are as follows:

IGI&S Governance Group (IGG): The IGG is responsible for establishing data quality guidance to promote the adoption and refinement of SDSFIE-Q. The IGG will maintain the metadata content standard and provide guidance for using the metadata standard to describe data quality. The IGG will create guidance for, and review, Component quality management plans in accordance with SDSFIE-Q.

OASD(EI&E) Geospatial Information Officer (GIO): The role of the GIO is established in DoDI . Specific responsibilities involve leadership and support of the IGG regarding data quality processes and quality management plans, leadership of the Data Quality Working Group, and coordination with the DoD GEOINT Manager. The GIO is also responsible for publishing data content specifications at the OSD level in coordination with the Data Quality Working Group.

Component IGI&S Programs: The DoD Component IGI&S Programs are responsible for coordinating and directing the creation, maintenance, and distribution of IGI&S data at the Component level. Component IGI&S Programs will develop Component-level data quality metrics and guidance through quality management plans and data content specifications.

Data Quality Working Group: The IGG created the Data Quality Working Group to establish enterprise processes and management structures for evaluating data quality. Data Quality Working Group responsibilities include:

- Formulate an enterprise approach to IGI&S data quality.
- Develop a quality management framework that can be integrated with IGI&S Program management processes.
- Identify IGI&S requirements for data quality and establish measures and processes to evaluate data maturity and compliance.

Authoritative Data Source (ADS) Manager: A person, office, or organization responsible for maintaining the IGI&S ADS for each installation.
The ADS Manager role is Component-specific; refer to Component guidance for more information about how the role is implemented.

Policy Proponents: A policy proponent is a DoD organization that has responsibility for enterprise data requirements and guidance, typically at the Component headquarters or OSD level. Policy proponents are not normally responsible for creating and maintaining spatial data.

Data Requirements Proponents: An agency, department, activity, or organization that has primary responsibility for material or subject matter expertise in its area of interest and lead responsibility for coordinating the collection, coverage, and stewardship, including maintenance and update, of a specific spatial data theme or mission dataset. Data requirements proponents are not typically in the direct line of authority of the IGI&S Programs, but they create and maintain spatial data. Relationships must be developed with data requirements proponents so that the data they create is compliant with enterprise spatial data requirements. Data requirements proponents are responsible for providing clear data requirements and specifications for data.

Project-Based Data Generator: The government official responsible for generating and approving contract arrangements for the creation and maintenance of IGI&S data. Care must be taken to ensure that contracted data creation and maintenance services are controlled and that the quality of all deliverables is reviewed so that they meet specifications and enterprise standards. The government bears some quality assurance responsibility in all contracted arrangements, but contractors should be held responsible for understanding data requirements and specifications.

IGI&S Program Analysts: Spatial data are frequently created and maintained by in-house personnel. Sometimes these persons are under the direct control of an IGI&S program, but they can also be controlled by other business lines or mission areas (e.g. public works, environmental). Care must be taken to ensure that data creation and maintenance activities are controlled and that the quality of all deliverables is reviewed so that they meet all spatial data creation and maintenance metrics and guidance. Installation personnel who create or maintain IGI&S spatial data are responsible for understanding data requirements and specifications.

Users: Data users are those persons who need to discover spatial data resources, develop content, or make information release decisions. Spatial data end use can be constrained by certain quality considerations, and end users bear responsibilities associated with these considerations, such as:

- Consulting metadata to know the specific constraints on the data being used.
- Understanding the implications of spatial data constraints on decision making.
- Reviewing data content products before they are released.

Quality Framework Elements: Quality Management Tools

Quality management tools are methods used to define and execute quality control and assurance through guidance, contract language, data content specifications (DCS), and metrics.
The following quality management actions are identified in SDSFIE-Q:

- Establish and Maintain ADS (section 3.2.1)
- Develop Quality Management Plans (section 3.2.2)
- Develop Data Content Specification (DCS) (section 3.2.3)
- Specify SDSFIE Quality Contract Language (section 3.2.4)

Authoritative Data Source (ADS)

In accordance with DoDI , the Components shall establish a trusted authoritative data source (ADS) for all geospatial data and products produced, acquired, or maintained to fulfill EI&E missions for each installation. Components will determine how to implement the ADS, including the level at which the ADS resides (i.e. headquarters, MAJCOM, or installation). The ADS is expected to contain vector and raster data, especially those data which are essential to implementation of the DoD policies and requirements described in DoDI as being related to the ADS requirement. The IGG intends to develop implementation guidance for the ADS, which will provide

further detail on what the ADS will contain and thus what quality measures or processes should be applied to it. Components shall ensure that data and products included in their ADS pass validation checks for data quality compliance as established in the ADS Implementation Guidance and as may be defined in the Component's quality management plan. Because the ADS is expected to include IGI&S data created or acquired through contracts or projects not directly executed by the IGI&S Programs, it is important that each Component develop processes by which such data can be standardized, normalized, and validated for conformance with all applicable standards. Ideally this validation should occur before the data is accepted into the ADS, but it can occur at a later stage. Quality evaluations should also be performed when ADS content is maintained or updated. Aggregating data within an ADS can greatly aid the screening of IGI&S data to identify data quality issues and rapidly implement corrections.

The following metrics for ADS compliance will be evaluated in SDSFIE-Q:

- Pass or Fail: The Component ADS passed validation checks specified in the ADS Implementation Guidance (see section )
- The percentage of installations for which the Component has established ADSs
- The percentage of installations covered by imagery with a raster resolution compliant with SDSFIE-R standards
- The percentage of installations covered by imagery with a raster temporal accuracy compliant with SDSFIE-R standards
- The percentage of required data layers (as specified in Component guidance) not included in the Component ADS

See Appendix B for the full list of IGG Data Quality Metrics.

Quality Management Plans

Each DoD Component shall develop a quality management plan to ensure the consistency and quality of IGI&S data and services.
The quality management plan shall guide all levels of the Component with respect to implementation of SDSFIE-Q, providing mission-specific guidance, metrics, or other direction as needed. The quality management plan shall include the following information:

- A description of how the Component is implementing ADSs for each installation, including the level at which the ADS(s) reside.
- The processes by which the Component will ensure each ADS is standardized, normalized, and validated for conformance with all applicable standards.
- Roles and responsibilities for data quality management at the Component level.
- Data quality units in addition to those established in SDSFIE-Q.
- Metrics and associated measures reported to the Component headquarters, and the process for identifying and establishing new metrics. Components should refer to the standardized ISO 19157 measures listed in Appendix A to aid in developing new metrics requirements.
- Data content specifications, as required in SDSFIE-Q, defined at the HQ level, included directly or incorporated by reference.
- The process for collecting and maintaining layer metadata.

Component quality management plans will be reviewed and validated by the IGG through the document approval process defined in the SDSFIE Governance Plan (section 3.4.1) to ensure they are consistent with DoD policy and IGG guidance and processes.

Quality Management Plan Implementation Scorecard

Each Component's progress toward implementation of SDSFIE-Q will be evaluated using the following Implementation Scorecard categories, as established in the SDSFIE Governance Plan (section ):

- Green: A quality management plan exists and a high level of implementation progress has been made.
- Yellow: A quality management plan exists and a significant level of implementation progress has been made.
- Red: No quality management plan exists, or little or no implementation progress has been made.

The timing of scorecard implementation will be determined by future IGG guidance and procedures.

Data Content Specification (DCS)

OSD and Component IGI&S Programs shall create DCSs for IGI&S vector data. A DCS (alternatively known as a Quality Assurance Plan (QAP), Data Layer Specification (DLS), or Data Content Guidance (DCG)) documents the format and quality expectations for a geospatial data layer (or set of layers). The quality criteria in the DCS shall be used to direct and control quality during the generation, collection, or maintenance of geospatial features. The DCS shall also specify metrics and data quality evaluation procedures for each layer. See section 5 for DCS format, content, and quality reporting guidance.

SDSFIE Quality Contract Language

Data quality is inextricably linked with data production or procurement. Whenever IGI&S data or services are procured, DoD activities should use contract language that specifies the data collection and maintenance expectations in acquisition-binding terms. The contract provisions and requirements should ensure deliverables conform to all standardization and quality requirements specified in the SDSFIE family of standards, including SDSFIE-Q. For example, DoD contracts for installation geospatial data, installation master plans, or integrated natural resources management plans (INRMPs) should include a section requiring that GIS data used by the contractor and submitted to the government in any final deliverables be formatted in accordance with the Component's current, registered Adaptation of SDSFIE-V and applicable DCSs.
The deliverables should include metadata formatted in accordance with the current version of SDSFIE-M, populated to conform to SDSFIE-Q requirements as well as any applicable Component quality management plans. Annex C of this document (to be published) contains more specific guidelines for citing SDSFIE standards in DoD contracts pertaining to IGI&S.

Quality Framework Elements: Data Generation

Data generation includes all activities associated with creating or acquiring geospatial data and metadata resources, and generating geospatial services. Data can be created or obtained using internal or contracted resources.

Create and Acquire Data

Spatial data products can be created, generated, or acquired in many ways. Examples include collection by analysts in the field, heads-up digitizing, analysis performed on another data layer, or purchase.

Maintain Data

Data maintenance includes routinely assessing that data is current and complete, and enforcing version control. The data steward is responsible for ensuring that the data is maintained at an interval established by the Component IGI&S Program.

Create Metadata

All IGI&S data and services should be properly documented with SDSFIE-M compliant metadata in accordance with the SDSFIE-M Implementation Guidance. Metadata shall be populated when data is created (or acquired). Metadata shall be reviewed and updated, as necessary, when data is maintained. For data quality reporting guidance using metadata, refer to section .

Quality Framework Elements: Data Evaluation

Quality Metrics

The IGG, Component IGI&S Programs, and ADS Managers are responsible for identifying and establishing data quality metrics based on policy drivers and business requirements. Metrics are specific, measurable indicators that address data quality compliance. Metrics are created when measures (as defined by ISO 19157) are assigned a corresponding scope, as shown in the example in Figure 5 below. In this example, the number of buildings reported in the Real Property Asset Database (RPAD) is compared to the number of buildings included in the building feature type of the Common Installation Picture (CIP) dataset. The data quality unit consists of the feature type scope and the omission data quality element. The rate of missing items is being measured, which is based on the error rate basic measure. Since the measure is associated with a scope, a metric is created: the percentage of buildings reported in the RPAD that do not have a corresponding geospatial representation.

Figure 5: Quality Measures vs. Quality Metrics

Data quality metrics shall be defined for each data quality unit established in SDSFIE-Q, DCSs, or Component quality management plans. A list of identified OSD and Component level IGI&S data quality metrics, including their scope, measures, and evaluation methods, is included in Appendix B. Components may include additional metrics in their Component-level quality management plans, such as those required for Component headquarters reporting. A summarized list of standard measures established in ISO 19157, including the associated basic measure (where applicable), is included in Appendix A. This list should be used for reference when creating new metrics.

IGI&S Data Quality Scope

When creating metrics and reporting data quality results, the data quality scope must be identified. The types of metrics defined for IGI&S data depend on these identified scope levels.
The data quality scope of the data quality unit defines the extent of the data being evaluated. This is normally equivalent to the scope of the metadata record; however, the data quality scope must be at the same or a lower level of the hierarchy than the scope of the metadata record. In SDSFIE-Q, the data quality result scope will always be identical to the data quality scope, due to the limitations of the standard procedures and tools for collecting metadata. Table 3 lists the data quality scopes and corresponding administration levels

established in SDSFIE-Q. Components shall establish additional scopes that are not included in this list, such as feature and attribute, in their quality management plans and DCSs. The following scopes, listed in hierarchical order, are defined for IGI&S data:

- Repository: For the purposes of SDSFIE-Q, the installation ADS is a data repository. Metrics that measure the quality and completeness of the ADS as a whole will be evaluated at this scope. Data quality results reported at the aggregate or feature type level may also be derived or aggregated for the repository.
- Aggregate: Data and services that are compiled at the OSD level (i.e. the complete CIP dataset) or are necessary for Component or IGG reporting processes shall be reported using the aggregate data quality scope. Metrics that measure the quality of the aggregated dataset (i.e. the CIP) as a whole will be reported at this scope. Quality results reported at the feature type scope could be rolled up to the aggregate scope.
- Dataset (Vector): The dataset scope is used for vector data only when the data being evaluated is ad hoc and not part of an aggregated dataset or repository. The dataset scope will not be used when measuring CIP and ADS datasets.
- Feature Type: Data quality that is measured at the layer level shall be reported using the feature type scope. Layer-level metrics required by OSD or the Component headquarters will be outlined in OSD or Component-level DCSs for vector data. Feature type data quality results may be rolled up to the aggregate or repository level.
- Dataset (Raster): Data quality measured for raster data may be reported as a dataset in DQ_Scope.
- Document: Data quality measured for supporting documentation, such as a stand-alone data quality report or a Component quality management plan, will be reported as a document in DQ_Scope.
- Metadata: When reporting the quality of metadata associated with IGI&S data and services, the metadata scope should be used.
Table 3: Data Quality Scope Codes for IGI&S Data

| Administration Level | Data Quality Scope | SDSFIE Part | Use Description |
|---|---|---|---|
| ADS | dataset (raster) | R | To create metrics and report data quality for the imagery associated with an ADS. |
| ADS | featuretype | V | To create metrics and report data quality for individual feature types within an ADS. |
| ADS | repository | V,R,S | To create metrics and report data quality for the ADS as a whole. |
| Component HQ | dataset (raster) | R | To create metrics and report data quality for raster datasets created by the Component. |
| Component HQ | featuretype | V | To create metrics and report data quality for individual Component CIP layers or feature types in ad hoc datasets. |
| Component HQ | service | S | To create metrics and report data quality for Component services. |
| Component HQ | metadata | V,R,S | To create metrics and report data quality for metadata. |
| Component HQ | document | V,R,S | To create metrics and report data quality for a Component document (Component Quality Management Plans, DCS). |
| Component HQ | aggregate | V | To create metrics and report data quality for the Component CIP as a whole. |
| Component HQ | dataset (vector) | V | To create metrics and report data quality for non-CIP or ADS vector datasets comprised of more than one feature type. |
| OSD | featuretype | V | To create metrics and report data quality for individual CIP layers or feature types in non-CIP ad hoc vector datasets. |
| OSD | service | S | To create metrics and report data quality for services. |
| OSD | metadata | V,R,S | To create metrics and report data quality for metadata. |
| OSD | aggregate | V | To create metrics and report data quality for the CIP as a whole. |
| OSD | dataset (vector) | V | To create metrics and report data quality for non-CIP vector datasets comprised of more than one feature type. |

Process for Establishing Metrics

The IGG shall determine which metrics are included in SDSFIE-Q. The IGG will review and revise metrics annually, but will consider ad hoc revisions outside of the annual cycle as needed. As EI&E business requirements or other drivers for IGI&S quality are identified, IGG members may request new metrics for inclusion in SDSFIE-Q, following the IGG consensus process (outlined in the SDSFIE Governance Plan, v1.0, section ). To propose a new metric, the submitter must provide a request to the IGG, including the following information:

1. Description of Metric: What data quality element is being measured, what measure and basic measure are used, how often, by whom, and reported to whom.
2. Scope: The part of the standard (i.e. vector, raster, or services) and the scope of the data (see section ) to which the metric applies.
3. Applicability: The administrative level (i.e. Component, ADS, headquarters) to which the metric applies.
4. Justification: The business requirement(s) driving the recommendation. The submitter should specify either the OSD policy driver or the Component requirement.
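A metric proposal of the kind described above could be captured as a simple structured record. The sketch below mirrors the four required items; all field names and values are hypothetical and do not correspond to any real IGG submission.

```python
# Hypothetical metric proposal record mirroring the four required items
# (description, scope, applicability, justification). All values are
# invented for illustration.

metric_request = {
    "description": {
        "data_quality_element": "completeness omission",
        "measure": "rate of missing items",
        "basic_measure": "error rate",
        "frequency": "annual",
        "evaluated_by": "Component IGI&S Program",
        "reported_to": "IGG",
    },
    "scope": {"sdsfie_part": "V", "data_quality_scope": "featuretype"},
    "applicability": "Component",
    "justification": "Hypothetical Component headquarters reporting requirement.",
}

# Minimal completeness check before submission to the IGG consensus process.
required = {"description", "scope", "applicability", "justification"}
print(required <= metric_request.keys())  # True
```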
Once the new metric is reviewed through the IGG consensus process, it will either be considered during the annual review period (see section ) or identified as an ad hoc requirement (see section ).

Annual Metrics Review Process

The IGG will meet annually to review all properly submitted metric revision requests through the IGG consensus process. The IGG members will make approval recommendations for each revision request, which the IGG Chair will use to determine the proper action to follow. The following recommendations may be made:

a. To accept an SDSFIE-Q revision.
b. To implement the revision at the Component level.
c. To reject the revision.

If the revision is approved by the IGG, the DISDI Program will draft a revised SDSFIE-Q document. The IGG members will review and comment on the updated document in accordance with the document review process (established in the SDSFIE Governance Plan v1.0, section 3.4.1). Once IGG consensus is reached, a new version of SDSFIE-Q will be recommended for approval by the ASD(EI&E).

Ad Hoc Metrics Review Process

If an IGG member identifies a business requirement for a new metric and either the IGG or the Component submits a justification for implementing the metric outside of the annual review cycle, the IGG will follow the Ad Hoc Metric Process. The submitted request shall follow the guidelines established in

section . The IGG members will make an approval recommendation, which the IGG Chair will use to determine the proper action to follow. The following recommendations may be made:

a. To accept an SDSFIE-Q revision outside of the annual review cycle.
b. To accept the change and allow Component implementation, but defer the revision of SDSFIE-Q until the annual review period.
c. To implement the revision at the Component level.
d. To reject the revision.

Evaluate Data Quality

IGI&S data should be routinely assessed through validation and defined data quality evaluation processes, which evaluate the metrics established in SDSFIE-Q, DCSs, and Component quality management plans. These processes will identify deficiencies that should result in action plans to correct them. Corrective actions could include isolation, replacement, or correction of data. DoD Components or OSD may require the use of automated tools for assessment. Refer to Component-level guidance for Component-specific tool requirements.

IGI&S Data Evaluation Methods

Data quality metrics for IGI&S data and services may be assessed using evaluation methods identified in this document and in Component quality management plans. As described in section 2.2.2, data may be evaluated using direct or aggregation-based methods. Indirect evaluation methods are not recommended for SDSFIE-Q due to the subjectiveness of the evaluation. Examples of each evaluation method include, but are not limited to:

Direct evaluation examples:
- Direct external evaluation will be used to identify the number of excess features in the CIP Building layer relative to the buildings reported in the Real Property Inventory. The RPAD has a specific number of buildings that can be measured against the CIP Building layer. In this example, the scope is feature type and the data quality element being evaluated is completeness commission. The evaluated measure is the number of excess items.
- Another example: if the scope is the repository (ADS) level and the data quality element being evaluated is completeness omission, an error count measure will be evaluated using a direct internal evaluation method to report in the metadata whether the ADS is missing any required layers.

Aggregation and derivation examples:
- Aggregation would be used to determine the quality of the CIP dataset by aggregating the data quality results from each Component CIP submittal. Quality measures of a Component CIP can be aggregated from the data quality results provided by the individual installations. Using this method, quality results reported at the feature type scope could be rolled up to the aggregate scope.

IGI&S Data Quality Validation

At times it is necessary to validate the results of a data quality evaluation against a minimum acceptable level of compliance. The minimum acceptable level of compliance must be defined in the appropriate implementation guidance for the data quality unit being evaluated (i.e. quality management plans, DCSs, or the ADS Implementation Guidance). For a dataset to pass a validation check, it must meet or exceed the minimum acceptable data quality result for the data quality unit. Refer to Parts 2, 3, and 4 of this document for specific validation requirements and resources that apply to each SDSFIE part (vector, raster, or services).
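The aggregation and validation steps described above can be sketched as follows: feature-type results are rolled up to an aggregate scope, and the aggregated result is validated against a minimum acceptable level. The count-weighted roll-up and the 95% threshold are assumptions for illustration; actual aggregation rules and acceptable levels would come from quality management plans, DCSs, or the ADS Implementation Guidance.

```python
# Illustrative roll-up of feature-type data quality results to an aggregate
# scope, followed by a validation check against a minimum acceptable level.
# The layer names, counts, weighting scheme, and threshold are hypothetical.

feature_type_results = [  # hypothetical per-layer completeness results
    {"feature_type": "Building", "items": 400, "pass_rate": 0.98},
    {"feature_type": "Road",     "items": 250, "pass_rate": 0.96},
    {"feature_type": "Fence",    "items": 100, "pass_rate": 0.90},
]

def aggregate_pass_rate(results) -> float:
    """Count-weighted aggregate of per-feature-type pass rates."""
    total = sum(r["items"] for r in results)
    return sum(r["items"] * r["pass_rate"] for r in results) / total

def passes_validation(results, minimum: float = 0.95) -> bool:
    """The dataset passes if the aggregated result meets the minimum level."""
    return aggregate_pass_rate(results) >= minimum

print(round(aggregate_pass_rate(feature_type_results), 4))  # 0.9627
print(passes_validation(feature_type_results))              # True
```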

Report Quality Results - Metadata

Component IGI&S Programs and ADS Managers are responsible for consolidating and reporting data quality evaluation results to Component headquarters or OSD. This information shall be reported by populating the data quality metadata elements defined by SDSFIE-M. A data quality report may also be produced, but the report will not replace the metadata reporting requirement. SDSFIE Metadata (SDSFIE-M) is the metadata specification for all IGI&S data and shall be adhered to per the SDSFIE-M Implementation Guidance. Data quality metadata (DQ_DataQuality) and the stand-alone report are not mandatory for all IGI&S data, but may be required depending on the SDSFIE part (vector, raster, or services) or dataset being evaluated. Refer to SDSFIE-M section for a detailed description of the data quality metadata elements.

The metadata must include the scope being evaluated (DQ_Scope). At a minimum, either the lineage element (LI_Lineage) or a report for at least one data quality element (DQ_Element) shall also be populated. The data quality reporting elements are described in detail in section . If DQ_Element is populated, a conformance result (DQ_ConformanceResult), quantitative result (DQ_QuantitativeResult), or coverage result (QE_CoverageResult) must be populated for each report. A coverage result (QE_CoverageResult) is the result of a data quality measure organizing the measured values as a coverage. An additional stand-alone quality report is recommended if any of the data quality results are aggregated or derived from other datasets or subsets of the dataset. Appendix C provides example metadata records with data quality elements populated. Elements that are mandatory in SDSFIE-M are marked with an (M).

SDSFIE-Q Metrics Registry

All metrics established in SDSFIE-Q are assigned an SDSFIE-Q metric identifier (e.g. IGG3, DCS25).
This identifier correlates to the scope, ISO 19157 standard measure (where applicable), evaluation method, and acceptable quality level for each metric. When reporting a data quality result for a metric in the metadata, the SDSFIE-Q metric ID must be recorded in MD_Identifier:code. Using the SDSFIE-Q metric ID eliminates the need to populate multiple metadata elements and standardizes the reporting method. The SDSFIE-Q metric IDs will be generated and stored in an accessible on-line registry in accordance with SDSFIE-M. These IDs are currently provided in Appendix B, as well as in other supporting SDSFIE-Q documentation (e.g. DCS General Guidance (Annex A)).

Quality Framework Elements: End Use

The Component IGI&S Programs are not always the end users of their spatial data. In accordance with DoD policy, spatial data should be made visible, accessible, understandable, trusted, and interoperable throughout its lifecycle for all authorized users (to the maximum extent allowed by law or DoD policy). The following end use actions are identified:

- Determine Releasability (section 3.5.1)
- Make Data Discoverable (section 3.5.2)
- Provide Feedback (section 3.5.3)

Releasability

Component IGI&S Programs will develop and disseminate guidance for spatial data storage, handling, and release according to information assurance and OPSEC policy (see references cited in DoDI regarding handling of controlled unclassified IGI&S, such as DoDD E, DoDI , Volume 4 of DoD Manual (DoDM) , and DoDI ). Data users are responsible for understanding and adhering to applicable storage, handling, and release guidance.

Discoverability

Data stewards have a responsibility to make spatial data available and discoverable and to provide information to the end user about the quality of the data. Components shall ensure that data and services are available to the DoD community in accordance with OPSEC policy. Data and metadata can be made discoverable through Component IGI&S portals, map viewers, websites, or web map services.

Reporting Discoverability

Reporting metrics for discoverability requires an analysis of how the data or services are being used and by whom. Data and services are made accessible to users in a variety of ways, for example by direct request from the Component IGI&S Program, by direct download through a website function, or by access to a web map service (hosted by the Component IGI&S Program or another authorized source). Discoverability metrics should be reported by assessing the number of users who are downloading, viewing, or accessing the data within a one-year period. When IGI&S data or services are rehosted by authorized organizations, the rehosting organization should provide the Component with an estimated number of users. Recommended methods for estimating usage information include:

- The number of users that requested the data directly from the Component IGI&S Program.
- The number of users that requested access to a service directly from the Component IGI&S Program.
- The number of authorized users that have active accounts and access to a Component map viewer.
- The number of connections (new and existing) that are made to a map viewer.
- The number of connections (new and existing) that are made to a service.
- The estimated number of users accessing the data or service through an authorized host.
- The number of hits to a website within a specified time period.

Feedback

A valuable tool for maintaining data quality is feedback from users.
This means that data must first be accessible and understandable to users; users must then be given mechanisms by which to comment on the quality and completeness of the data. Component IGI&S Programs shall have a feedback mechanism in place to capture end user comments and questions. Examples of feedback mechanisms include a website function, an attribute built into a database, or comments received via a help desk or phone number. Users of the data at all administration levels are responsible for providing feedback.

The following metrics for Feedback will be evaluated in SDSFIE-Q:

- Pass or Fail: The Component has a user feedback mechanism (feedback, customer surveys, etc.) for all data discovery methods

See Appendix B for the full list of IGG Data Quality Metrics.

The following metrics for Discoverability will be evaluated in SDSFIE-Q:

- Pass or Fail: The Component has an enterprise-level map viewer
- Pass or Fail: The Component has a process and mechanisms for data requests and data delivery
- The percentage of CIP data layers not visible on the Component enterprise map viewer
- The percentage of CIP data layers not accessible through a web map service
- The percentage of data layers not available to fulfill a validated request
- The number of data requests for each CIP layer within a one-year period
- The number of users viewing each CIP layer annually
- The number of users accessing or downloading each CIP layer within a one-year period

See Appendix B for the full list of IGG Data Quality Metrics.
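The annual usage counts called for above can be derived from access records. The sketch below assumes a hypothetical log of (user, layer, date) tuples; real sources would be map viewer accounts, service connections, or download logs, as described in the Reporting Discoverability guidance.

```python
from collections import defaultdict
from datetime import date, timedelta

def annual_users_per_layer(access_log, as_of):
    """Count distinct users that accessed each layer in the trailing year.

    `access_log` is a hypothetical list of (user_id, layer, date) records;
    the function and its record layout are illustrative, not part of SDSFIE-Q.
    """
    window_start = as_of - timedelta(days=365)
    users = defaultdict(set)
    for user_id, layer, when in access_log:
        # Only count accesses inside the one-year reporting window.
        if window_start <= when <= as_of:
            users[layer].add(user_id)
    # Each user is counted once per layer, however often they return.
    return {layer: len(ids) for layer, ids in users.items()}

log = [
    ("u1", "RealProperty", date(2016, 6, 1)),
    ("u2", "RealProperty", date(2016, 7, 4)),
    ("u1", "RealProperty", date(2016, 8, 2)),   # repeat visit, counted once
    ("u3", "Environmental", date(2014, 1, 15)), # outside the one-year window
]
print(annual_users_per_layer(log, as_of=date(2016, 12, 12)))
# {'RealProperty': 2}
```

The same aggregation applies whether the metric counts viewers, downloaders, or service connections; only the log source changes.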

Part 2: SDSFIE-V Data Quality Guidance

4 SDSFIE-V Introduction

SDSFIE-V is the vector IGI&S data model. This includes the current SDSFIE Specification Document, as well as the current Gold Logical Data Model and all Component HQ-level Adaptation Logical Data Models. SDSFIE-V data quality shall be evaluated for compliance with ADS requirements, OSD requirements, DCSs, and Component quality management plans. Part 2 of this document describes the data quality assessment, guidance, and reporting requirements that are specific to vector data.

IGI&S Vector Datasets

Vector data used in IGI&S Programs exists in several forms: the CIP, the Component CIP, additional data that conforms to the SDSFIE-V standard, data that is included in the Component ADS but is not SDSFIE-V compliant, and data that is created or acquired for a specific purpose or project. Vector datasets that are required for upward reporting must adhere to more stringent collection and quality requirements than datasets that are project specific.

Common Installation Picture (CIP) Dataset

The CIP dataset is a collection of Component IGI&S geospatial data and metadata. The purpose of the CIP is to provide a readily available, standardized map background to serve as the basis for planning and execution of EI&E missions, as described in DoDI . Typical CIP layers include real property, master planning, infrastructure, and environmental data. The layers included in the CIP dataset are aggregated at the OSD level and, therefore, additional data quality requirements apply. Metrics for the aggregated CIP dataset will be evaluated at the OSD level. The DCS for each layer defines metrics that are specific to the layer.
5 SDSFIE-V Data Content Specifications (DCS)

The following metrics for the aggregated CIP dataset will be evaluated in SDSFIE-Q:

- The percentage of data layers required for the CIP that were submitted by each Component
- The percentage of reported Real Property assets that do not have a geospatial representation in the CIP
- The percentage of CIP layers that passed all validation checks required by the DCSs
- The percentage of CIP data layers that have SDSFIE-M compliant metadata records
- The percentage of CIP layers that do not have metadata with at least one data quality reporting element populated

See Appendix B for the full list of IGG Data Quality Metrics.

Members of the IGI&S Governance Group (IGG) are expected to develop their own DCS documents (QAPs, DLSs, DCGs) in order to standardize the collection and maintenance of IGI&S data. A DCS is a detailed description of a dataset or data layer, together with additional information that enables the dataset to be created, supplied to, and used by other organizations. A DCS also provides essential information needed to properly structure metadata records, including well-defined metrics and corresponding evaluation methods. The DISDI Program will create a DCS at the OSD level for each data layer in SDSFIE Gold, as well as a DCS General Guidance document. Component DCSs must be established for all data layers as described in Component quality management plans. An example DCS is included for reference in Annex B.

The following metrics for DCSs will be evaluated in SDSFIE-Q:

- The percentage of data layers that are required for collection by a Component headquarters that do not have a DCS
- The percentage of data layers required for the CIP that do not have a DCS
- The percentage of data layers in SDSFIE-V Gold that do not have a DCS

See Appendix B for the full list of IGG Data Quality Metrics.
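Each DCS metric above is a coverage percentage: required layers compared against the set of layers that have a published DCS. A minimal sketch (function name and layer names are illustrative, not from the standard):

```python
def pct_layers_missing_dcs(required_layers, layers_with_dcs):
    """Percentage of required layers that have no published DCS.

    Illustrative helper: in practice the inputs would come from the CIP
    requirement list and the published OSD or Component DCSs.
    """
    required = set(required_layers)
    missing = required - set(layers_with_dcs)
    # Guard against an empty requirement list.
    return 100.0 * len(missing) / len(required) if required else 0.0

required = ["Building", "Bridge", "Runway", "Fence"]
published_dcs = ["Building", "Runway"]
print(pct_layers_missing_dcs(required, published_dcs))  # 50.0
```

The same comparison, with different input lists, yields the Component HQ, Component CIP, CIP, and SDSFIE-V Gold variants of the metric.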

DCS General Guidance

The DCS General Guidance document (SDSFIE-Q Annex A) defines data collection requirements, metadata requirements, data handling procedures, releasability requirements, and feedback mechanisms that are applicable to all feature types in SDSFIE-V Gold or registered Adaptations thereof. This document also includes data collection guidance that applies to subsets of SDSFIE-V data (e.g. CIP or real property feature types). The DCS General Guidance document shall be used in conjunction with the feature type-specific DCSs. It also includes metrics and evaluation methods that will be used to assess data quality for IGI&S data. Mandatory metrics that apply to all CIP feature types or all real property feature types are identified in the DCS General Guidance document. Each feature type-specific DCS includes additional mandatory metrics for that feature type.

DCS Validation

The DISDI Program will develop a rule expression language for each CIP layer to enforce compliance with DCS requirements. Components will be able to validate their data against the DCS requirements prior to a CIP submission. Components may also develop validation tools to assist with adherence to quality management plans and Component DCSs.

6 SDSFIE-V Quality Metrics

To evaluate IGI&S vector data quality, OSD and Components shall establish metrics for CIP, Component CIP, and ADS layers based on the processes established in section . The required metrics for each CIP layer may vary depending on collection requirements, OSD and Component policy, or data use. For this reason, the DCS for each CIP layer will describe the required metrics. The scope for layer-specific metrics will be featuretype. When metrics are established that measure the CIP or ADS as a whole, the aggregate and repository scopes are used.

7 SDSFIE-V Data Quality Reporting: SDSFIE Metadata

All IGI&S vector data shall have current and complete metadata.
It is the responsibility of the organization creating or maintaining an IGI&S data layer to ensure that current and complete metadata exists for that layer. IGI&S vector data that is not part of the Common Installation Picture (CIP) shall adhere to the SDSFIE-M standard and the SDSFIE-M Implementation Guidance with no additional requirements. The scope metadata element (DQ_Scope) from SDSFIE-M shall be populated based on the scopes identified in section . Either the lineage element (LI_Lineage) or one or more data quality reporting elements (DQ_Element) from SDSFIE-M shall be populated.

CIP Metadata Requirements

IGI&S CIP data layers shall adhere to additional metadata requirements to accurately document data quality. For CIP layers, the DCS for each layer will define which data quality reporting elements (DQ_Element) are mandatory. For data quality reporting elements (DQ_Element) to be correctly populated, the DCS for each IGI&S layer mandated for collection shall include well-defined metrics, and corresponding evaluation methods, for the applicable data quality reporting elements. The metrics shall be mapped to the appropriate data quality element so that the quality of the data layer can be consistently reported in the metadata.

Data quality results for CIP layers shall be reported as both conformance results (DQ_ConformanceResult) and quantitative results (DQ_QuantitativeResult). A conformance result is determined by comparing a CIP layer against a DCS. The DCS and the explanation of the result must be cited in the metadata, along with an indication of whether or not the layer passed the evaluation. Quantitative results are the values obtained from applying the data quality measures defined in the DCS.
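The pairing of a quantitative result with a conformance result can be sketched as follows. The element names (DQ_QuantitativeResult, DQ_ConformanceResult) follow SDSFIE-M/ISO usage, but the dictionary layout and function name are illustrative only, not the SDSFIE-M encoding:

```python
def evaluate_metric(measured_pct, aql_max_pct, dcs_citation):
    """Derive the two result types required for CIP layers.

    The quantitative result carries the measured value itself; the
    conformance result compares it against the acceptable quality level
    defined in the cited DCS. Field names are an illustrative sketch.
    """
    passed = measured_pct <= aql_max_pct
    return {
        "DQ_QuantitativeResult": {"value": measured_pct, "valueUnit": "percent"},
        "DQ_ConformanceResult": {
            # The DCS used as the evaluation specification must be cited.
            "specification": dcs_citation,
            "explanation": f"measured {measured_pct}% against a maximum of {aql_max_pct}%",
            "pass": passed,
        },
    }

# Hypothetical evaluation: 1.5% error measured against a 2.0% maximum.
result = evaluate_metric(1.5, 2.0, "OSD DCS: Building, v1.0")
print(result["DQ_ConformanceResult"]["pass"])  # True
```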

Part 3: SDSFIE-R Data Quality Guidance

8 SDSFIE-R Introduction

SDSFIE-R is the IGI&S standard for raster data. This includes the draft SDSFIE-R document and its annex, the Raster Standards Compendium. SDSFIE-R data quality shall be evaluated for compliance with ADS requirements, OSD requirements, and Component quality management plans. Part 3 of this document describes the data quality assessment, guidance, and reporting requirements that are specific to raster data. This section will be completed after the SDSFIE-R document is written.

9 SDSFIE-R Data Quality Documentation: SDSFIE Metadata

All IGI&S raster data that is acquired or generated should have current and complete metadata. It is the responsibility of the organization creating the raster dataset to ensure that current and complete metadata exists for that dataset. Raster data that is created by an IGI&S Program shall adhere to the SDSFIE-M standard and implementation guidance with no additional requirements. DQ_Scope shall be populated based on the scopes identified in section . For raster data, the data quality scope will be dataset or tile, depending on the extent of the dataset. In accordance with SDSFIE-M, either the lineage element (LI_Lineage) or one or more data quality reporting elements (DQ_Element) shall be populated.

The following metrics for raster metadata will be evaluated in SDSFIE-Q:

- The percentage of raster datasets that have SDSFIE-M compliant metadata records

See Appendix B for the full list of IGG Data Quality Metrics.

Data quality results for raster data will be reported in either DQ_ConformanceResult or DQ_QuantitativeResult. Raster data quality can also be represented as a coverage result (QE_CoverageResult), which is the result of a data quality measure that organizes the measured values as a coverage.
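The minimum-population rule stated for both vector and raster metadata (DQ_Scope always present, plus either LI_Lineage or at least one DQ_Element whose report carries an allowed result type) can be sketched as a simple check. The dictionary layout here is illustrative, not the SDSFIE-M schema; only the element names come from the standard:

```python
def validate_dq_metadata(record: dict) -> list[str]:
    """Check the minimum SDSFIE-M data quality population rules (sketch)."""
    errors = []
    # The scope being evaluated (DQ_Scope) is always required.
    if not record.get("DQ_Scope"):
        errors.append("DQ_Scope is missing")
    # At least one of lineage or a data quality report must be present.
    lineage = record.get("LI_Lineage")
    reports = record.get("DQ_Element", [])
    if not lineage and not reports:
        errors.append("populate LI_Lineage or at least one DQ_Element")
    # Each DQ_Element report needs a conformance, quantitative, or coverage result.
    allowed = {"DQ_ConformanceResult", "DQ_QuantitativeResult", "QE_CoverageResult"}
    for i, report in enumerate(reports):
        if not allowed & set(report.get("results", {})):
            errors.append(f"DQ_Element[{i}] has no result of an allowed type")
    return errors

record = {
    "DQ_Scope": "dataset",
    "DQ_Element": [
        {"measure": "IGG3",
         "results": {"DQ_QuantitativeResult": {"value": 2.5, "unit": "percent"}}},
    ],
}
print(validate_dq_metadata(record))  # []
```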

Part 4: SDSFIE-S Data Quality Guidance

10 SDSFIE-S Introduction

SDSFIE-S is the IGI&S standard for data services. This section will be completed after the SDSFIE-S document is written.

11 SDSFIE-S Data Quality Documentation: SDSFIE Metadata

TBD

Appendix A: ISO Data Quality Standard Measures

The table below contains a list of standardized data quality measures and is a summarized version of Annex D in ISO . These measures should be referenced when establishing new data quality metrics.

Table 4: ISO Data Quality Standard Measures

| Element Name | ISO Identifier | Name | Basic Measure | Definition | Value Type |
|---|---|---|---|---|---|
| Commission | 1 | excess item | error indicator | indication that an item is incorrectly present in the data | Boolean |
| Commission | 2 | number of excess items | error count | number of items within the dataset or sample that should not have been present | Integer |
| Commission | 3 | rate of excess items | error rate | number of excess items in the dataset or sample in relation to the number of items that should have been present | Real |
| Commission | 4 | number of duplicate feature instances | error count | total number of exact duplications of feature instances within the data | Integer |
| Omission | 5 | missing item | error indicator | indicator that shows that a specific item is missing in the data | Boolean |
| Omission | 6 | number of missing items | error count | count of all items that should have been in the dataset or sample and are missing | Integer |
| Omission | 7 | rate of missing items | error rate | number of missing items in the dataset or sample in relation to the number of items that should have been present | Real |
| Conceptual consistency | | conceptual schema non-compliance | error indicator | indication that an item is not compliant to the rules of the relevant conceptual schema | Boolean |
| Conceptual consistency | | conceptual schema compliance | correctness indicator | indication that an item complies with the rules of the relevant conceptual schema | Boolean |
| Conceptual consistency | | number of items not compliant with the rules of the conceptual schema | error count | count of all items in the dataset that are not compliant with the rules of the conceptual schema | Integer |
| Conceptual consistency | | number of invalid overlaps of surfaces | error count | total number of erroneous overlaps within the data | Integer |
| Conceptual consistency | | non-compliance rate with respect to the rules of the conceptual schema | error rate | number of items in the dataset that are not compliant with the rules of the conceptual schema in relation to the total number of these items supposed to be in the dataset | Real |
| Conceptual consistency | | compliance rate with the rules of the conceptual schema | correct items rate | number of items in the dataset in compliance with the rules of the conceptual schema in relation to the total number of items | Real |
| Domain consistency | | value domain non-conformance | error indicator | indication of if an item is not in conformance with its value domain | Boolean |
| Domain consistency | | value domain conformance | correctness indicator | indication that an item is conforming to its value domain | Boolean |
| Domain consistency | | number of items not in conformance with their value domain | error count | count of all items in the dataset that are not in conformance with their value domain | Integer |
| Domain consistency | | value domain conformance rate | correct items rate | number of items in the dataset that are in conformance with their value domain in relation to the total number of items in the dataset | Real |
| Domain consistency | | value domain non-conformance rate | error rate | number of items in the dataset that are not in conformance with their value domain in relation to the total number of items | Real |
| Format consistency | | physical structure conflicts | error indicator | indication that items are stored in conflict with the physical structure of the dataset | Boolean |
| Format consistency | | number of physical structure conflicts | error count | count of all items in the dataset that are stored in conflict with the physical structure of the dataset | Integer |
| Format consistency | | physical structure conflict rate | error rate | number of items in the dataset that are stored in conflict with the physical structure of the dataset divided by the total number of items | Real |
| Topological consistency | | number of faulty point-curve connections | error count | number of faulty point-curve connections in the dataset | Integer |
| Topological consistency | | rate of faulty point-curve connections | error rate | number of faulty link node connections in relation to the number of supposed link node connections | Real |
| Topological consistency | | number of missing connections due to undershoots | error count | count of items in the dataset, within the parameter tolerance, that are mismatched due to undershoots | Integer |
| Topological consistency | | number of missing connections due to overshoots | error count | count of items in the dataset, within the parameter tolerance, that are mismatched due to overshoots | Integer |
| Topological consistency | | number of invalid slivers | error count | count of all items in the dataset that are invalid sliver surfaces | Integer |
| Topological consistency | | number of invalid self-intersect errors | error count | count of all items in the data that illegally intersect with themselves | Integer |
| Topological consistency | | number of invalid self-overlap errors | error count | count of all items in the data that illegally self-overlap | Integer |
| Absolute or external accuracy | 28 | mean value of positional uncertainties (1D, 2D and 3D) | | mean value of the positional uncertainties for a set of positions where the positional uncertainties are defined as the distance between a measured position and what is considered as the corresponding true position | |
| Absolute or external accuracy | 128 | bias of positions (1D, 2D and 3D) | | bias of the positions for a set of positions where the positional uncertainties are defined as the deviation between a measured position and what is considered as the corresponding true position | |
| Absolute or external accuracy | | mean value of positional uncertainties excluding outliers (2D) | | for a set of points where the distance does not exceed a defined threshold, the arithmetical average of distances between their measured positions and what is considered as the corresponding true positions | |
| Absolute or external accuracy | | number of positional uncertainties above a given threshold | error count | number of positional uncertainties above a given threshold for a set of positions; the errors are defined as the distance between a measured position and what is considered as the corresponding true position | Integer |
| Absolute or external accuracy | | rate of positional uncertainties above a given threshold | error rate | number of positional uncertainties above a given threshold for a set of positions in relation to the total number of measured positions | Real |
| Absolute or external accuracy | 32 | covariance matrix | | symmetrical square matrix with variances of point coordinates on the main diagonal and covariance between these coordinates as off-diagonal elements | |
| Absolute or external accuracy | 33 | linear error probable | LE50 or LE50(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 50 % | |
| Absolute or external accuracy | 34 | standard linear error | LE68.3 or LE68.3(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 68.3 % | |
| Absolute or external accuracy | | linear map accuracy at 90 % significance level | LE90 or LE90(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 90 % | |
| Absolute or external accuracy | | linear map accuracy at 95 % significance level | LE95 or LE95(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 95 % | |
| Absolute or external accuracy | | linear map accuracy at 99 % significance level | LE99 or LE99(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 99 % | |
| Absolute or external accuracy | | near certainty linear error | LE99.8 or LE99.8(r) | half length of the interval defined by an upper and a lower limit, in which the true value lies with probability 99.8 % | |
| Absolute or external accuracy | | root mean square error (RMSE) | | standard deviation, where the true value is not estimated from the observations but known a priori | |
| Absolute or external accuracy | | absolute linear error at 90 % significance level of biased vertical data (Alternative 1) | | absolute vertical accuracy of the data's coordinates, expressed in terms of linear error at 90 % probability given that a bias is present | |
| Absolute or external accuracy | | absolute linear error at 90 % significance level of biased vertical data (Alternative 2) | | absolute vertical accuracy of the data's coordinates, expressed in terms of linear error at 90 % probability given that a bias is present | |
| Absolute or external accuracy | | circular standard deviation | CE39.4 | radius describing a circle, in which the true point location lies with the probability of 39.4 % | |
| Absolute or external accuracy | 43 | circular error probable | CE50 | radius describing a circle, in which the true point location lies with the probability of 50 % | |
| Absolute or external accuracy | 44 | circular error at 90 % significance level | CE90 | radius describing a circle, in which the true point location lies with the probability of 90 % | |
| Absolute or external accuracy | | circular error at 95 % significance level | CE95 | radius describing a circle, in which the true point location lies with the probability of 95 % | |
| Absolute or external accuracy | | circular near certainty error | CE99.8 | radius describing a circle, in which the true point location lies with the probability of 99.8 % | |
| Absolute or external accuracy | | root mean square error of planimetry (RMSEP) | | radius of a circle around the given point, in which the true value lies with probability P | |
| Absolute or external accuracy | | absolute circular error at 90 % significance level of biased data (Alternative 2) | | absolute horizontal accuracy of the data's coordinates, expressed in terms of circular error at 90 % probability given that a bias is present | |
| Absolute or external accuracy | | absolute circular error at 90 % significance level of biased data | | absolute horizontal accuracy of the data's coordinates, expressed in terms of circular error at 90 % probability given that a bias is present | |
| Absolute or external accuracy | 50 | uncertainty ellipse | | 2D ellipse with the two main axes indicating the direction and magnitude of the highest and the lowest uncertainty of a 2D point | |
| Absolute or external accuracy | 51 | confidence ellipse | | 2D ellipse with the two main axes indicating the direction and magnitude of the highest and the lowest uncertainty of a 2D point | |
| Relative or internal accuracy | 52 | relative vertical error | | evaluation of the random errors of one relief feature to another in the same dataset or on the same map/chart; it is a function of the random errors in the two elevations with respect to a common vertical datum | |
| Relative or internal accuracy | 53 | relative horizontal error | | evaluation of the random errors in the horizontal position of one feature to another in the same dataset or on the same map/chart | |
| Accuracy of a time measurement | | time accuracy at 68.3 % significance level | LE68.3 or LE68.3(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 68.3 % | |
| Accuracy of a time measurement | | time accuracy at 50 % significance level | LE50 or LE50(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 50 % | |
| Accuracy of a time measurement | | time accuracy at 90 % significance level | LE90 or LE90(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 90 % | |
| Accuracy of a time measurement | | time accuracy at 95 % significance level | LE95 or LE95(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 95 % | |
| Accuracy of a time measurement | | time accuracy at 99 % significance level | LE99 or LE99(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 99 % | |
| Accuracy of a time measurement | | time accuracy at 99.8 % significance level | LE99.8 or LE99.8(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the time instance lies with probability 99.8 % | |
| Temporal consistency | 159 | chronological error | error indicator | indication that an event is incorrectly ordered against the other events | Boolean |
| Classification correctness | 60 | number of incorrectly classified features | error count | number of incorrectly classified features | Integer |
| Classification correctness | 61 | misclassification rate | error rate | number of incorrectly classified features relative to the number of features that should be there | Real |
| Classification correctness | 62 | misclassification matrix | | matrix that indicates the number of items of class (i) classified as class (j) | Integer |
| Classification correctness | 63 | relative misclassification matrix | | matrix that indicates the number of items of class (i) classified as class (j) divided by the number of items of class (i) | Real |
| Classification correctness | 64 | kappa coefficient | | coefficient to quantify the proportion of agreement of assignments to classes by removing misclassifications | Real |
| Non-quantitative attribute correctness | | number of incorrect attribute values | error count | total number of erroneous attribute values within the relevant part of the dataset | Integer |
| Non-quantitative attribute correctness | | rate of correct attribute values | correct items rate | number of correct attribute values in relation to the total number of attribute values | Real |
| Non-quantitative attribute correctness | | rate of incorrect attribute values | error rate | number of attribute values where incorrect values are assigned in relation to the total number of attribute values | Real |
| Quantitative attribute accuracy | | attribute value uncertainty at 68.3 % significance level | LE68.3 or LE68.3(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 68.3 % | |
| Quantitative attribute accuracy | | attribute value uncertainty at 50 % significance level | LE50 or LE50(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 50 % | |
| Quantitative attribute accuracy | | attribute value uncertainty at 90 % significance level | LE90 or LE90(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 90 % | |
| Quantitative attribute accuracy | | attribute value uncertainty at 95 % significance level | LE95 or LE95(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 95 % | |
| Quantitative attribute accuracy | | attribute value uncertainty at 99 % significance level | LE99 or LE99(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 99 % | |
| Quantitative attribute accuracy | | attribute value uncertainty at 99.8 % significance level | LE99.8 or LE99.8(r) | half length of the interval defined by an upper and a lower limit, in which the true value for the quantitative attribute lies with probability 99.8 % | |
| Usability element | 101 | data product specification passed | correctness indicator | indication that all requirements in the referred data product specification are fulfilled | Boolean |
| Usability element | 102 | data product specification fail count | error count | number of data product specification requirements that are not fulfilled by the current product/dataset | Integer |
| Usability element | 103 | data product specification pass count | correct items count | number of the data product specification requirements that are fulfilled by the current product/dataset | Integer |
| Usability element | 104 | data product specification fail rate | error rate | number of the data product specification requirements that are not fulfilled by the current product/dataset in relation to the total number of data product specification requirements | Real |
| Usability element | 105 | data product specification pass rate | correct items rate | number of the data product specification requirements that are fulfilled by the current product/dataset in relation to the total number of data product specification requirements | Real |

Appendix B: IGG Data Quality Metrics

The table below contains the metrics established for SDSFIE-Q. This is only a list of higher-level metrics and does not contain metrics specific to individual feature types (with the exception of discoverability metrics). The DCS General Guidance and the DCS for a data layer will contain the feature type-specific metrics. Each metric in the table below contains descriptive information that is necessary to complete a data quality evaluation. The concepts are described in detail in the SDSFIE-Q document. A brief description, and a reference to the corresponding section in SDSFIE-Q (if applicable), are indicated below:

1. Category: The section of the SDSFIE-Q document that corresponds to the metric.
2. SDSFIE Part: The SDSFIE Part to which the metric applies (vector, raster, or services).
3. Reporting Level: The reporting scope of the metric (i.e. who is responsible for the data quality evaluation). The reporting level for each metric listed below is Component, OSD, or OSD/Component.
4. Data Quality Scope: The scope code that will be used when reporting the data quality evaluation results in metadata (for a list of IGI&S data quality scopes, see section ).
5. Element: The data quality metadata element that is being evaluated (see section ).
6. Data Quality Unit: The combination of a Data Quality Scope and a metadata element (see section 2.1.1).
7. ISO ID: A code indicating the ISO standard measure from which the metric was derived. Appendix A includes a table with all of the ISO standard measures (see Appendix A).
8. SDSFIE-Q Metric ID: A code assigned to each metric established for SDSFIE-Q. This identifier correlates to the scope, ISO standard measure (where applicable), evaluation method, and acceptable quality level for each metric. When reporting a data quality result for a metric in the metadata, the SDSFIE-Q metric ID must be recorded in MD_Identifier:code (see section ).
9. Metric Description: The description of what the metric is evaluating (see section 3.4.1).
10. Recommended Evaluation Method: The suggested method for evaluating each metric (see section 3.4.2).
11. Acceptable Quality Level: The minimum or maximum required result to pass each metric.

Metrics highlighted in blue indicate metaquality metrics, which are an aggregation of the results of other metrics (as indicated).
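Each registry row combines the descriptors above into one record. A sketch of such a record as a small Python class follows; the class itself, its field names, and the example threshold are illustrative, not part of the standard (the real acceptable quality levels are defined in the registry and were not reproduced here):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QualityMetric:
    """One row of the SDSFIE-Q metrics registry (descriptors 1-11 above)."""
    category: str
    sdsfie_part: str
    reporting_level: str
    dq_scope: str
    element: str
    iso_id: Optional[int]
    metric_id: str          # recorded in MD_Identifier:code when reporting
    description: str
    evaluation_method: str
    aql_max_percent: Optional[float] = None  # None for Pass/Fail metrics

    def passes(self, measured_percent: float) -> bool:
        # A metric with a maximum acceptable quality level passes when the
        # measured error percentage does not exceed that maximum.
        if self.aql_max_percent is None:
            raise ValueError("metric has no quantitative acceptable quality level")
        return measured_percent <= self.aql_max_percent

igg3 = QualityMetric(
    category="DCS", sdsfie_part="V", reporting_level="OSD",
    dq_scope="aggregate", element="Omission", iso_id=7, metric_id="IGG3",
    description="Percentage of data layers required for the CIP that do not have a DCS",
    evaluation_method="Direct External: compare CIP feature types to published OSD DCSs",
    aql_max_percent=5.0,  # hypothetical threshold for illustration only
)
print(igg3.passes(2.0))  # True
```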

Table 5: IGG Data Quality Metrics
(Columns: SDSFIE-Q Metric ID | Category | SDSFIE Part | Reporting Level | Data Quality Scope | Data Quality Element | ISO ID)

IGG1 | DCS | V | Component | aggregate | Omission | 7
  Metric Description: The percentage of data layers that are required for collection by a Component HQ that do not have a DCS.
  Recommended Evaluation Method: Direct External: Comparison between the feature types in the Component SDSFIE-V Adaptation and the published Component DCSs.
  Acceptable Quality Level: (maximum)

IGG2 | DCS | V | Component | aggregate | Omission | 7
  Metric Description: The percentage of data layers required for the Component CIP that do not have a DCS.
  Recommended Evaluation Method: Direct External: Comparison between the feature types in the Component CIP and the published Component DCSs.
  Acceptable Quality Level: (maximum)

IGG3 | DCS | V | OSD | aggregate | Omission | 7
  Metric Description: The percentage of data layers required for the CIP that do not have a DCS.
  Recommended Evaluation Method: Direct External: Comparison between the feature types in the CIP and the published OSD DCSs.
  Acceptable Quality Level: (maximum)

IGG4 | DCS | V | OSD | aggregate | Omission | 7
  Metric Description: The percentage of data layers in SDSFIE-V Gold that do not have a DCS.
  Recommended Evaluation Method: Direct External: Comparison between the feature types in SDSFIE-V Gold and the published OSD DCSs.
  Acceptable Quality Level: (maximum)

IGG5 | Raster | R | Component | dataset | Conceptual Consistency | 9
  Metric Description: The raster dataset has an SDSFIE-M compliant metadata record.
  Recommended Evaluation Method: Direct Internal: Check of SDSFIE-M mandatory elements.
  Acceptable Quality Level: (maximum)

IGG6 | CIP | V | OSD/Component | aggregate | Conceptual Consistency | 12
  Metric Description: The percentage of CIP data layers that do not have SDSFIE-M compliant metadata records.
  Recommended Evaluation Method: Direct Internal: Check of SDSFIE-M mandatory elements for each submitted CIP layer.
  Acceptable Quality Level: (maximum)

IGG7 | CIP | V | OSD/Component | aggregate | Omission | 7
  Metric Description: The percentage of required CIP data layers that do not have metadata with at least one data quality reporting element populated.
  Recommended Evaluation Method: Direct Internal: Check of SDSFIE-M mandatory elements for each submitted CIP layer.
  Acceptable Quality Level: (maximum)

IGG8 | CIP | V | OSD/Component | aggregate | Omission | 7
  Metric Description: The percentage of reported Real Property assets for the CIP that do not have a geospatial representation.
  Recommended Evaluation Method: Aggregation of the DCS8 result for each CIP Real Property layer.
  Acceptable Quality Level: (maximum)

IGG9 | CIP | V | OSD/Component | aggregate | Omission | 7
  Metric Description: The percentage of required data layers not included in the CIP.
  Recommended Evaluation Method: Direct External: Comparison of the CIP submittal against the list of required CIP layers.
  Acceptable Quality Level: (maximum)
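Several of the IGG metrics above are omission percentages computed by a Direct External comparison of two lists. A minimal sketch follows; the layer names are hypothetical, and the set-based comparison is an assumption about how the evaluation could be automated:

```python
def omission_percentage(required, present):
    """Completeness: Omission as a percentage: the share of required
    items that are missing from the evaluated set."""
    required = set(required)
    if not required:
        return 0.0
    missing = required - set(present)
    return 100.0 * len(missing) / len(required)

# Hypothetical layer names, not taken from SDSFIE-V Gold.
required_cip_layers = {"Building", "Fence", "LandParcel", "Road"}
layers_with_dcs = {"Building", "LandParcel", "Road"}
print(omission_percentage(required_cip_layers, layers_with_dcs))  # 25.0
```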

Table 5: IGG Data Quality Metrics (continued)
(Columns: SDSFIE-Q Metric ID | Category | SDSFIE Part | Reporting Level | Data Quality Scope | Data Quality Element | ISO ID)

IGG10 | CIP | V | OSD/Component | aggregate | Conceptual Consistency | 12
  Metric Description: The percentage of CIP data layers that are not compliant with an approved SDSFIE-V Adaptation.
  Recommended Evaluation Method: Direct Internal: Check to confirm that the CIP submittal has the correct schema.
  Acceptable Quality Level: (maximum)

IGG11 | CIP | V | OSD/Component | aggregate
  Metric Description: Which CIP layers have the most data quality errors?
  Recommended Evaluation Method: Aggregation: Comparison of the results for metric DCS51 for each CIP layer.

IGG12 | CIP | V | OSD/Component | aggregate
  Metric Description: Which CIP layers have the least data quality errors?
  Recommended Evaluation Method: Aggregation: Comparison of the results for metric DCS51 for each CIP layer.

IGG13 | Discoverability | V | Component | aggregate | Omission | 7
  Metric Description: The percentage of CIP data layers not visible via a map viewer.
  Recommended Evaluation Method: Direct External: Comparison between the layers visible on a map viewer and the list of required CIP layers.
  Acceptable Quality Level: (maximum)

IGG14 | Discoverability | V, S | OSD/Component | aggregate | Omission | 7
  Metric Description: The percentage of CIP data layers not accessible through a web map service.
  Recommended Evaluation Method: Direct External: Comparison between the layers accessible via a web map service and the list of required CIP layers.
  Acceptable Quality Level: (maximum)

IGG15 | Discoverability | V | OSD/Component | aggregate | Omission | 7
  Metric Description: The percentage of data layers not available to fulfill a validated request.
  Recommended Evaluation Method: Direct External: Comparison between the layers available by request and the feature types included in SDSFIE-V Gold or an approved Adaptation.
  Acceptable Quality Level: (maximum)

IGG16 | Discoverability | V | OSD/Component | aggregate
  Metric Description: Which CIP data layers are most widely distributed, requested and/or viewed?
  Recommended Evaluation Method: Aggregation: Comparison of the sum of metrics IGG20, IGG21 and IGG22 for each required CIP layer.

IGG17 | Discoverability | V, R, S | OSD | repository | Omission | 5
  Metric Description: The Component does not have an enterprise level map viewer.
  Recommended Evaluation Method: Direct External: Pass if the Component has an enterprise map viewer.
  Acceptable Quality Level: Pass

IGG18 | Discoverability | V, R, S | OSD | repository | Omission | 5
  Metric Description: The Component does not have a process and mechanisms for data requests and data delivery.
  Recommended Evaluation Method: Direct External: Check for availability and accessibility of a data request process.
  Acceptable Quality Level: Pass

IGG19 | Discoverability | V, R, S | OSD | repository | Omission | 5
  Metric Description: The Component does not have a user feedback mechanism (feedback, customer surveys, etc.) for all data discovery methods.
  Recommended Evaluation Method: Direct External: Pass if the Component has an established feedback mechanism for data, raster and services.
  Acceptable Quality Level: Pass

IGG20 | Discoverability | V, S | OSD/Component | feature type
  Metric Description: The number of data requests for each CIP layer within a one year period.
  Recommended Evaluation Method: See section .

Table 5: IGG Data Quality Metrics (continued)
(Columns: SDSFIE-Q Metric ID | Category | SDSFIE Part | Reporting Level | Data Quality Scope | Data Quality Element | ISO ID)

IGG21 | Discoverability | V, S | OSD/Component | feature type
  Metric Description: The number of users viewing each CIP layer annually.
  Recommended Evaluation Method: See section .

IGG22 | Discoverability | V, S | OSD/Component | feature type
  Metric Description: The number of users accessing or downloading each CIP layer within a one year period.
  Recommended Evaluation Method: See section .

IGG23 | ADS | V | OSD | repository | Omission | 7
  Metric Description: The percentage of geospatial features representing Real Property assets in an installation ADS that have not been reconciled by the Component against the RPAD.
  Recommended Evaluation Method: Direct External: Comparison of geospatial features against the assets reported to the RPAD.
  Acceptable Quality Level: (maximum)

IGG24 | ADS | V, R, S | OSD | repository | Omission | 5
  Metric Description: The Component does not have an established ADS for vector, raster and services.
  Recommended Evaluation Method: Direct External: Pass if the Component has established an ADS for each installation.
  Acceptable Quality Level: Pass

IGG25 | ADS | V, R, S | Component | repository | Usability | 101
  Metric Description: The Component ADS has passed validation checks specified in the ADS Implementation Guidance.
  Recommended Evaluation Method: Direct Internal: Refer to the ADS Implementation Guidance for ADS validation requirements.
  Acceptable Quality Level: Pass

IGG26 | ADS | V | Component | repository | Omission | 7
  Metric Description: The percentage of required data layers not included in the Component ADS.
  Recommended Evaluation Method: Direct External: Comparison of the data layers in the Component ADS in relation to the data layers required by Component guidance.
  Acceptable Quality Level: (maximum)

IGG27 | ADS | R | Component | repository | Conceptual Consistency | 12
  Metric Description: The percentage of installations not covered by imagery with a raster resolution compliant with SDSFIE-R standards.
  Recommended Evaluation Method: Direct External: Comparison of imagery footprints against DoD site boundaries; check to ensure raster resolution meets minimum requirements specified in SDSFIE-R.
  Acceptable Quality Level: (maximum)

IGG28 | ADS | R | Component | repository | Conceptual Consistency | 12
  Metric Description: The percentage of installations not covered by imagery with a raster temporal accuracy compliant with SDSFIE-R standards.
  Recommended Evaluation Method: Direct External: Comparison of imagery footprints against DoD site boundaries; check to ensure temporal accuracy meets minimum requirements specified in SDSFIE-R.
  Acceptable Quality Level: (maximum)
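Metaquality metrics such as IGG8 aggregate per-layer results into a Component-level value. SDSFIE-Q does not spell out the aggregation formula in this table, so the unweighted mean used below is purely an illustrative assumption:

```python
def aggregate_omission(per_layer_results):
    """Roll per-layer percentage results up to a Component aggregate.

    An unweighted mean of the per-layer values is assumed here for
    illustration (e.g. IGG8 aggregating the DCS8 result of each CIP
    Real Property layer); the actual aggregation rule is set by the
    metrics registry, not by this sketch.
    """
    if not per_layer_results:
        return 0.0
    return sum(per_layer_results.values()) / len(per_layer_results)

# Hypothetical DCS8 results (percent of reported assets with no
# geospatial representation) for three CIP layers.
dcs8 = {"Building": 2.0, "Fence": 4.0, "Road": 0.0}
print(aggregate_omission(dcs8))  # 2.0
```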

Appendix C: Metadata Examples

The following metadata examples display the DQ_DataQuality elements in SDSFIE-M for both an aggregate (e.g., CIP) and a repository (e.g., ADS) data quality scope. Refer to the DCS General Guidance (Annex A) for metadata examples reported at the feature type scope. In the examples below, red text indicates that the element is mandatory in SDSFIE-M.

Table 6: Metadata Example - Aggregate Scope
(Layout: Metadata Element = Example (Comment))

DQ_DataQuality
  scope: DQ_Scope
    level: MD_ScopeCode = aggregate
    levelDescription = notApplicable  (Comment: Optional. Will be added to SDSFIE-M (v. 2.0))
  standaloneQualityReport: DQ_StandaloneQualityReportInformation
    reportReference: CI_Citation
      title: CharacterString = Common Installation Picture (CIP) Data Quality Report
      date: CI_Date
        date: Date = 13-May-16
        dateType: CI_DateTypeCode = Creation
    abstract: CharacterString = Report containing detailed descriptions of procedures used to evaluate the CIP dataset and descriptions of the quality results in a report format.
  report: DQ_Element  (Comment: Conditional: "report" or "lineage" role is mandatory. In this example, report was chosen, so DQ_Element is now mandatory.)
    nameOfMeasure: CharacterString = Logical Consistency: Conceptual Consistency  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    measureIdentification: MD_Identifier  (Comment: I&E community preferred method to identify the measure.)
      authority
        title: CharacterString = IGI&S Metrics Registry
        date: CI_Date
          date: Date = 5/13/2016
          dateType: CI_DateTypeCode = Creation
        citedResponsibleParty: CI_ResponsibleParty = DISDI Program
      code: CharacterString = IGG6  (Comment: Currently optional: This code must correspond to the SDSFIE-Q Metric ID identified in the metrics registry.)

    measureDescription: CharacterString = The percentage of required CIP data layers that do not have SDSFIE-M compliant metadata records  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodType: DQ_EvaluationMethodTypeCode  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodDescription: CharacterString
    result: DQ_Result = Quantitative Result  (Comment: Optional, but may be required by a DCS)
      value: Record
      valueUnit: UnitOfMeasure = None
  report: DQ_Element  (Comment: Conditional: "report" or "lineage" role is mandatory. In this example, report was chosen, so DQ_Element is now mandatory.)
    nameOfMeasure: CharacterString = Completeness: Omission  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    measureIdentification: MD_Identifier  (Comment: I&E community preferred method to identify the measure.)
      authority
        title: CharacterString = IGI&S Metrics Registry
        date: CI_Date
          date: Date = 5/13/2016
          dateType: CI_DateTypeCode = Creation
        citedResponsibleParty: CI_ResponsibleParty = DISDI Program
      code: CharacterString = IGG7  (Comment: Currently optional: This code must correspond to the SDSFIE-Q Metric ID identified in the metrics registry.)
    measureDescription: CharacterString = The percentage of required CIP data layers that do not have metadata with at least one data quality reporting element populated  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodType: DQ_EvaluationMethodTypeCode  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodDescription: CharacterString
    result: DQ_Result = Quantitative Result  (Comment: Optional, but may be required by a DCS)
      value: Record = 8%
      valueUnit: UnitOfMeasure = None

Table 7: Metadata Example: Repository Scope
(Layout: Metadata Element = Example (Comment))

DQ_DataQuality
  scope: DQ_Scope
    level: MD_ScopeCode = repository
    levelDescription = notApplicable  (Comment: Optional. Will be added to SDSFIE-M (v. 2.0))
  standaloneQualityReport: DQ_StandaloneQualityReportInformation
    reportReference: CI_Citation
      title: CharacterString
      date: CI_Date
        date: Date
        dateType: CI_DateTypeCode
    abstract: CharacterString
  report: DQ_Element  (Comment: Conditional: "report" or "lineage" role is mandatory. In this example, report was chosen, so DQ_Element is now mandatory.)
    nameOfMeasure: CharacterString = Completeness: Omission  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    measureIdentification: MD_Identifier  (Comment: I&E community preferred method to identify the measure.)
      authority
        title: CharacterString = IGI&S Metrics Registry
        date: CI_Date
          date: Date = 5/13/2016
          dateType: CI_DateTypeCode = Creation
        citedResponsibleParty: CI_ResponsibleParty = DISDI Program
      code: CharacterString = IGG24  (Comment: Currently optional: This code must correspond to the SDSFIE-Q Metric ID identified in the metrics registry.)
    measureDescription: CharacterString = The Component does not have an established ADS for vector, raster and services  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodType: DQ_EvaluationMethodTypeCode  (Comment: SDSFIE-M currently says use should be avoided. This element may be used until the metrics registry is finalized.)
    evaluationMethodDescription: CharacterString
    result: DQ_Result = Quantitative Result  (Comment: Optional, but may be required by a DCS)
      value: Record = FALSE
      valueUnit: UnitOfMeasure = Boolean

SDSFIE-Q Annex A: Data Content Specification (DCS) General Guidance

12 Dec 2016

Prepared By: The Installation Geospatial Information and Services Governance Group (IGG)

For: The Assistant Secretary of Defense (Energy, Installations & Environment)

2016

Revision History

Version: DRAFT | Date: 6 May 2016
  Initial Draft

Version: FINAL DRAFT (DQWG) | Date: 22 Jun 2016
  Incorporated changes from 8 Jun comment adjudication session:
  - Section 3.1.1: Added section for populating the sdsid
  - Section 3.2: Updated real property inventory system to APSR
  - Table 3: Changed all acceptable quality levels to 9 or
  - Split DCS1 and DCS2 into two metrics each (DCS3 and DCS4 were added)
  - Added metrics for numbers and percentages where both were not previously included
  - Renumbered SDSFIE-Q Metric IDs
  - Section 4.3: Added description of the SDSFIE-Q Metric Identifier
  - Added language clarifying which metrics would be used to report conformance to a DCS
  - Section 5: Removed entire section

Version: FINAL DRAFT (IGG) | Date: 16 Aug 2016
  - Section 2.1: Add language indicating that Components may but are not required to follow the same format as OSD DCSs.

Table of Contents

1 Introduction
  1.1 Purpose
  1.2 Authority
  1.3 Document Maintenance
  1.4 References
  1.5 Acronyms
2 DCS Content
  2.1 DCS Format
  2.2 DCS Data Collection Requirements
3 General Collection Requirements
  3.1 SDSFIE-V Gold Foundational Attributes
    3.1.1 Populating sdsid
  3.2 Real Property
  3.3 Source Selection
4 Data Quality Evaluation
  4.1 Feature Type Metrics
  4.2 DCS Conformance Metrics
  4.3 Data Quality Metadata: Reporting Results
Appendix A: Metadata Example

Table of Tables

Table 1: SDSFIE Gold Attributes
Table 2: Common Sources for IGI&S Data
Table 3: IGI&S Feature Type Metrics
Table 4: IGI&S DCS Conformance Metrics

1 Introduction

The Spatial Data Standards for Facilities, Infrastructure, and Environment Quality standard (SDSFIE-Q) requires that data content specifications (DCS) be created for all IGI&S vector data. As defined in ISO Geographic Information Geomatics, a DCS is a detailed description of a dataset or data layer, together with additional information that will enable it to be created, supplied to, and used by other organizations. The DCS is a technical document that provides essential information for data collection and metadata population. A DCS describes the ideal dataset, or how a dataset should be. In contrast, the actual state of the dataset, and how closely it conforms to the DCS, are described in a dataset's associated metadata.

The DCS General Guidance provides a framework of concepts that apply to all feature types in the SDSFIE Vector Standard (SDSFIE-V), so common elements do not need to be repeated in each feature-type-specific DCS. Therefore, the DCS General Guidance must be used in conjunction with each DCS. In combination, the DCS General Guidance and the DCSs collectively define minimum acceptable quality levels for IGI&S data. Components should use these documents to assist in developing Component-level guidance. The DISDI Program will initially create DCSs for each feature type included in the Common Installation Picture (CIP), and eventually for every feature type in SDSFIE-V Gold.

1.1 Purpose

This general guidance document provides information for developing DCSs, including enterprise-level guidance for data collection, source selection, and data quality evaluation procedures applicable to all feature types in SDSFIE-V Gold and registered Adaptations thereof. This document also provides guidance for reporting data quality evaluation results in SDSFIE-M compliant metadata. The individual DCS produced for each feature type will provide additional guidance that is specific to the feature type. Refer to Component-level guidance for any additional Component-specific requirements.

1.2 Authority

In accordance with DoDI , this guidance applies to installation geospatial information and services (IGI&S). IGI&S is applicable to the management of DoD installations and environment to support military readiness in the Active, Guard, and Reserve Components with regard to facility construction, sustainment, and modernization, including the operation and sustainment of military test and training ranges, as well as the US Army Corps of Engineers Civil Works community. Its applicability for other interested organizations is suggested but not mandatory. This guidance is derived from the SDSFIE-Q standard, which establishes a framework for evaluating the quality of IGI&S data and defines the requirement for DCSs at the OSD and Component levels.

1.3 Document Maintenance

This document will be reviewed and updated as needed, with each action documented in a revision history log. When changes occur, the version number will be updated to the next increment, and the date, the owner making the change, and a change description will be recorded in the revision history log of the document.

1.4 References

ASD(EI&E), Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Quality (SDSFIE-Q): DRAFT
DoD Instruction , Real Property Inventory (RPI) and Forecasting, 17 Jan 2014
DoD Instruction , Installation Geospatial Information and Services (IGI&S), 9 Apr 2015

DUSD(I&E), Spatial Data Standards for Facilities, Infrastructure, and Environment (SDSFIE) Metadata (SDSFIE-M): Conceptual Schema, Version 1.0.2, 28 August 2014
ISO Reference Geographic Information Geomatics
ISO Reference Geographic Information Data quality
Real Property Inventory Management (RPIM), v. 8.1

1.5 Acronyms

ADS - Authoritative Data Source
APSR - Accountable Property System of Record
CADD - Computer Automated Drafting and Design
CIP - Common Installation Picture
DCS - Data Content Specification
DISDI - Defense Installations Spatial Data Infrastructure
DoD - Department of Defense
DoDI - Department of Defense Instruction
EI&E - Energy, Installations, and Environment
FOIA - Freedom of Information Act
GCS - Geographic Coordinate System
GIO - Geospatial Information Officer
GPS - Global Positioning System
GUID - Globally Unique Identifier
HQ - Headquarters
IGG - IGI&S Governance Group
IGI&S - Installation Geospatial Information and Services
ISO - International Organization for Standardization
MIM - Military Installation Map
NGA - National Geospatial-Intelligence Agency
NMF - National System for Geospatial Intelligence Metadata Foundation
OASD - Office of the Assistant Secretary of Defense
OSD - Office of the Secretary of Defense
PKI - Public Key Infrastructure
RPAD - Real Property Asset Database
RPI - Real Property Inventory
RPIM - Real Property Information Model
RPUID - Real Property Unique Identifier
SDSFIE - Spatial Data Standards for Facilities, Infrastructure and Environment
SDSFIE-M - Spatial Data Standards for Facilities, Infrastructure and Environment - Metadata
SDSFIE-V - Spatial Data Standards for Facilities, Infrastructure and Environment - Vector
SDSFIE-Q - Spatial Data Standards for Facilities, Infrastructure and Environment - Quality
SOP - Standard Operating Procedure
UUID - Universally Unique Identifier

2 DCS Content

2.1 DCS Format

OSD level DCSs will include the following key sections and content:

1. Introduction: The introduction will provide the overview of the layer (or feature type). This includes the name, definition, description, and note established in SDSFIE-V, along with other information to characterize the layer. The introduction will include the scope of the DCS, along with any policies, regulations, or other requirements for collecting and maintaining the layer. A detailed source citation of DoD, Federal, or other policy must be outlined for each feature type.

2. Data Collection Requirements: This section contains the description and rules to collect and attribute the layer and metadata, and will include sub-sections describing the applicability of the ISO Data Quality categories (completeness, logical consistency, positional accuracy, thematic accuracy, and temporal accuracy). This section will also describe the source selection criteria if it differs from the criteria in the DCS General Guidance document.

3. Data Quality Evaluation and Reporting: The data quality evaluation and reporting section shall identify required metrics and evaluation methods, including existing toolsets, for the data layer, and provide guidance for populating the SDSFIE-M data quality metadata elements. This section will also describe the acceptable quality levels related to each metric for DCS conformance.

Component level DCSs must include the content specified above. Component level DCSs may, but are not required to, follow the same format as the OSD level DCSs.

2.2 DCS Data Collection Requirements

Each DCS will outline the information required to maintain the data layer as per IGG guidelines and in a standard format. The ISO categories, which are defined in SDSFIE-Q Section , must be described for each layer. This guidance and the DCS for each feature type will define metrics that relate to these data quality categories. Examples are provided for reference:

Completeness is defined as the presence or absence of features, their attributes, and relationships. Component datasets must include all required features as determined by the installation or in a known database of record, and must contain all SDSFIE-V mandatory elements. For example, the SDSFIE-V feature type Fence should include all features reported as Real Property Assets and must include the mandatory attributes RPUID, RPSUID, and Fence Use Type.

Logical Consistency is the adherence to the ruleset for the data structure, attribution, and relationships, including conceptual consistency, domain consistency, format consistency, and topological consistency. This element requires that the data layer follow the rules established in SDSFIE-V. For example, all features in the SDSFIE feature type LandParcel should be complete, connected polygons, void of dangling lines, and should maintain consistent topological relationships with all other LandParcel features without overlapping polygons.

Positional Accuracy is the accuracy of the position of features within a designated spatial reference system, including absolute and relative accuracy. For example, when using the SDSFIE feature type Building, all features should maintain positional accuracy in relation to other features in the Building feature type.

Temporal Quality describes the quality of the temporal aspect of the data layer, including accuracy of a time measurement, temporal consistency, and temporal validity. For example, the SDSFIE feature type Installation should reflect the most current conditions, whether by survey, through a means of source upload, or other resources as described by the Real Property Inventory Requirements (RPIR).

Thematic Accuracy is the accuracy or correctness of attributes. This element requires that data layers contain accurate attributes and adhere to the defined SDSFIE classification for the feature type. For example, a Bridge feature that includes attribute values for National Bridge Inventory Bridge Design Type must maintain the accuracy of the classification of the bridge and the values for both verticalclearanceclosed and verticalclearanceopen for navigation in the channel when the bridge is closed and open.

Usability is used to determine overall adherence to the DCS requirements and will be defined and established in detail in each DCS.

3 General Collection Requirements

3.1 SDSFIE-V Gold Foundational Attributes

As defined in the SDSFIE-V 4.0 Implementation Guidance, all SDSFIE-V 4.0 feature types must contain the SDSFIE-V Gold foundational attributes. Components must provide a justification for profiling any non-mandatory attributes (as part of their Adaptation). Table 1 below contains a list of SDSFIE-V Gold foundational attributes. Each individual feature type may have additional Gold attributes. The DCS for each feature type may include Gold attributes in addition to those listed in Table 1, according to the feature type modeling. The field Attribute Guidance describes requirements for populating each attribute. With the exception of installationid and sdsid, Components may modify or provide their own attribute guidance for the SDSFIE-V foundational attributes in the Component DCS.

Attribute Name | Definition, Description and Note | Data Type | Nullable | Mandatory | Attribute Guidance
featuredescription | A narrative describing the feature | String(Max) | Yes | Yes |
featurename | The common name of the feature | String(80) | Yes | Yes | The official name of the feature found on installation maps or in authoritative databases of record.
installationid | The code assigned by the DoD Component used to identify the site or group of sites that make up an installation. | String(11) | Yes | | The installation ID should match the Installation Code reported to the RPI.
mediaid | Used to link the record to associated multimedia records that reference data | String(40) | Yes | Yes |
metadataid | Used to represent or link to feature level metadata. | String(80) | Yes | |
featureidpk | Primary Key. A unique, user-defined identifier for each record or instance of an entity. | String(40) | Yes | |
sdsid | A unique identifier for all entities in the SDSFIE | GUID | Yes | Yes | GUID. See section .

Table 1: SDSFIE Gold Attributes

3.1.1 Populating sdsid

The sdsid is an identifier that is intended to be practically unique and is used to identify a single entity record from the time of its creation (or first distribution), so that the entity record can be determined to refer to the same record in a future distribution of the dataset containing the record. The sdsid shall be populated as a globally unique identifier (GUID) value. The term GUID typically refers to various implementations of the universally unique identifier (UUID) standard (IETF-RFC-4122). This guidance specifically refers to the UUID variant 2 encoding, as that is the implementation used by Microsoft and is common among database systems.

A UUID is simply a 128-bit value. The meaning of each bit is defined by any of several variants. For human-readable display, many systems use a canonical format employing hexadecimal text with inserted hyphen characters, for example: 123e4567-e89b-12d3-a456-426614174000. The intent of UUIDs is to enable distributed systems to uniquely identify information without significant central coordination. In this context, the word unique should be taken to mean "practically unique" rather than "guaranteed unique".

Once an entity record is created, the sdsid value for the record can be assigned. The sdsid value must be assigned before distribution beyond the installation context (i.e., to Component HQ or the DISDI Program Office). Once an sdsid value is distributed, the value must not change. In practice, the sdsid shall be a version 4 UUID, meaning that it is a random UUID. Using the Python uuid package, this corresponds to the uuid.uuid4() method. The following example code for Esri-based Python will populate the sdsid column in every row where it is not already assigned:

import arcpy
import uuid

# Point the workspace at the geodatabase containing the feature class.
arcpy.env.workspace = r"C:\Geodata\MyGeodatabase.gdb"
fc = "MyFeatureClass"

rows = arcpy.UpdateCursor(fc)
for row in rows:
    if row.sdsid is None:
        # Assign a random (version 4) UUID in braced GUID format.
        row.sdsid = '{' + str(uuid.uuid4()) + '}'
        rows.updateRow(row)

3.2 Real Property

For all real property feature types, Components shall ensure that there is a match between features/records in the feature type and the Component accountable property system of record (APSR). The Real Property Unique Identifier (RPUID) and Real Property Site Unique Identifier (RPSUID) attributes must be populated so that IGI&S data can be linked to the data in the Real Property Asset (RPA) Database (RPAD), a requirement of DoDI . DCSs will identify any additional required real property attribution specific to the feature type, such as Real Property Network Identifier or RPA Interest Type.
3.3 Source Selection

Components shall evaluate potential data sources and establish criteria for which sources are most appropriate for a given layer. Component level DCSs should specify the selection criteria for the source used to create and maintain both the geographic representation of the feature and the attribution. Components should consider any data source requirements they have established for their Authoritative Data Source (ADS), if applicable to a given feature type DCS. Table 2 below describes common sources that may be used to locate existing geospatial data, collect new data, or populate attributes for existing data. Refer to the DCS for additional sources for each feature type. The source material used should be captured in the layer's metadata under LI_Lineage > LI_Source. All dataset processing steps should be captured in LI_ProcessStep. (All references to metadata elements in this document refer to elements of SDSFIE-M.)

Table 2: Common Sources for IGI&S Data

Existing Installation Geospatial Data
  Description: Existing geospatial data available within the Component installation or IGI&S Program.
  Example: Using an existing buildings dataset as a starting point for a dataset update or revision.
  Use: Geographic Representation and Attribution

External Geospatial Organization Data
  Description: Tabular and spatial data obtained from organizations outside of the Component. Examples include the National Geospatial-Intelligence Agency (NGA) and Military Installation Maps (MIMs).
  Example: Downloading Digital Aeronautical Flight Information File (DAFIF) data from NGA.
  Use: Geographic Representation and Attribution

Traditional Survey
  Description: A traditional survey is a field survey performed by a state licensed surveyor. The field survey will identify boundary features such as fences, rivers, lakes, roads, etc., and the presence or absence of any monuments that were found or set.
  Example: Using a geospatial file or recorded legal description to determine a boundary of a feature.
  Use: Geographic Representation
  Conditions/Requirements: Depending on the format of the survey result, the GIS analyst may need to export or georeference the data.

Computer Aided Drafting and Design (CADD)
  Description: CADD files include real property master plans, construction design or as-built drawings.
  Example: Using CADD files to determine the location of utilities on an installation.
  Use: Geographic Representation
  Conditions/Requirements: Must be georeferenced.

Planimetric Data
  Description: Planimetric data is created from surveying or photogrammetric processes. Planimetric data represent only the horizontal position of features on the Earth's surface, showing geographic objects as well as natural and cultural physical features and entities.
  Example: Using aerial photography to digitize road features.
  Use: Geographic Representation

GPS
  Description: Differentially corrected GPS field survey data.
  Example: Collecting GPS data to determine the boundary of a military range.
  Use: Geographic Representation and Attribution

Imagery
  Description: Imagery includes orthorectified aerial and satellite imagery, which will be used for manual or automated feature extraction.
  Example: Using satellite imagery to extract water features on an installation.
  Use: Geographic Representation
  Conditions/Requirements: The minimum resolution requirement is 1 meter. The heads-up digitizing extraction scale shall not exceed 1:1,200.

Hardcopy Documents
  Description: Existing installation hard-copy maps, blue-prints, master planning maps, etc.
  Example: Georeferencing a map from an installation master plan and digitizing locations of future projects.
  Use: Geographic Representation and Attribution
  Conditions/Requirements: Must be georeferenced.

Tabular Data
  Description: Attribute data available in tabular form that may be obtained from Component business systems.
  Example: Using the Component real property system to populate the interest type of an asset in the geospatial data.
  Use: Geographic Representation and Attribution
  Conditions/Requirements: Must include coordinates to be used for geographic representation.

Standard Operating Procedures (SOP)
  Description: A set of procedural instructions, which may include maps and spatial data for an installation.
  Example: Using the process in an SOP to create data or locating geospatial data included in the SOP.
  Use: Geographic Representation and Attribution
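Where a Component chooses to codify its source selection criteria, a simple ranked preference over the source types in Table 2 is one possible encoding. The ordering below is invented for illustration only; SDSFIE-Q leaves the actual criteria to Component-level DCSs:

```python
# Hypothetical source ranking; SDSFIE-Q does not prescribe one.
SOURCE_PREFERENCE = [
    "Traditional Survey",
    "GPS",
    "Imagery",
    "Planimetric Data",
    "Existing Installation Geospatial Data",
    "Hardcopy Documents",
]

def select_source(available):
    """Return the most-preferred available source type, or None.

    'available' is an iterable of source-type names applicable to a
    given layer; source types absent from the ranking are ignored by
    this sketch."""
    ranked = {name: i for i, name in enumerate(SOURCE_PREFERENCE)}
    candidates = [s for s in available if s in ranked]
    return min(candidates, key=ranked.get) if candidates else None

print(select_source(["Hardcopy Documents", "Imagery"]))  # Imagery
```

A real Component DCS would also weigh currency, accuracy, and ADS requirements rather than source type alone.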

4 Data Quality Evaluation

4.1 Feature Type Metrics

Table 3 provides a list of metrics, suggested evaluation methods, and acceptable quality levels that will be used to evaluate IGI&S vector data at the feature type scope. The metrics and evaluation results will be reported in the metadata using the data quality elements. The metrics in Table 3 are derived from the ISO standard measures provided in SDSFIE-Q Appendix A (see ISO Identifier column). Each DCS will identify which of these metrics are required for the specific feature type. Metrics that are mandatory for all CIP feature types or all CIP real property feature types are indicated in Table 3. Component-level DCSs should include all mandatory metrics established in the OSD-level DCS for that feature type.
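As an example of a Direct Internal evaluation of the duplicate-feature metrics in Table 3, the sketch below counts features whose geometries coincide within a search tolerance. A real evaluation would use a geoprocessing tool on actual geometries and attributes; reducing each feature to a snapped (x, y) point is an assumption made for illustration:

```python
from collections import Counter

def duplicate_feature_count(features, tolerance=0.5):
    """Count duplicated features: features whose geometries fall in
    the same tolerance cell are treated as duplicates of one another.
    'features' is assumed to be an iterable of (x, y) coordinate
    pairs standing in for real geometries."""
    def snap(coord):
        # Snap coordinates to a grid sized by the search tolerance.
        return (round(coord[0] / tolerance), round(coord[1] / tolerance))
    counts = Counter(snap(f) for f in features)
    # Each cell with n coincident features contributes n - 1 duplicates.
    return sum(n - 1 for n in counts.values() if n > 1)

pts = [(10.0, 20.0), (10.1, 20.1), (35.0, 40.0)]
print(duplicate_feature_count(pts))  # 1
```

The corresponding percentage metric would divide this count by the total number of features in the layer.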

Table 3: IGI&S Feature Type Metrics
(Columns: SDSFIE-Q Metric Identifier | Data Quality Element | ISO Identifier)

DCS1 | Completeness: Commission | 2
  Metric Description: The number of features included in the layer that are excess to the Real Property Inventory (i.e. features in the dataset that are not reported to the RPI).
  Suggested Evaluation Method: Direct External: Comparison against database of known features (i.e. APSR).
  Mandatory for CIP: Real Property feature types only

DCS2 | Completeness: Commission | 3
  Metric Description: The percentage of features included in the layer that are excess to the Real Property Inventory (i.e. features in the dataset that are not reported to the RPI).
  Suggested Evaluation Method: Direct External: Comparison against database of known features (i.e. APSR).
  Acceptable Quality Level: (maximum)
  Mandatory for CIP: Real Property feature types only

DCS3 | Completeness: Commission | 2
  Metric Description: The number of features included in the layer that are excess to real world features (i.e. features in the dataset that are not real world features).
  Suggested Evaluation Method: Direct External: Visual or automated inspection against imagery.

DCS4 | Completeness: Commission | 3
  Metric Description: The percentage of features included in the layer that are excess to real world features (i.e. features in the dataset that are not real world features).
  Suggested Evaluation Method: Direct External: Visual or automated inspection against imagery.
  Acceptable Quality Level: (maximum)

DCS5 | Completeness: Commission | 4
  Metric Description: The number of duplicated features in the layer (features that either have identical geometries or identical attribution within a specified search tolerance).
  Suggested Evaluation Method: Direct Internal: Geoprocessing tool to identify duplicate geometries or identical attributes.
  Mandatory for CIP: Yes

DCS6 | Completeness: Commission
  Metric Description: The percentage of duplicated features in the layer (features that either have identical geometries or identical attribution within a specified search tolerance).
  Suggested Evaluation Method: Direct Internal: Geoprocessing tool to identify duplicate geometries or identical attributes.
  Acceptable Quality Level: (maximum)
  Mandatory for CIP: Yes

DCS7 | Completeness: Omission | 6
  Metric Description: The number of reported Real Property assets for each data layer that do not have a geospatial representation.
  Suggested Evaluation Method: Direct External: Comparison against database of known features (i.e. APSR).
  Mandatory for CIP: Real Property feature types only

DCS8 | Completeness: Omission | 7
  Metric Description: The percentage of reported Real Property assets for each data layer that do not have a geospatial representation.
  Suggested Evaluation Method: Direct External: Comparison against database of known features (i.e. APSR).
  Acceptable Quality Level: (maximum)
  Mandatory for CIP: Real Property feature types only

DCS9 | Completeness: Omission | 6
  Metric Description: The number of actual Real Property assets for each data layer that do not have a geospatial representation.
  Suggested Evaluation Method: Direct External: Visual inspection against imagery; comparison against database of known features (i.e. APSR).

DCS10 | Completeness: Omission | 7
  Metric Description: The percentage of actual Real Property assets for each data layer that do not have a geospatial representation.
  Suggested Evaluation Method: Direct External: Visual inspection against imagery; comparison against database of known features (i.e. APSR).
  Acceptable Quality Level: (maximum)

DCS11 | Completeness: Omission | 6
  Metric Description: The number of Real Property assets for each data layer that do not have an RPUID.
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or tool).
  Mandatory for CIP: Real Property feature types only

DCS12 | Completeness: Omission | 7
  Metric Description: The percentage of Real Property assets for each data layer that do not have an RPUID.
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or tool).
  Acceptable Quality Level: (maximum)
  Mandatory for CIP: Real Property feature types only

DCS13 | Completeness: Omission | 6
  Metric Description: The number of unpopulated values in non-nullable attribute fields (NULL or blank values).
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or tool).

DCS14 | Completeness: Omission | 7
  Metric Description: The percentage of unpopulated values in non-nullable attribute fields (NULL or blank values).
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or tool).
  Acceptable Quality Level: (maximum)

DCS15 | Logical Consistency: Domain Consistency | 16
  Metric Description: The number of features included in the layer that have values in operationalstatus that are not conformant with the domain values in a registered Component Adaptation.
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or tool).

DCS16 | Logical Consistency: Domain Consistency | 18
  Metric Description: The percentage of features included in the layer that have values in operationalstatus that are not conformant with the domain values in a registered Component Adaptation.

DCS17 | Logical Consistency: Domain Consistency | 16
  Metric Description: The number of features included in the layer that have values in rpinterest that are not conformant with the domain values in a registered Component Adaptation.

DCS18 | Logical Consistency: Domain Consistency | 18
  Metric Description: The percentage of features included in the layer that have values in rpinterest that are not conformant with the domain values in a registered Component Adaptation.
  Suggested Evaluation Method: Direct Internal: Analysis of attributes (manual or
62 12 Dec 2016 SDSFIE-Q Annex A: DCS General Guidance Data Quality Element Completeness: Omission Completeness: Omission Completeness: Omission ISO Identifier SDSFIE-Q Metric Identifier 6 DCS11 7 DCS12 6 DCS13 DCS General Guidance Metric Description The number of Real Property assets for each data layer that do not have an RPUID The percentage of Real Property assets for each data layer that do not have an RPUID The number of unpopulated values in non-nullable attribute fields ( NULL or blank values) Suggested Evaluation Method Direct Internal: Analysis of attributes (manual or tool) Direct Internal: Analysis of attributes (manual or tool) Acceptable Quality Level (maximum) Mandatory for CIP Real Property feature types only Real Property feature types only Direct Internal: Analysis of attributes (manual or tool) Completeness: Omission 7 DCS14 The percentage of unpopulated values in nonnullable attribute fields ( NULL or blank values) Direct Internal: Analysis of attributes (manual or tool) (maximum) Logical : Domain 16 DCS15 The number of features included in the layer that have values in operationalstatus that are not conformant with the domain values in a registered Component Adaptation Direct Internal: Analysis of attributes (manual or tool) Logical : Domain Logical : Domain Logical : Domain Logical : Topological Logical : Topological Logical : Topological Logical : Topological Logical : Topological 18 DCS16 16 DCS17 18 DCS18 The percentage of features included in the layer that have values in operationalstatus that are not conformant with the domain values in a registered Component Adaptation The number of features included in the layer that have values in rpinterest that are not conformant with the domain values in a registered Component Adaptation The percentage of features included in the layer that have values in rpinterest that are not conformant with the domain values in a registered Component Adaptation Direct Internal: Analysis of attributes (manual or 
tool) (maximum) Direct Internal: Analysis of attributes (manual or tool) Direct Internal: Analysis of attributes (manual or tool) (maximum) DCS19 The total number of topological errors in the layer Direct Internal: Aggregation of topological errors Yes DCS20 The percentage of topological errors in the layer Direct Internal: Aggregation of topological errors 25 DCS21 DCS22 26 DCS23 The number of features that are invalid sliver surfaces The percentage of features that are invalid sliver surfaces The number of features included in the layer that have self-intersect errors Direct Internal: Geoprocessing tool to identify polygons under a specified thinness ratio. All polygons under the minimum thinness ratio defined in the DCS should be evaluated and verified. Direct Internal: Geoprocessing tool to identify polygons under a specified thinness ratio. All polygons under the minimum thinness ratio defined in the DCS should be evaluated and verified. Direct Internal: Geoprocessing tool to identify invalid geometry (maximum) (maximum) Yes A-13

63 12 Dec 2016 SDSFIE-Q Annex A: DCS General Guidance Data Quality Element ISO Identifier SDSFIE-Q Metric Identifier DCS General Guidance Metric Description Suggested Evaluation Method Acceptable Quality Level Mandatory for CIP Logical : Topological DCS24 The percentage of features included in the layer that have self-intersect errors Direct Internal: Geoprocessing tool to identify invalid geometry (maximum) Logical : Topological 27 DCS25 The number of features that incorrectly intersect other features Direct Internal: Geoprocessing tool to identify intersecting polygons in conjunction with analysis to determine if the overlap is correct Logical : Topological DCS26 The percentage of features that incorrectly intersect other features Direct Internal: Geoprocessing tool to identify intersecting polygons in conjunction with analysis to determine if the overlap is correct (maximum) Logical : Topological DCS27 The number of features that are not entirely contained within the site boundary associated with the feature Direct Internal: Geoprocessing tool to identify polygons that are not entirely within the site boundary Logical : Topological DCS28 The percentage of features that are not entirely contained within the site boundary associated with the feature Direct Internal: Geoprocessing tool to identify polygons that are not entirely within the site boundary (maximum) Logical : Topological DCS29 The number of features that are not coincident with bounding features Direct Internal: Geoprocessing tool to identify polygons that are not coincident with other features Logical : Topological DCS30 The percentage of features that are not coincident with bounding features Direct Internal: Geoprocessing tool to identify polygons that are not coincident with other features (maximum) Thematic Accuracy: n-quantitative Attribute Correctness 65 DCS31 The number of features in the layer that have an incorrect feature name Direct External: Comparison against database of known features 
(e.g. APSR or a defined naming convention) Thematic Accuracy: n-quantitative Attribute Correctness Thematic Accuracy: n-quantitative Attribute Correctness Thematic Accuracy: n-quantitative Attribute Correctness 66 DCS32 65 DCS33 66 DCS34 The percentage of features in the layer that have a correct feature name The number of features in the layer that have an incorrect installation ID The percentage of features in the layer that have a correct installation ID Direct External: Comparison against database of known features (e.g. APSR or a defined naming convention) Direct External: Comparison against database of known features (i.e. APSR) Direct External: Geoprocessing tool to determine if the installation ID matches the installation ID of the installation boundary in which the feature is located. Direct External: Comparison against database of known features (i.e. APSR) Direct External: Geoprocessing tool to determine if the installation ID matches the installation ID of the installation boundary in which the feature is located. 9 (minimum) 9 (minimum) Yes Yes A-14

64 12 Dec 2016 SDSFIE-Q Annex A: DCS General Guidance Data Quality Element ISO Identifier SDSFIE-Q Metric Identifier DCS General Guidance Metric Description Suggested Evaluation Method Acceptable Quality Level Mandatory for CIP Thematic Accuracy: n-quantitative Attribute Correctness 65 DCS35 The number of features in the layer that have an incorrect operational status Direct External: Comparison against database of known features (i.e. APSR) Thematic Accuracy: n-quantitative Attribute Correctness 66 DCS36 The percentage of features in the layer that have a correct operational status Direct External: Comparison against database of known features (i.e. APSR) 9 (minimum) Thematic Accuracy: n-quantitative Attribute Correctness 65 DCS37 The number of features in the layer that have an incorrect real property interest type Direct External: Comparison against database of known features (i.e. APSR) Thematic Accuracy: n-quantitative Attribute Correctness 66 DCS38 The percentage of features in the layer that have a correct real property interest type Direct External: Comparison against database of known features (i.e. APSR) 9 (minimum) Thematic Accuracy: n-quantitative Attribute Correctness 65 DCS39 The number of features in the layer that have an incorrect RPSUID Direct External: Comparison against database of known features (i.e. APSR) Direct External: Geoprocessing tool to determine if the RPSUID matches the RPSUID of the site boundary in which the feature is located. Yes Thematic Accuracy: n-quantitative Attribute Correctness 66 DCS40 The percentage of features in the layer that have a correct RPSUID Direct External: Comparison against database of known features (i.e. APSR) Direct External: Geoprocessing tool to determine if the RPSUID matches the RPSUID of the site boundary in which the feature is located. 
9 (minimum) Yes Thematic Accuracy: n-quantitative Attribute Correctness 65 DCS41 The number of features in the layer that have an incorrect RPUID Direct External: Comparison against database of known features (i.e. APSR) Real Property feature types only Thematic Accuracy: n-quantitative Attribute Correctness 66 DCS42 The percentage of features in the layer that have a correct RPUID Direct External: Comparison against database of known features (i.e. APSR) 9 (minimum) Real Property feature types only Temporal Accuracy: Temporal Validity DCS43 The number of features that are accurately represented by the currentness reference required in the DCS Direct External: Comparison of DCS requirement to the date the dataset was last updated in the metadata Temporal Accuracy: Temporal Validity DCS44 The percentage of features that are accurately represented by the currentness reference required in the DCS Direct External: Comparison of DCS requirement to the date the dataset was last updated in the metadata 9 (minimum) Positional Accuracy: Absolute or external accuracy 39 DCS45 The horizontal accuracy of the layer (based on DCS requirements) Direct External: Comparison of reported coordinate values to known values Refer to DCS Thematic Accuracy: Classification Correctness 60 DCS46 The number of features that are incorrectly classified Direct External: Comparison against database of known features (i.e. APSR). For real property feature types, the classification of the feature should be compared to the CATCODE reported in the real Yes A-15

65 12 Dec 2016 SDSFIE-Q Annex A: DCS General Guidance Data Quality Element Thematic Accuracy: Classification Correctness ISO Identifier SDSFIE-Q Metric Identifier 61 DCS47 DCS General Guidance Metric Description The percentage of features that are incorrectly classified Suggested Evaluation Method property database. Direct External: Comparison against database of known features (i.e. APSR). For real property feature types, the classification of the feature should be compared to the CATCODE reported in the real property database. Acceptable Quality Level (maximum) Mandatory for CIP Yes A-16
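Several of the Direct Internal evaluation methods in Table 10 lend themselves to simple geometry checks. The sketch below is illustrative only: the function names, the 0.3 thinness threshold, and the rounding-based stand-in for a search tolerance are assumptions for demonstration, not DCS requirements. It counts duplicate geometries in the spirit of DCS5/DCS6 and flags candidate sliver polygons by thinness ratio in the spirit of DCS21/DCS22.

```python
import math

def polygon_area_perimeter(ring):
    """Shoelace area and perimeter of a closed ring given as [(x, y), ...]."""
    area, perim = 0.0, 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]  # wrap around to close the ring
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def thinness_ratio(ring):
    """4*pi*A / P^2: 1.0 for a circle, near 0 for a sliver polygon."""
    area, perim = polygon_area_perimeter(ring)
    return 4.0 * math.pi * area / (perim ** 2) if perim else 0.0

def count_duplicates(features, ndigits=2):
    """DCS5-style count: features whose geometry, rounded to ndigits
    decimal places (a crude stand-in for a search tolerance), repeats."""
    seen, dupes = set(), 0
    for ring in features:
        key = tuple((round(x, ndigits), round(y, ndigits)) for x, y in ring)
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
    return dupes

# Hypothetical layer: a square recorded twice plus one long, thin polygon
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
sliver = [(0, 0), (100, 0), (100, 0.5), (0, 0.5)]
layer = [square, square, sliver]

dup_count = count_duplicates(layer)              # DCS5-style count
dup_pct = 100.0 * dup_count / len(layer)         # DCS6-style percentage
slivers = [f for f in layer if thinness_ratio(f) < 0.3]  # DCS21-style candidates
```

In practice these checks would be run by a GIS geoprocessing tool against the layer's actual geometries; the point here is only that each count metric has a corresponding percentage metric computed over the same feature set.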

DCS Conformance Metrics

Components shall compare the results of the data quality evaluation for each mandatory metric listed in Table 10 against the acceptable quality level. These results will be aggregated, evaluated against the Usability metrics listed in Table 11, and reported as conformance results. If all reported values meet the acceptable quality level, the conformance result for SDSFIE-Q Metric Identifier DCS50 will be Pass. Each DCS will establish the mandatory requirements for each feature type.

Table 11: IGI&S DCS Conformance Metrics

DCS48 (Usability; ISO 102): The number of mandatory DCS requirements that were not met by the data layer. Suggested evaluation: the number of "fail" results of the quality evaluations for each mandatory metric. Acceptable quality level: 0 (maximum).

DCS49 (Usability; ISO 103): The number of mandatory DCS requirements that were met by the data layer. Suggested evaluation: the number of "pass" results of the quality evaluations for each mandatory metric.

DCS50 (Usability; ISO 104): The percentage of mandatory DCS requirements that were not met by the data layer. Suggested evaluation: the percentage of "fail" results of the quality evaluations for each mandatory metric. Acceptable quality level: 0% (maximum).

DCS51 (Usability; ISO 105): The percentage of mandatory DCS requirements that were met by the data layer. Suggested evaluation: the percentage of "pass" results of the quality evaluations for each mandatory metric. Acceptable quality level: 100% (minimum).

DCS52 (Usability; ISO 101): The layer passed all validation checks required by the DCS. Suggested evaluation: 100% pass/fail aggregation of each mandatory requirement expressed in the DCS. Acceptable quality level: Pass.

Data Quality Metadata: Reporting Results

Each mandatory data quality reporting element must have a corresponding data quality evaluation result populated in the metadata.
As stated in SDSFIE-Q, metadata for non-CIP IGI&S vector data must adhere to SDSFIE-M with no additional requirements. CIP feature types must have at least one data quality reporting element (DQ_Element) populated. The DCS for the feature type will specify which element(s) are required and which data quality units will be evaluated. For CIP feature types, the result of each mandatory metric shall be reported as both a quantitative result (DQ_QuantitativeResult) and a conformance result (DQ_ConformanceResult), as described in SDSFIE-Q. If no acceptable quality level is specified for a metric, the metric must be reported as a quantitative result only; the result will not be used to evaluate conformance with a DCS, so a conformance result is not required. Overall DCS conformance should also be reported as a conformance result. Appendix A contains an example metadata record for a CIP feature type that includes both quantitative and conformance results.

The SDSFIE-Q Metric Identifier is a unique code assigned to each metric. This code must be referenced in the metadata for the feature type in MD_Identifier:code. Using the SDSFIE-Q metric ID eliminates the need to populate multiple metadata elements and standardizes the reporting method. The SDSFIE-Q Metric ID is included in Tables 10 and 11, as well as in the DCS for each feature type. An online registry will also be established to store these IDs and their associated information.
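The pass/fail aggregation behind the Usability metrics in Table 11 can be sketched as follows. The helper name and the sample metric results are hypothetical, for illustration only; they are not part of the standard.

```python
def evaluate_dcs_conformance(results):
    """Aggregate per-metric pass/fail results into Table 11-style
    usability metrics. `results` maps an SDSFIE-Q metric ID to True
    (meets acceptable quality level) or False. Assumes at least one metric."""
    total = len(results)
    failed = sum(1 for passed in results.values() if not passed)
    met = total - failed
    return {
        "DCS48": failed,                          # requirements not met (0 to conform)
        "DCS49": met,                             # requirements met
        "DCS50": 100.0 * failed / total,          # percent not met (0% to conform)
        "DCS51": 100.0 * met / total,             # percent met (100% to conform)
        "DCS52": "Pass" if failed == 0 else "Fail",  # overall DCS conformance
    }

# Hypothetical evaluation results for two mandatory metrics
conformance = evaluate_dcs_conformance({"DCS5": True, "DCS23": True})
```

Each entry in the returned mapping would then be reported in the metadata under the corresponding SDSFIE-Q Metric ID, with DCS52 reported as a conformance result.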

DCS General Guidance Appendix A: Metadata Example

The following metadata example displays the DQ_DataQuality elements in SDSFIE-M. It includes a Quantitative Result and a Conformance Result for Completeness Omission and an overall DCS Conformance Result. Red text indicates that the element is mandatory in SDSFIE-M.

DQ_DataQuality
  scope: DQ_Scope. Comment: Mandatory for CIP layers.
    level: MD_ScopeCode. Example: featureType
    levelDescription. Example: notApplicable
  standaloneQualityReport: DQ_StandaloneQualityReportInformation. Comment: Optional; will be added to SDSFIE-M (v. 2.0).
    reportReference: CI_Citation
      title: CharacterString. Example: Military Range Data Quality Report
      date: CI_Date
        date: Date. Example: 25-Feb-16
        dateType: CI_DateTypeCode. Example: Creation
    abstract: CharacterString. Example: Report containing detailed descriptions of procedures used to evaluate the Military Range layer and descriptions of the quality results in a report format
  report: DQ_Element. Comment: Mandatory for CIP layers.
    nameOfMeasure: CharacterString. Example: Completeness Omission
    measureIdentification: MD_Identifier. Comment: SDSFIE-M currently says use should be avoided; this element may be used until the metrics registry is finalized. I&E community preferred method to identify the measure.
      authority
        title: CharacterString. Example: SDSFIE-Q Metrics Registry
        date: CI_Date
          date: Date. Example: 3/31/2016
          dateType: CI_DateTypeCode. Example: Creation
        citedResponsibleParty: CI_ResponsibleParty. Example: DISDI Program
      code: CharacterString. Example: DCS5. Comment: Currently optional; this code must correspond to the SDSFIE-Q Metric ID identified in the metrics registry.
    measureDescription: CharacterString. Comment: SDSFIE-M currently says use should be avoided; this element may be used until the metrics registry is finalized.
    evaluationMethodType: DQ_EvaluationMethodTypeCode. Comment: SDSFIE-M currently says use should be avoided; this element may be used until the metrics registry is finalized.
    evaluationMethodDescription: CharacterString. Comment: Optional, but may be required by a DCS.
    result: DQ_Result (Quantitative Result)
      value: Record
      valueUnit: UnitOfMeasure. Example: None
    result: DQ_Result (Conformance Result). Comment: A Conformance Result is not required for metrics with no acceptable quality level.
      specification: CI_Citation
        title: CharacterString. Example: Military Range Data Content Specification
        date: Date. Example: 3/31/2016
        dateType: CI_DateTypeCode. Example: Creation
        citedResponsibleParty: CI_ResponsibleParty. Example: DISDI Program
      explanation: CharacterString. Example: Meets acceptable quality level in DCS
      pass: Boolean. Example: 1. Comment: 0 = fail and 1 = pass.
  report: DQ_Element. Comment: Mandatory for CIP layers.
    nameOfMeasure: CharacterString. Example: Usability Element
    measureIdentification: MD_Identifier. Comment: SDSFIE-M currently says use should be avoided; this element may be used until the metrics registry is finalized. I&E community preferred method to identify the measure.
      authority
        title: CharacterString. Example: SDSFIE-Q Metrics Registry
        date: CI_Date
          date: Date. Example: 3/31/2016
          dateType: CI_DateTypeCode. Example: Creation
        citedResponsibleParty: CI_ResponsibleParty. Example: DISDI Program
      code: CharacterString. Example: DCS41. Comment: Currently optional; this code must correspond to the SDSFIE-Q Metric ID identified in the metrics registry.
    measureDescription: CharacterString. Example: 100% pass/fail aggregation of each requirement expressed in the DCS. Comment: Optional, but may be required by a DCS; SDSFIE-M currently says use should be avoided.
    evaluationMethodType: DQ_EvaluationMethodTypeCode. Comment: Optional, but may be required by a DCS; SDSFIE-M currently says use should be avoided.
    evaluationMethodDescription: CharacterString. Comment: Optional, but may be required by a DCS.
    result: DQ_Result (Conformance Result)
      resultScope: MD_Scope. Comment: Same as DQ_Scope in SDSFIE-Q.
        level: MD_ScopeCode. Comment: Same as DQ_Scope in SDSFIE-Q.
        levelDescription. Example: notApplicable. Comment: Constrained by NMF_ScopeAmplificationCode.
      value: Record. Example: 100%
      valueUnit: UnitOfMeasure. Example: None
      specification: CI_Citation
        title: CharacterString. Example: Military Range Data Content Specification
        date: Date. Example: 3/31/2016
        dateType: CI_DateTypeCode. Example: Creation
        citedResponsibleParty: CI_ResponsibleParty. Example: DISDI Program
      explanation: CharacterString. Example: 2 requirements of 2 are fulfilled. The feature type is conformant to the DCS.
      pass: Boolean. Example: 1. Comment: 0 = fail and 1 = pass.
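A quality report of this general shape can also be produced programmatically. The sketch below is a deliberately simplified illustration using Python's standard library; the element names are abbreviated stand-ins for the ISO 19115/19157 structures shown above and do not constitute a conformant SDSFIE-M encoding.

```python
import xml.etree.ElementTree as ET

def quality_report(metric_id, value, aql_max, spec_title):
    """Build a simplified DQ_Element-like fragment with a quantitative
    result and, when an acceptable quality level (a maximum) is given,
    a conformance result with a pass/fail flag (1 = pass, 0 = fail)."""
    elem = ET.Element("DQ_Element")
    ET.SubElement(elem, "measureIdentification").text = metric_id
    quant = ET.SubElement(elem, "DQ_QuantitativeResult")
    ET.SubElement(quant, "value").text = str(value)
    if aql_max is not None:
        conf = ET.SubElement(elem, "DQ_ConformanceResult")
        ET.SubElement(conf, "specification").text = spec_title
        ET.SubElement(conf, "pass").text = "1" if value <= aql_max else "0"
    return ET.tostring(elem, encoding="unicode")

# Hypothetical result: 0% duplicated features against an assumed 5% maximum
xml = quality_report("DCS6", 0.0, 5.0,
                     "Military Range Data Content Specification")
```

When no acceptable quality level is supplied, the function emits only the quantitative result, mirroring the reporting rule stated above.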

12 Dec 2016 SDSFIE-Q Annex B: DCS Example

Annex B: Data Content Specification (DCS) Example

Military Range
SDSFIE-V
Dec 2016

Prepared By: The Installation Geospatial Information and Services Governance Group (IGG)
For: The Assistant Secretary of Defense (Energy, Installations & Environment)


More information

U.S. Department of Transportation. Standard

U.S. Department of Transportation. Standard U.S Department of Transportation Federal Aviation Administration U.S. Department of Transportation Federal Aviation Administration Standard DATA STANDARD FOR THE NATIONAL AIRSPACE SYSTEM (NAS) Foreword

More information

INFORMATION ASSURANCE DIRECTORATE

INFORMATION ASSURANCE DIRECTORATE National Security Agency/Central Security Service INFORMATION ASSURANCE DIRECTORATE CGS IA Policies, Procedures, The Information Assurance (IA) Policies, Procedures, encompasses existing policies, procedures,

More information

STUDENT GUIDE Risk Management Framework Step 1: Categorization of the Information System

STUDENT GUIDE Risk Management Framework Step 1: Categorization of the Information System Slide 1 RMF Overview RMF Module 1 RMF takes into account the organization as a whole, including strategic goals and objectives and relationships between mission/business processes, the supporting information

More information

Vendor: The Open Group. Exam Code: OG Exam Name: TOGAF 9 Part 1. Version: Demo

Vendor: The Open Group. Exam Code: OG Exam Name: TOGAF 9 Part 1. Version: Demo Vendor: The Open Group Exam Code: OG0-091 Exam Name: TOGAF 9 Part 1 Version: Demo QUESTION 1 According to TOGAF, Which of the following are the architecture domains that are commonly accepted subsets of

More information

Implementing the Army Net Centric Data Strategy in a Service Oriented Environment

Implementing the Army Net Centric Data Strategy in a Service Oriented Environment Implementing the Army Net Centric Strategy in a Service Oriented Environment Michelle Dirner Army Net Centric Strategy (ANCDS) Center of Excellence (CoE) Service Team Lead RDECOM CERDEC SED in support

More information

OFFICE OF THE UNDER SECRETARY OF DEFENSE 3000DEFENSEPENTAGON WASHINGTON, DC

OFFICE OF THE UNDER SECRETARY OF DEFENSE 3000DEFENSEPENTAGON WASHINGTON, DC OFFICE OF THE UNDER SECRETARY OF DEFENSE 3000DEFENSEPENTAGON WASHINGTON, DC 20301-3000 ACQUISITION, TECHNO LOGY. A N D LOGISTICS SEP 2 1 2017 MEMORANDUM FOR COMMANDER, UNITED ST A TES SPECIAL OPERATIONS

More information

Data Governance Central to Data Management Success

Data Governance Central to Data Management Success Data Governance Central to Data Success International Anne Marie Smith, Ph.D. DAMA International DMBOK Editorial Review Board Primary Contributor EWSolutions, Inc Principal Consultant and Director of Education

More information

INFORMATION ASSURANCE DIRECTORATE

INFORMATION ASSURANCE DIRECTORATE National Security Agency/Central Security Service INFORMATION ASSURANCE DIRECTORATE CGS Network Boundary and The Network Boundary and for an Enterprise is essential; it provides for an understanding of

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Software asset management Part 2: Software identification tag

ISO/IEC INTERNATIONAL STANDARD. Information technology Software asset management Part 2: Software identification tag INTERNATIONAL STANDARD ISO/IEC 19770-2 First edition 2009-11-15 Information technology Software asset management Part 2: Software identification tag Technologies de l'information Gestion de biens de logiciel

More information

Forensics and Biometrics Enterprise Reference Architecture (FBEA)

Forensics and Biometrics Enterprise Reference Architecture (FBEA) Forensics and Biometrics Enterprise Reference (FBEA) Overview and Summary Information (AV-1) Final Draft Version 2.6 Aug 2016 Version History Version# Date Page(s) Changed Change Description 1.0 Feb 2016

More information

Information Systems Security Requirements for Federal GIS Initiatives

Information Systems Security Requirements for Federal GIS Initiatives Requirements for Federal GIS Initiatives Alan R. Butler, CDP Senior Project Manager Penobscot Bay Media, LLC 32 Washington Street, Suite 230 Camden, ME 04841 1 Federal GIS "We are at risk," advises the

More information

INFORMATION ASSURANCE DIRECTORATE

INFORMATION ASSURANCE DIRECTORATE National Security Agency/Central Security Service INFORMATION ASSURANCE DIRECTORATE CGS Network Mapping The Network Mapping helps visualize the network and understand relationships and connectivity between

More information

USAF Environmental Management System Update

USAF Environmental Management System Update Headquarters U.S. Air Force USAF Environmental Management System Update Maj Aaron Altwies HQ USAF/A7CAQ Why do EMS? EO 13423 mandates it Requires Fully implemented EMSs by Dec 2008 Audited by external

More information

Physical Security Reliability Standard Implementation

Physical Security Reliability Standard Implementation Physical Security Reliability Standard Implementation Attachment 4b Action Information Background On March 7, 2014, the Commission issued an order directing NERC to submit for approval, within 90 days,

More information

Safeguarding Unclassified Controlled Technical Information

Safeguarding Unclassified Controlled Technical Information Safeguarding Unclassified Controlled Technical Information (DFARS Case 2011-D039): The Challenges of New DFARS Requirements and Recommendations for Compliance Version 1 Authors: Justin Gercken, TSCP E.K.

More information

Paper for Consideration by CHRIS. Cooperation Agreement Between IHO and DGIWG

Paper for Consideration by CHRIS. Cooperation Agreement Between IHO and DGIWG CHRIS17-12.2A Paper for Consideration by CHRIS Cooperation Agreement Between IHO and DGIWG Submitted by: Executive Summary: Related Documents: IHB The IHO standards for digital hydrographic information

More information

DoD Information Technology Security Certification and Accreditation Process (DITSCAP) A presentation by Lawrence Feinstein, CISSP

DoD Information Technology Security Certification and Accreditation Process (DITSCAP) A presentation by Lawrence Feinstein, CISSP DoD Information Technology Security Certification and Accreditation Process (DITSCAP) A presentation by Lawrence Feinstein, CISSP April 14, 2004 Current Macro Security Context within the Federal Government

More information

OpenChain Specification Version 1.3 (DRAFT)

OpenChain Specification Version 1.3 (DRAFT) OpenChain Specification Version 1.3 (DRAFT) 2018.10.14 DRAFT: This is the draft of the next version 1.3 of the OpenChain specification. Recommended changes to be made over the current released version

More information

Step: 9 Conduct Data Standardization

Step: 9 Conduct Data Standardization Step: 9 Conduct Data Standardization Version 1.0, February 2005 1 Step Description/Objectives: Step 9, Conduct Data Standardization, is intended to reduce the life cycle cost of data through data integration,

More information

HITSP/T16. October 15, 2007 Version 1.1. Healthcare Information Technology Standards Panel. Security and Privacy Technical Committee.

HITSP/T16. October 15, 2007 Version 1.1. Healthcare Information Technology Standards Panel. Security and Privacy Technical Committee. October 15, 2007 Version 1.1 HITSP/T16 Submitted to: Healthcare Information Technology Standards Panel Submitted by: Security and Privacy Technical Committee 20071015 V1.1 D O C U M E N T C H A N G E H

More information

http://www.sis.se http://www.sis.se http://www.sis.se http://www.sis.se http://www.sis.se Provläsningsexemplar / Preview SVENSK STANDARD SS-ISO/IEC 14598-1 Fastställd 2003-01-31 Utgåva 1 Information technology

More information

The descriptions of the elements and measures are based on Annex D of ISO/DIS Geographic information Data quality.

The descriptions of the elements and measures are based on Annex D of ISO/DIS Geographic information Data quality. 7 Data quality This chapter includes a description of the data quality elements and sub-elements as well as the corresponding data quality measures that should be used to evaluate and document data quality

More information

ISO INTERNATIONAL STANDARD. Information and documentation Records management processes Metadata for records Part 1: Principles

ISO INTERNATIONAL STANDARD. Information and documentation Records management processes Metadata for records Part 1: Principles INTERNATIONAL STANDARD ISO 23081-1 First edition 2006-01-15 Information and documentation Records management processes Metadata for records Part 1: Principles Information et documentation Processus de

More information

DoD M, March 1994

DoD M, March 1994 1 2 3 FOREWORD TABLE OF CONTENTS Page FOREWORD 2 TABLE OF CONTENTS 4 FIGURES 5 REFERENCES 6 DEFINITIONS 8 ABBREVIATIONS AND ACRONYMS 14 CHAPTER 1 - INTRODUCTION 16 C1.1. Purpose 16 C1.2. Applicability

More information

PART C INTERNATIONAL HYDROGRAPHIC ORGANIZATION IHO GUIDELINE FOR CREATING S-100 PRODUCT SPECIFICATIONS

PART C INTERNATIONAL HYDROGRAPHIC ORGANIZATION IHO GUIDELINE FOR CREATING S-100 PRODUCT SPECIFICATIONS INTERNATIONAL HYDROGRAPHIC ORGANIZATION IHO GUIDELINE FOR CREATING S-100 PRODUCT SPECIFICATIONS PART C Draft Version 0.2 2018-08-31 Special Publication No. S-97 Guideline for Creating an S-100 Product

More information

Information technology Process assessment Concepts and terminology

Information technology Process assessment Concepts and terminology Provläsningsexemplar / Preview INTERNATIONAL STANDARD ISO/IEC 33001 Second edition 2015-03-01 Information technology Process assessment Concepts and terminology Technologies de l information Évaluation

More information

IVI. Interchangeable Virtual Instruments. IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification. Page 1

IVI. Interchangeable Virtual Instruments. IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification. Page 1 IVI Interchangeable Virtual Instruments IVI-3.10: Measurement and Stimulus Subsystems (IVI-MSS) Specification March, 2008 Edition Revision 1.0.1 Page 1 Important Information The IVI Measurement and Stimulus

More information

Stewarding NOAA s Data: How NCEI Allocates Stewardship Resources

Stewarding NOAA s Data: How NCEI Allocates Stewardship Resources Stewarding NOAA s Data: How NCEI Allocates Stewardship Resources Eric Kihn, Ph.D. Director, NCEI Center for Coasts, Oceans and Geophysics (CCOG) Krisa M. Arzayus, Ph.D. Deputy Director (Acting), NCEI Center

More information

ISO/IEC/ IEEE Systems and software engineering Content of life-cycle information items (documentation)

ISO/IEC/ IEEE Systems and software engineering Content of life-cycle information items (documentation) This is a preview - click here to buy the full publication INTERNATIONAL STANDARD ISO/IEC/ IEEE 15289 Second edition 2015-05-15 Systems and software engineering Content of life-cycle information items

More information

Standard Development Timeline

Standard Development Timeline Standard Development Timeline This section is maintained by the drafting team during the development of the standard and will be removed when the standard is adopted by the NERC Board of Trustees (Board).

More information

Standards Readiness Criteria. Tier 2

Standards Readiness Criteria. Tier 2 Document Number: HITSP 06 N 85 Date: June 1, 2006 Standards Readiness Criteria Tier 2 Version 1.0 May 12, 2006 HITSP Standards Harmonization Committee V 1.0 (5/12/2006) 1 Introduction...3 Background Information...3

More information

OFFICE OF THE ASSISTANT SECRETARY OF DEFENSE HEALTH AFFAIRS SKYLINE FIVE, SUITE 810, 5111 LEESBURG PIKE FALLS CHURCH, VIRGINIA

OFFICE OF THE ASSISTANT SECRETARY OF DEFENSE HEALTH AFFAIRS SKYLINE FIVE, SUITE 810, 5111 LEESBURG PIKE FALLS CHURCH, VIRGINIA OFFICE OF THE ASSISTANT SECRETARY OF DEFENSE HEALTH AFFAIRS SKYLINE FIVE, SUITE 810, 5111 LEESBURG PIKE FALLS CHURCH, VIRGINIA 22041-3206 TRICARE MANAGEMENT ACTIVITY MEMORANDUM FOR: SEE DISTRIBUTION SUBJECT:

More information

Network Working Group. Category: Informational April A Uniform Resource Name (URN) Namespace for the Open Geospatial Consortium (OGC)

Network Working Group. Category: Informational April A Uniform Resource Name (URN) Namespace for the Open Geospatial Consortium (OGC) Network Working Group C. Reed Request for Comments: 5165 Open Geospatial Consortium Category: Informational April 2008 Status of This Memo A Uniform Resource Name (URN) Namespace for the Open Geospatial

More information

ISO INTERNATIONAL STANDARD. Information and documentation Managing metadata for records Part 2: Conceptual and implementation issues

ISO INTERNATIONAL STANDARD. Information and documentation Managing metadata for records Part 2: Conceptual and implementation issues INTERNATIONAL STANDARD ISO 23081-2 First edition 2009-07-01 Information and documentation Managing metadata for records Part 2: Conceptual and implementation issues Information et documentation Gestion

More information

Building an Assurance Foundation for 21 st Century Information Systems and Networks

Building an Assurance Foundation for 21 st Century Information Systems and Networks Building an Assurance Foundation for 21 st Century Information Systems and Networks The Role of IT Security Standards, Metrics, and Assessment Programs Dr. Ron Ross National Information Assurance Partnership

More information

This document is a preview generated by EVS

This document is a preview generated by EVS INTERNATIONAL STANDARD ISO 19119 Second edition 2016-01-15 Geographic information Services Information géographique Services Reference number ISO 19119:2016(E) ISO 2016 ISO 19119:2016(E) COPYRIGHT PROTECTED

More information

National Data Sharing and Accessibility Policy-2012 (NDSAP-2012)

National Data Sharing and Accessibility Policy-2012 (NDSAP-2012) National Data Sharing and Accessibility Policy-2012 (NDSAP-2012) Department of Science & Technology Ministry of science & Technology Government of India Government of India Ministry of Science & Technology

More information

INFORMATION ASSURANCE DIRECTORATE

INFORMATION ASSURANCE DIRECTORATE National Security Agency/Central Security Service INFORMATION ASSURANCE DIRECTORATE CGS Deployment Deployment is the phase of the system development lifecycle in which solutions are placed into use to

More information

Realizing the Army Net-Centric Data Strategy (ANCDS) in a Service Oriented Architecture (SOA)

Realizing the Army Net-Centric Data Strategy (ANCDS) in a Service Oriented Architecture (SOA) Realizing the Army Net-Centric Data Strategy (ANCDS) in a Service Oriented Architecture (SOA) A presentation to GMU/AFCEA symposium "Critical Issues in C4I" Michelle Dirner, James Blalock, Eric Yuan National

More information

IT Security Evaluation and Certification Scheme Document

IT Security Evaluation and Certification Scheme Document IT Security Evaluation and Certification Scheme Document June 2015 CCS-01 Information-technology Promotion Agency, Japan (IPA) IT Security Evaluation and Certification Scheme (CCS-01) i / ii Table of Contents

More information

Department of Defense (DoD) Joint Federated Assurance Center (JFAC) Overview

Department of Defense (DoD) Joint Federated Assurance Center (JFAC) Overview Department of Defense (DoD) Joint Federated Assurance Center (JFAC) Overview Kristen Baldwin Principal Deputy, Office of the Deputy Assistant Secretary of Defense for Systems Engineering (DASD(SE)) 17

More information

ISO/TC 211 Geographic information/geomatics

ISO/TC 211 Geographic information/geomatics ISO/TC 211 N 1273 2002-05-03 Number of pages: 37 ISO/TC 211 Geographic information/geomatics Title: Text of 19113 Geographic information - Quality principles, as sent to the ISO Central Secretariat for

More information

DRAFT. Cyber Security Communications between Control Centers. March May Technical Rationale and Justification for Reliability Standard CIP-012-1

DRAFT. Cyber Security Communications between Control Centers. March May Technical Rationale and Justification for Reliability Standard CIP-012-1 DRAFT Cyber Security Communications between Control Centers Technical Rationale and Justification for Reliability Standard CIP-012-1 March May 2018 NERC Report Title Report Date I Table of Contents Preface...

More information

DHS Overview of Sustainability and Environmental Programs. Dr. Teresa R. Pohlman Executive Director, Sustainability and Environmental Programs

DHS Overview of Sustainability and Environmental Programs. Dr. Teresa R. Pohlman Executive Director, Sustainability and Environmental Programs DHS Overview of Sustainability and Environmental Programs Dr. Teresa R. Pohlman Executive Director, Sustainability and Environmental Programs DHS Mission DHS Organization Getting to Know DHS Mission: Secure

More information

Data Partnerships to Improve Health Frequently Asked Questions. Glossary...9

Data Partnerships to Improve Health Frequently Asked Questions. Glossary...9 FAQ s Data Partnerships to Improve Health Frequently Asked Questions BENEFITS OF PARTICIPATING... 1 USING THE NETWORK.... 2 SECURING THE DATA AND NETWORK.... 3 PROTECTING PRIVACY.... 4 CREATING METADATA...

More information

IMPROVING CYBERSECURITY AND RESILIENCE THROUGH ACQUISITION

IMPROVING CYBERSECURITY AND RESILIENCE THROUGH ACQUISITION IMPROVING CYBERSECURITY AND RESILIENCE THROUGH ACQUISITION Briefing for OFPP Working Group 19 Feb 2015 Emile Monette GSA Office of Governmentwide Policy emile.monette@gsa.gov Cybersecurity Threats are

More information

Metadata of geographic information

Metadata of geographic information Metadata of geographic information Kai Koistinen Management of environmental data and information 4.10.2017 Topics Metadata of geographic information What is metadata? Metadata standards and recommendations

More information

An Overview of ISO/IEC family of Information Security Management System Standards

An Overview of ISO/IEC family of Information Security Management System Standards What is ISO/IEC 27001? The ISO/IEC 27001 standard, published by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), is known as Information

More information

Policy Document. PomSec-AllSitesBinder\Policy Docs, CompanyWide\Policy

Policy Document. PomSec-AllSitesBinder\Policy Docs, CompanyWide\Policy Policy Title: Binder Association: Author: Review Date: Pomeroy Security Principles PomSec-AllSitesBinder\Policy Docs, CompanyWide\Policy Joseph Shreve September of each year or as required Purpose:...

More information

STEP Data Governance: At a Glance

STEP Data Governance: At a Glance STEP Data Governance: At a Glance Master data is the heart of business optimization and refers to organizational data, such as product, asset, location, supplier and customer information. Companies today

More information

Using GIS to Help Support and Sustain U.S. Army Ranges - A Global Approach

Using GIS to Help Support and Sustain U.S. Army Ranges - A Global Approach INFRASTRUCTURE & TECHNOLOGY Using GIS to Help Support and Sustain U.S. Army Ranges - A Global Approach Robert Maple, GIS Program Manager Army Sustainable Range Program (SRP) Geospatial Support Center Army

More information

Bentley Systems Army Training Program Focuses on CAD GIS Integration

Bentley Systems Army Training Program Focuses on CAD GIS Integration Bentley Systems Army Training Program Focuses on CAD GIS Integration Tom Brown, Army Account Manager, Bentley Systems, Inc. Installations face serious challenges managing, analyzing and publishing geospatial

More information

Quality issues in LPIS: Towards Quality Assurance

Quality issues in LPIS: Towards Quality Assurance LPIS Workshop 2008, 17-18 September, 2008 1 Quality issues in LPIS: Towards Quality Assurance Valentina SAGRIS Geoinformation management GeoCAP AGRICULTURE UNIT EC - Joint Research Centre Relevant Quality

More information

Systems and software engineering Requirements for managers of information for users of systems, software, and services

Systems and software engineering Requirements for managers of information for users of systems, software, and services This is a preview - click here to buy the full publication INTERNATIONAL STANDARD ISO/IEC/ IEEE 26511 Second edition 2018-12 Systems and software engineering Requirements for managers of information for

More information

Implementing a Successful Data Governance Program

Implementing a Successful Data Governance Program Implementing a Successful Data Governance Program Mary Anne Hopper Data Management Consulting Manager SAS #AnalyticsX Data Stewardship #analyticsx SAS Data Management Framework BUSINESS DRIVERS DATA GOVERNANCE

More information

Data Governance Framework

Data Governance Framework Data Governance Framework Purpose This document describes the data governance framework for University of Saskatchewan (U of S) institutional data. It identifies designated roles within the university

More information

July 13, Via to RE: International Internet Policy Priorities [Docket No ]

July 13, Via  to RE: International Internet Policy Priorities [Docket No ] July 13, 2018 Honorable David J. Redl Assistant Secretary for Communications and Information and Administrator, National Telecommunications and Information Administration U.S. Department of Commerce Washington,

More information

Critical Cyber Asset Identification Security Management Controls

Critical Cyber Asset Identification Security Management Controls Implementation Plan Purpose On January 18, 2008, FERC (or Commission ) issued Order. 706 that approved Version 1 of the Critical Infrastructure Protection Reliability Standards, CIP-002-1 through CIP-009-1.

More information

2 The IBM Data Governance Unified Process

2 The IBM Data Governance Unified Process 2 The IBM Data Governance Unified Process The benefits of a commitment to a comprehensive enterprise Data Governance initiative are many and varied, and so are the challenges to achieving strong Data Governance.

More information

CIP Cyber Security Personnel & Training

CIP Cyber Security Personnel & Training A. Introduction 1. Title: Cyber Security Personnel & Training 2. Number: CIP-004-6 3. Purpose: To minimize the risk against compromise that could lead to misoperation or instability in the Bulk Electric

More information

Implementing a Modular Open Systems Approach (MOSA) to Achieve Acquisition Agility in Defense Acquisition Programs

Implementing a Modular Open Systems Approach (MOSA) to Achieve Acquisition Agility in Defense Acquisition Programs Implementing a Modular Open Systems Approach (MOSA) to Achieve Acquisition Agility in Defense Acquisition Programs Philomena Zimmerman Office of the Deputy Assistant Secretary of Defense for Systems Engineering

More information