Draft CWA: Global ebusiness Interoperability Test Bed (GITB)


CEN WS GITB2
Date:
draft CWA XXXX:2011
Secretariat: NEN

Draft CWA: Global ebusiness Interoperability Test Bed (GITB)

Status: Version 41 (September 1, 2011) - Draft CWA for public review; deadline for comments is 31 October 2011

Contents

1 Introduction
2 Definitions and Abbreviations
   2.1 Definitions
      ebusiness Specifications (see Section 3.3)
      Testing Purposes (see Section 3.4)
      Testing Requirements (see Chapter 11)
      Testing Roles (see Section 5.2)
      Testing Framework and Architecture (see Chapters 5 to 8)
   2.2 Abbreviations

Part I: Motivation for ebusiness Testing and Synthesis of Architecture Solutions (Target Group: ebusiness users, standard development organizations, industry consortia, testing experts and all other stakeholders)

3 Motivation
   3.1 Overview
   3.2 Stakeholders and their Interests in ebusiness Testing
   3.3 ebusiness Specifications
   3.4 ebusiness Testing - Conformance and Interoperability Testing
      Testing Context
   3.5 Benefits of a Global ebusiness Interoperability Test Bed
4 GITB Architecture Vision: Alternative Architecture Solutions and General Directions for a Testing Framework
   4.1 Objectives and Principles
   4.2 Synthesis of Architecture Solutions
   4.3 Implications for GITB: Focus on a Testing Framework

Part II: GITB Testing Framework and Architecture (Target Group: Test Bed Experts and Architects)

5 GITB Testing Framework
   5.1 Overview
   5.2 Testing Roles within the Testing Framework
   5.3 GITB-compliant Test Beds
6 Test Artifacts
   6.1 General Design/Representation Principles and Best Practices
   6.2 Standard Test Artifact Header and Reference Stubs
      Test Artifact Headers
      External Reference Stubs
   6.3 Test Configuration Artifacts
      Configuration Artifacts related to ebusiness Communication
      Messaging Adapter Configuration Artifacts
      Design and Representation Artifacts
   6.4 Test Logic Artifacts
      Document Assertion Set
      Test Case
      Test Suite
      Test Assertion
   6.5 Test Output Artifacts
      Test Execution Log
      Test Report
   6.6 GITB-Compliance of Test Artifacts
7 Test Services
   7.1 Test Services Functional Areas
   7.2 Mapping Testing Roles to Test Services
   7.3 GITB-Service compliance of Test Beds
8 Test Bed Components
   8.1 Overview of the Test Bed Components
   8.2 User-facing Components
   8.3 Testing Capability Components
      Messaging Adapter (Core plug-in interface, Core platform interface)
      Document Validator (Core plug-in interface, Core platform interface)
      Test Suite Engine (Core plug-in interface, Core platform interface)
   8.4 Test Bed Core Platform Components
      Test Bed Platform Functional Components
      Test Bed Platform Interface Components
9 GITB Test Registry and Repository (TRR)
   9.1 Overview of the GITB TRR
   9.2 Architecture of the GITB TRR
   9.3 Usage Scenarios
      GITB TRR for the Test Artifacts
      GITB TRR for the Testing Capability Components
      GITB TRR for the Remote Testing Capability Components
10 Testing Methodologies and the Scenarios
   10.1 Testing Methodologies
      Using Test Assertions
      Standalone Document Validation
      SUT-Interactive Conformance Testing
      Interoperability Testing
      Two-phase Testing
      Proposed Testing Practices for SUTs
   10.2 Testing Scenarios
      Overview of the Testing Scenarios
      Testing Scenario 1: Creation, Deployment and Execution of a Validating Conformance Test Suite
      Testing Scenario 2: Creation, Deployment and Execution of an [SUT] Interacting Conformance Test Suite
      Testing Scenario 3: Creation, Deployment and Execution of an [SUT] Interacting Interoperability Test Suite
   10.3 Interactions among Test Bed Components for Enabling Testing Scenarios
      Testing Scenario 1
      Testing Scenario 2
      Testing Scenario 3

Part III: GITB Application and Validation based on Use Cases from the Automotive Industry, Healthcare and Public Procurement (Target Group: Test Bed Users, ebusiness Experts, SDOs)

11 Applying GITB in Use Cases
   11.1 Approach
   11.2 Deriving Testing Requirements from ebusiness Scenarios
      Verification Scope («Test Patterns»)
      Operational Requirements («Testing Environment»)
   11.3 Deriving Test Scenarios and Solutions
12 Use Case 1: Long Distance Supply Chains in the Automotive Industry
   12.1 ebusiness Scenarios
   12.2 Testing Requirements - What and How to Test?
      Verification Scope (What to Test?)
      Testing Environment (How to Test?)
   12.3 Testing Scenarios - How to Use GITB?
      Scenario 1: Creating, Deploying and Executing a Validating Test Suite for Conformance to a MOSS Message (Design Phase, Execution Phase)
      Scenario 2: Creating, Deploying and Executing an Interacting Test Suite for Conformance to a MOSS Message (Design Phase, Execution Phase)
      Scenario 3: Creating, Deploying and Operating an SUT Interacting Test Suite for Interoperability between several SUTs based on MOSS Trade Lane Design (Design Phase, Execution Phase)
13 Use Case 2: Health Level 7 (HL7) v3 Scenarios
   13.1 ebusiness Scenarios - HL7 Storyboards and IHE Interoperability Profiles
   13.2 Testing Requirements - What and How to Test?
      Verification Scope (What to Test?)
      Testing Environment (How to Test?)
   13.3 Testing Scenarios - How to Use GITB?
      Scenario 1: Creating, Deploying and Executing a Validating Test Suite for Conformance to the Turkish HL7 V3 Profile
      Scenario 2: Creating, Deploying and Executing an SUT Interacting Test Suite for Conformance to the Turkish HL7 V3 Profile
      Scenario 3: Creating, Deploying and Executing an SUT Interacting Test Suite for Interoperability between several SUTs based on the HL7 Turkish Profile
      Integrating the Healthcare Enterprise (IHE), Scheduled Workflow (SWF) Profile Scenario
14 Use Case 3: Public Procurement CEN/BII Scenarios
   14.1 ebusiness Scenarios
   14.2 Testing Requirements - What and How to Test?
      Verification Scope (What to Test?)
         Business Interoperable Specifications
         Transport Infrastructure
         Business Interoperability Specification sample: BIS 6a
      Testing Environment (How to Test?)
         Testing context
         Testing integration in business environment
         Testing location
   14.3 Testing Scenarios - How to Use GITB?
      Scenario 1: Creating, Deploying and Operating a Validating Test Suite for Conformance to a PEPPOL Profile
      Scenario 2: Creating, Deploying and Executing an SUT Interacting Test Suite for Conformance to a PEPPOL Profile
      Scenario 3: Creating, Deploying and Executing an SUT Interacting Test Suite for Interoperability between Several SUTs based on a PEPPOL Profile

Part IV: Test Bed Governance and Development Process

15 Governance
   15.1 Governance Models
      Single Governance Model
      Shared Governance Model
      Open Governance Model
   15.2 Governance Recommendation
16 Global Cooperation Model

Part V: Appendices

Appendix I: Test Service Interfaces
   Test Design Services
      Document Assertion Design Sub-service (Create, Read, Update, Delete a Document Assertion Set)
      Test Case Design Sub-service (Create, Read, Update, Delete, Configure a Test Case)
      Test Suite Design Sub-service (Create, Read, Update, Delete, Configure a Test Suite)
      Test Configuration Design Sub-service (Standard CRUD Operations Design for any Configuration Artifact; CRUD Operations on a Message Adapter Configuration Artifact, a Connection Artifact, a Message Binding Artifact, a Message Payload Artifact, a Test Suite Parameters Set Artifact)
      Test Assertion Design Sub-service
      Issues Tracking Sub-service
   Test Deployment Services
      General Deployment Status Functions (Get Deployment Status of a Test Bed; Undeploy all Test Suites and DAS on a Test Bed)
      Test Suite Deployment Sub-service (Deploy a Test Suite Artifact; Get Test Suite Deployment Status; Undeploy a Test Suite)
      Document Assertion Set Deployment Sub-service (Deploy a Document Assertion Set Artifact; Get a Document Assertion Set Deployment Status; Undeploy a Document Assertion Set)
   Test Execution Services
      Test Suite Execution Control Sub-service (Start, Stop, Resume a Test Suite Execution; Get Status of a Test Suite Execution; Alter a Test Suite Execution)
      Document Assertion Set Execution Control Sub-service (Validate a Document; Get Status of a Document Validation)
      Test Execution Collaboration and Logistics Sub-service (Register Parties Involved; Notify Parties Involved; Report Issue; Get Issue Report)
      Test Execution Agent Control Sub-service (Execute Operation; Get Operation Status)
   Test Repository Services
      General Search Functions (Get Test Artifacts Matching a Pattern)
      Administration Functions (Create, Duplicate, Delete an Archive; Set Access Rights for an Archive)
      Archival Functions (Store a Test Artifact; Download a Test Artifact; Select a Test Artifact or a Set of Artifacts; Transfer a Test Artifact or a Set of Artifacts)
   Protocol and Message Bindings to Services
Appendix II: Reusing and Coordinating Test Beds - How to become a GITB-compliant Test Agent?
   Coordination Test Services
      Validate Document Operation
      Configure Test Case or Test Suite Operation
      Start Test Suite Execution Operation
      Get Status of Test Suite Execution Operation
      Recommended Implementable Syntax Format for the Operations
   Test Report Format
      Recommended Implementable Syntax Format for the Test Report
   Coordination Test Service Metadata
      Recommended Implementable Syntax Format for the Test Service Metadata
References

1 Introduction

Background

This document presents the draft CWA of the second phase of the Global Interoperability Test Bed Methodologies (GITB) project as of August 2011, which runs within the CEN Workshop GITB. The production of the draft CWA on the proposed architecture and process to develop an ebusiness test bed at the global level was agreed at the CEN GITB Workshop meeting on 18 January 2011. The first phase of the GITB project ran within the CEN ebusiness Interoperability Forum (ebif). Its results are contained in CWA 16093:2010. The draft CWA is available for public comment in September-October 2011. The list of companies/organizations supporting the CWA will be included.

Motivation

The work on GITB is motivated by the increasing need to support testing of ebusiness scenarios as a means to foster standards adoption, achieve better compliance to standards and greater interoperability within and across the various industry, governmental and public sectors. While ebusiness scenarios are widely adopted by users in these sectors, it is still cumbersome for them to reach interoperability of ebusiness solutions and to achieve conformance with standards specifications. More advanced testing methodologies and practices are needed to cope with the relevant set of standards for realizing comprehensive ebusiness scenarios (i.e. business processes and choreography, business documents, transport and communication protocols), as well as test beds addressing the specific requirements of multi-partner interactions. GITB intends to increase the coordination between the manifold industry consortia and standards development organizations, with the goal to increase awareness of testing in ebusiness standardization and to reduce the risk of fragmented, duplicated and conflicting ebusiness testing efforts. It thereby supports the goals of the European ICT standardization policy [1][2] to increase the quality, coherence and consistency of ICT standards and to provide active support to the implementation of ICT standards.

[1] Communication from the Commission to the European Parliament, the Council and the European Economic and Social Committee: A strategic vision for European standards - Moving forward to enhance and accelerate the sustainable growth of the European economy by 2020, COM(2011)311 final.
[2] Proposal for a Regulation of the EU Parliament and of the Council on European Standardisation, COM(2011)315 final, dated 1 June 2011.

Objectives

The long-term objective is to establish a comprehensive and Global ebusiness Interoperability Test Bed (GITB) infrastructure to support conformance and interoperability testing of ebusiness Specifications and their implementation by software vendors and end-users. The GITB project aims at developing the required global testing architecture, frameworks and methodologies for state-of-the-art ebusiness Specifications and profiles. It will support the realization of GITB as a network of multiple test beds leveraging existing and future testing capabilities from different stakeholders (for example standards developing organizations and

industry consortia, test bed providers, and accreditation/certification authorities). GITB is not intended to become an accreditation/certification authority or to impose a particular test bed implementation.

Benefits

The GITB Testing Framework enables a network of Test Beds to share testing resources and capabilities. By defining a modular architecture based on standard Test Bed component interfaces, it allows for reusability of certain Testing Capabilities and an extensible plug-in design. A joint approach for developing test beds across different world regions would positively affect development cost, capability, and compatibility of future testing facilities by leveraging best-of-class expertise and shared resources. Within a shared test bed effort, users, standards development organizations (SDOs), test service providers, and software vendors could benefit from sharing the work load, agreeing on the interpretations of the standards, and working in a synchronized manner. A shared, international test bed infrastructure would leverage synergies between existing testing activities and provide an opportunity to collaborate across national standards bodies.

Approach

The main objective of GITB is to develop, under EU support and guidance, a comprehensive and global ebusiness interoperability test bed infrastructure in a global collaboration of European, North American and Asian partners. This objective is planned to be achieved in three phases (Table 1-1), with the current project covering phase two. In order to capture ebusiness testing requirements and validate the target GITB concepts, we rely on the contribution of industry communities and their experience in setting up standards-based ebusiness relationships:

- Automotive Industry: MOSS (Materials Off-Shore Sourcing)
- Healthcare: HL7 v3 Scenarios and Clinical Document Architecture (CDA)
- Public Procurement: CEN Business Interoperability Interfaces (BII) and Pan-European Public Procurement OnLine (PEPPOL)

During the initial phase, the feasibility analysis was performed by gathering the requirements from three use cases: automotive industry, public procurement and healthcare. Requirements were analysed at multiple levels (i.e., business, functional, and non-functional requirements) along with the existing Testing Capabilities to develop a shared conceptualization for ebusiness test beds. The comparison between the existing ebusiness Testing Capabilities and the GITB requirements revealed a set of functional and non-functional gaps. The assessment of these gaps demonstrated that a shared, operational test bed is desirable and feasible to complement ebusiness standards development efforts.

The aim of the second phase is to further conceptualize and elaborate the suggested approaches to architecting and implementing GITB. It is an objective of phase 2 to assess interest from international stakeholders in supporting GITB and to realize the test bed system as a network of test services at a global scale. Some ideas on how the architecture and cooperation model might look were already developed in phase 1, but further analysis is needed to come up with a recommended architecture and a clear view on how to implement the test bed in phase 3.

Table 1-1: The Three Phases of the GITB Project

Phase 1: Feasibility study
   An analysis of the benefits, risks, tasks, requirements and required resources for a GITB, based on business use cases; current state of ebusiness testing facilities.

Phase 2: Conceptualization of the target architecture
   Analysis of alternative approaches to architecting and implementing a GITB. A recommended architecture and process to implement the test bed that follows from the requirements and architectural analysis, with clear rationale. Assessment of requirements from international stakeholders.

Phase 3: Realization
   Implementation of test beds as a shared testing facility. Provisioning of testing services to industry users, software vendors and SDOs.

Key deliverables of phase two are:

- Interim report (by 30 June 2011) covering: (1) Synthesis of alternative architectural solutions; (2) Outline of the target architecture; (3) Validation of the target architecture and approach based on the use cases
- Final report (by 31 December 2011) covering: (4) Final version of the recommended architecture and process to develop the test bed; (5) A global cooperation model based on the expressed intent of support of the key international organizations; (6) The key components of GITB; (7) The outline of a testing methodology

How to Read This Report

This draft CWA presents the recommended GITB architecture and applies the generic architecture to three use cases. In order to improve readability, the report is structured in four main parts addressing different target groups and their views on the GITB project results. Table 1-2 summarizes the reading guidelines.

Table 1-2: Guidelines on How to Read the Report

Part I: Motivation for ebusiness Testing and Synthesis of Architecture Solutions (Chapters 3 to 4, pp. 19-29)
   Content: Why does ebusiness testing matter? (Motivation for ebusiness Testing.) How does GITB envision ebusiness testing? (GITB Architecture Vision: Alternative Architecture Solutions; General Directions for a Testing Framework.)
   Target audience: ebusiness users, standard development organizations, industry consortia, testing experts and all other stakeholders interested in the general motivation for GITB and an overview of the proposed solution.

Part II: GITB Testing Framework and Architecture (Chapters 5 to 10)
   Content: What is the suggested GITB testing framework and architecture? (GITB Testing Framework; Test Artifacts; Test Services; Test Bed Components; Test Registry and Repository; Testing Scenarios.)
   Target audience: Testing experts and architects interested in the detailed test bed architecture.

Part III: Validation based on Use Cases from the Automotive Industry, Healthcare and Public Procurement (Chapters 11 to 14)
   Content: How to use GITB for ebusiness testing? (Approach; Automotive Industry; Healthcare and Public Procurement.)
   Target audience: ebusiness users, standard development organizations and industry consortia interested in applying the test bed architecture to their ebusiness scenarios.

Part IV: Test Bed Governance and Deployment Process (Chapters 15 to 16)
   Content: How to set up GITB? (Test Bed Governance; Test Bed Development Process; Global Cooperation Model.) These three topics are yet to be discussed in detail in the GITB Phase 2 project and are therefore only mentioned in a preliminary form in this report.
   Target audience: All stakeholders interested in GITB development and setup.

2 Definitions and Abbreviations

2.1 Definitions

The following definitions are intended to address the most commonly recurring terms about testing in this report. They are general definitions that may be refined in later sections of the document. Other terms relating to specific areas (e.g. architecture, artifacts) will be listed in the related sections. For the purpose of the present document, the terms and definitions given in ISO/IEC 9646-1:1994, Information technology -- Open Systems Interconnection -- Conformance testing methodology and framework -- Part 1: General concepts, apply.

Most of the definitions below are capitalized, even when involving common terms, e.g. Test Bed. When the capitalized version is used in this document, it should be understood as having the particular meaning defined in this section (or as defined in a further section), usually more precise or specific to the GITB context than the common meaning for the domain.

ebusiness Specifications (see Section 3.3)

ebusiness Specification: An ebusiness Specification is any agreement or mode of operation that needs to be in place between two or more partners in order to conduct ebusiness transactions. An ebusiness Specification is associated with one or more of three different layers in the ebusiness interoperability stack: the transport and communication (Messaging) layer, the business document layer, and the business process layer. In many situations, an ebusiness Specification comprises a set of standards or a profile of these.

Profile (of ebusiness Specifications): A Profile represents an agreed-upon subset or interpretation of one or more ebusiness Specifications, intended to achieve interoperability while adapting to the specific needs of a user community.

Business Process: A Business Process is a flow of related, structured activities or tasks that fulfill a specific service or business function (serve a particular goal). It can be visualized with a flowchart as a sequence of activities. The term also includes the resulting exchanges between business partners, which is also named the public process. The public process abstracts from the backend processes driving these exchanges.

Business Document: A Business Document is a set of structured information that is relevant to conducting business, e.g., an order or an invoice. Business documents may be exchanged in paper format or electronically, e.g. in the form of XML or EDI messages.

Testing Purposes (see Section 3.4)

System Under Test (SUT): An implementation of one or more ebusiness Specifications, being part of an ebusiness system which is to be evaluated by testing.

Conformance Testing: The process of verifying that an implementation of a specification (SUT) fulfills the requirements of this specification, or of a subset of these in case of a particular conformance profile or level. Conformance testing is usually realized by a Test Bed connected to the SUT. The Test Bed simulates ebusiness protocol processes and artifacts against the SUT, and is generally driven by means of test scripts.

Interoperability Testing: A process for verifying that several SUTs can interoperate at one or more layers of the ebusiness interoperability stack (see ebusiness Specification) while conforming to one or more ebusiness Specifications. This type of testing is executed by operating SUTs and capturing their exchanges. The logistics of interoperability testing are usually more costly (time, coordination, set-up, human effort) than those of Conformance Testing. Conformance does not guarantee interoperability, and interoperability testing is no substitute for a conformance test suite. Experience shows that interoperability testing is more successful and less costly when conformance of the implementations has been tested first. The interoperability test process can also be piloted by a Test Bed, using test scripts as in conformance testing.

Testing Requirements (see Chapter 11)

Verification Scope: The verification scope specifies the subject of testing. It answers the question: What type of concern to test for? A type of concern is defined by (1) a specific aspect or quality of the SUT to be assessed and (2) an ebusiness Specification or profile.

Operational Testing Requirements: An operational testing requirement specifies the concerns of defining, obtaining, and validating test items within a specific testing environment. It answers the question: What is the specific testing environment?

Testing Roles (see Section 5.2)

Test Designer: A testing engineer developing Test Suites, Test Cases and Document Assertions (see Section 5.2). This includes interpreting the B2B specification requirements, and understanding or writing Test Assertions, if any, in order to derive Test Cases from these.

Test Manager: A role responsible for executing Test Suites or for facilitating their execution, including related organizational tasks such as coordination with Test Participants.

Test Participant: The owner or operator of an SUT, typically the end-user, an integrator or a software vendor. This role defines the Verification Scope and Testing Requirements.

[Test Bed, Testing Capability, Test Service] Provider: A general role that applies to anyone in a position to offer some testing resource: a hosting capability (Test Bed), a Testing Capability (e.g. HL7 conformance), test design expertise or test facilitation (e.g. a facilitator in organizing an interoperability testing session).

Testing Framework and Architecture (see Chapters 5 to 8)

Document Assertions Set (see Section 8.2): A Document Assertions Set (DAS) is a package of artifacts used to validate a Business Document, typically including one or more of the following: a schema (XML), consistency rules, codelists, etc. These artifacts are generally machine-processable.

Document Validator (see Section 8.2): A processor (a software application) that can verify some aspects of document requirements, i.e. some validation assertions about a document, such as an XML schema or some consistency rules. A Document Validator may be specialized for some type of validation assertion (e.g. XML schema validation, or semantic rules).

GITB Testing Framework: See "Testing Framework".

Legacy Test Bed (see Section 5.3): An existing Test Bed that has been developed prior to the GITB recommendations. A Legacy Test Bed can be made GITB-service compliant by extending it with a subset of the Service interfaces described in this report.
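To make the Document Validator concept concrete, the following is a minimal illustrative sketch in Python; the CWA prescribes neither an implementation language nor an API, and the lxml dependency and all names below are assumptions. It checks a single kind of validation assertion, XML schema validity, and returns findings that a Test Bed could fold into a Test Report.

```python
# Illustrative Document Validator sketch (hypothetical, not defined by the
# CWA): checks one validation assertion from a Document Assertion Set --
# XML schema validity -- and returns findings.
from lxml import etree

def validate_document(xml_bytes: bytes, xsd_path: str) -> list[str]:
    """Validate a business document against the XML schema of a DAS."""
    schema = etree.XMLSchema(etree.parse(xsd_path))
    try:
        doc = etree.fromstring(xml_bytes)
    except etree.XMLSyntaxError as err:
        return [f"not well-formed: {err}"]
    if schema.validate(doc):
        return []  # no findings: the document satisfies this assertion
    return [f"line {e.line}: {e.message}" for e in schema.error_log]

# Usage: an empty list of findings means the document passed.
findings = validate_document(b"<Order/>", "order.xsd")
print(findings or "document is schema-valid")
```

A full Document Validator would chain several such checks (schema, code lists, consistency rules), one per artifact in the Document Assertion Set.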

Test Agent: A processor, either a simple software testing application or a complete Test Bed, that plays a secondary role in the execution of a Test Suite, i.e. it interacts either with a Test Bed or with a Web browser for the purpose of assisting Test Suite execution. A Test Agent may simulate one party in the execution of a test suite (e.g. send messages to an SUT or wait for messages from the SUT), or may be specialized for the execution of some Test Case, or for executing a Document Validator. A Test Agent may be either one or both of:
(a) an interacting [Test] Agent, if it is able to directly interact with an SUT, e.g. to execute parts of the Test Suite (e.g. it simulates a business party in some Test Case);
(b) a validating [Test] Agent, if it is able to verify conformance of some Test Items to an ebusiness Specification or to a profile.

Test Artifact: Test Artifacts are machine-readable documents (e.g. formatted in XML). These documents may represent various data objects used as input or output of Test Beds, e.g. component configurations, test suite scripts, test reports, test logs. See Chapter 5.

Test Assertion (cf. the OASIS Test Assertion Guidelines definition [TAG]; see Section 6.4.4): A test assertion is a testable or measurable expression, usually in plain text or with a semi-formal representation, for evaluating the adherence of an implementation (or part of it) to a normative statement in a specification. Test assertions generally provide a starting point for writing a conformance test suite or an interoperability test suite for a specification.

Test Bed: An actual test execution environment for Test Suites or Test Services. In the context of this document, this generic term applies by default to various operational combinations of components provided by or developed according to the (GITB) Testing Framework.

Test Bed Architecture: A particular combination of components, and relationships among the components, in a software system designed based on Testing Framework resources and definitions and intended to perform testing operations in accordance with use case requirements.

Test Bed Component (see Section 8.1): A component of a Test Bed platform that executes a generic function independent from any ebusiness Specification: either a core Test Bed platform component (performing an internal Test Bed function, e.g. test suite deployment), a user-facing component (e.g. a Test Suite editor), or a plug-in component (a Testing Capability such as a Document Validator).

Test Case (see Section 6.4.2): (A kind of Test Artifact.) A Test Case is an executable unit of verification and/or of interaction with an SUT, corresponding to a particular testing requirement, as identified in an ebusiness Specification.

Test Description Language: In the ebusiness domain, a Test Description Language (TDL) is a high-level computational language capable of expressing Test Case and Test Suite execution logic and semantics.

Test Execution Log (see Section 6.5.1): (A kind of Test Artifact.) A message capture or other trace of observable behavior that results from SUT activity. It is a collection of Test Items, subject to further verification or analysis.

Testing Capability (see Section 8.3): A test processing module that is implemented as a plug-in Test Bed component or as a remote Test Agent. The role of such a module is either to perform some form of validation (e.g. a Test Suite Engine, a Document Validator) or to perform some activity in support of such validation (e.g. a Messaging Adapter). A Testing Capability may be added to or removed from a Test Bed depending on the testing needs, without modifying the code of the Test Bed, but instead via a configuration operation.
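The configuration-driven plug-in idea behind Testing Capabilities can be sketched as follows. This is a hypothetical Python shape, not an interface defined by this report: capabilities are looked up from a configuration map of entry points, so enabling or disabling one is a configuration change rather than a code change.

```python
# Hypothetical sketch of Testing Capabilities as configurable plug-ins.
from abc import ABC, abstractmethod
from importlib import import_module

class TestingCapability(ABC):
    """Common plug-in contract; a Messaging Adapter, Document Validator
    or Test Suite Engine would each implement this."""
    @abstractmethod
    def handle(self, request: dict) -> dict: ...

class TestBed:
    def __init__(self, config: dict[str, str]):
        # config maps capability names to "module:Class" entry points,
        # e.g. {"validation": "validators.xsd:XsdValidator"} (hypothetical)
        self.capabilities = {}
        for name, entry in config.items():
            module, cls = entry.split(":")
            self.capabilities[name] = getattr(import_module(module), cls)()

    def invoke(self, name: str, request: dict) -> dict:
        # Dispatch to whichever capability the configuration enabled.
        return self.capabilities[name].handle(request)
```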

Testing Resource: A generic term designating any artifact (Test Artifact), part of a Test Bed (e.g. a Testing Capability component), or a combination of these, available to an authorized user depending on his/her role.

Testing Framework (or GITB Testing Framework; see Chapter 5): Architecture, methodology and guidelines for assisting in the creation, use and coordination of Test Beds, including:
- (a) Test Artifacts design and standards (e.g. for Test Suites, Test Reports),
- (b) Test Services definitions and interfaces for supporting end-user testing activities,
- (c) Test Bed components and their integration,
- (d) a Test Registry and Repository for managing, archiving and sharing various testing resources.

Test Item: A unit of data to be verified, e.g. a document, a message envelope, an XML fragment. In the B2B or ebusiness environment, a Test Item can be a message instance, an event, or a status report that is obtained from an SUT for the purpose of assessing conformance or interoperability of the SUT (see Conformance Testing, Interoperability Testing).

Test Step: A unit of test operation(s) that translates into a controllable, separate unit of test execution.

Test Suite (see Section 6.4.3): (A kind of Test Artifact.) A Test Suite defines a workflow of Test Case executions and/or Document Validator executions, with the intent of verifying one or more SUTs against one or more ebusiness Specifications, either for conformance or for interoperability.

Test Suite Engine (see Section 8.3.3): A Test Suite Engine (or "Test Suite Driver") is a processor that can execute a Test Suite, or that has control of the Test Suite main process execution in case it delegates part of the execution (e.g. some Test Cases or some validation tasks) to specialized Test Agents or to a Document Validator.

Test Report (see Section 6.5.2): (A kind of Test Artifact.) The result of verifying the behavior or output of one or more SUT(s), or of verifying Test Items such as business documents. It makes a conformance or interoperability assessment (see Conformance Testing, Interoperability Testing). It is generally intended for human readers, although possibly after some rendering, e.g. HTML rendering in a browser or after an XML-to-HTML translation.
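To tie the artifact vocabulary together, the following is an illustrative data model of a few artifact kinds sharing a common header. The CWA standardizes the artifacts themselves as machine-readable XML documents (see Chapter 6); the Python shape and all field names below are assumptions for illustration only.

```python
# Illustrative data model of Test Artifacts (all field names hypothetical).
from dataclasses import dataclass, field

@dataclass
class ArtifactHeader:              # a standard header shared by artifacts
    artifact_id: str
    version: str
    ebusiness_spec: str            # the specification or profile under test

@dataclass
class TestSuite:                   # a test logic artifact
    header: ArtifactHeader
    test_cases: list[str] = field(default_factory=list)  # workflow of cases

@dataclass
class TestExecutionLog:            # a test output artifact: captured traces
    header: ArtifactHeader
    test_items: list[bytes] = field(default_factory=list)

@dataclass
class TestReport:                  # a test output artifact: the assessment
    header: ArtifactHeader
    verdict: str = "undetermined"  # e.g. a conformance pass/fail assessment
```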

2.2 Abbreviations

AIAG     Automotive Industry Action Group
B2B      Business-to-Business
B2C      Business-to-Consumer
B2G      Business-to-Government
CDA      Clinical Document Architecture
DAS      Document Assertion Set
eac      ebxml Asian Committee
ebbp     ebxml Business Process
ebif     ebusiness Interoperability Forum
EDI      Electronic Data Interchange
GITB     Global ebusiness Interoperability Test Bed
GUI      Graphical User Interface
HL7      Health Level Seven
HTML     Hypertext Markup Language
IDE      Integrated Development Environment
IHE      Integrating the Healthcare Enterprise
MOSS     Material Off-Shore Sourcing
NHIS     National Health Information System
PEPPOL   Pan-European Public Procurement OnLine
SDO      Standards Development Organization
SOAP     Simple Object Access Protocol
SUT      System Under Test
TAG      Test Assertion Guidelines
TAPM     Test Artifacts Persistence Manager
TRR      Test Registry and Repository
XML      Extensible Markup Language

Part I: Motivation for ebusiness Testing and Synthesis of Architecture Solutions

(Target Group: ebusiness users, standard development organizations, industry consortia, testing experts and all other stakeholders)

3 Motivation

3.1 Overview

The work on GITB is motivated by the increasing need to support testing of ebusiness scenarios across different world regions. By ebusiness, we denote electronic business processes which are conducted across organizational boundaries over computer-mediated networks. This covers business processes between companies ("B2B", business-to-business), between companies and consumers ("B2C", business-to-consumer), or with governments ("B2G", business-to-government).

While ebusiness Specifications are implemented and adopted at a global level, interoperability has become a major concern. Consequently, organizations from private and public sectors as well as technology and software providers are engaged in cooperation for the development of vertical industry standards. However, it is still cumbersome for software vendors and end-users to demonstrate full compliance with the specified standards and to achieve interoperability of the implementations [3]. This is due to a number of factors:

(1) Many standards development organizations (SDOs) and industry consortia are only in the process of conceptualizing how they will ensure interoperability of standards implementations. They are unsure how to provide adequate testing and certification services.

(2) ebusiness interoperability typically requires that a full set of standards, from open internet and Web Services standards to industry-level specifications and ebusiness frameworks, is implemented. We denote this set of standards, which underlies the electronic business relationship, as ebusiness Specifications.

(3) As of today, there are only limited and scattered Test Beds. If Test Beds are provided by one of the standards developing organizations, they have a rather narrow focus on a particular standard. In particular, they might not encompass testing the entire set of relevant ebusiness Specifications from a company perspective, i.e. a profile, and interactions in more complex business processes with several partners.

The following section outlines the demand for ebusiness testing from the perspective of the relevant stakeholders.

[3] ebusiness W@tch Report on e-business Interoperability and Standards: A Cross-Sector Perspective and Outlook.

3.2 Stakeholders and their Interests in ebusiness Testing

The relevant stakeholders in ebusiness testing comprise end-users from private and public sectors, industry consortia and SDOs, technology and software vendors, testing laboratories, as well as public authorities and governments.

End-users comprise all organizations from private and public sectors which implement ebusiness scenarios. Their ultimate goal is to increase the efficiency and effectiveness of their organizations and to keep up to date with solutions for enhanced customer experiences. ebusiness testing is of interest for them as they: (1) realize the benefits of ebusiness solutions more quickly, with fewer project risks, and (2) avoid costs implied by investments in low-quality, non-interoperable standards. For end-users, the lack of ebusiness testing has negative impacts on project duration for onboarding business partners and is one of the root causes of significant B2B integration costs. While the ability of an enterprise to quickly add new business partners is a key factor in determining the level of its business agility, most companies need 3 to 10 days or more to on-board new business partners [4]. The most negative effects of a lack of testing, however, are errors that occur in productive ebusiness scenarios, i.e. when supply chain operations are slowed down or customer requirements cannot be fulfilled as planned.

[4] Forrester Research Inc. (2009): The Value of a Comprehensive Integration Solution, Forrester Research Inc., Cambridge.

Industry consortia and formal SDOs are communities of end-users, public authorities and other interested parties that act to achieve the following objectives:
(1) Maintain a cohesive community acting on a key set of industry issues, leading to industry-driven, voluntary standards development;
(2) Develop high-quality, timely industry standards specifications in support of the industry needs;
(3) Effect efficient implementation of the developed standards by the vendors to provide a rational basis for the standards assessment;
(4) Enable straightforward and effective approaches for standards implementation assessment, piloting, and eventual deployment.

For industry consortia, the lack of testing increases the risk that implementations of the specified standards are not interoperable. Without testing, standard specifications will be difficult to implement and their implementations will be of low quality, resulting in a lack of interoperability.

Software vendors act to achieve the following objectives:
(1) Develop enterprise applications that are standards-compliant, and
(2) Effectively support their client base by achieving functional and interoperable business solutions.

Software application vendors are struggling with the sheer number and complexity of standards, as well as the low quality of ebusiness Specifications with regard to their consistency. Missing implementation guidelines and missing Testing Capabilities increase their implementation efforts

and the risks that their software applications are not conforming to ebusiness Specifications and/or are not interoperable with other implementations.

Testing laboratories act to achieve the following objectives:
(1) Increase efficiency and reliability of interoperable implementation of standards;
(2) Assure the unbiased and objective nature of the standards implementation assessment process.

From the perspective of national governments and the European Union, lacking interoperability and poor-quality standards harm innovation and competition, burn investments, and drain the growth potential of markets. In their current efforts to modernize the EU ICT standardization policy, the European Commission states the following policy goals:
- To provide industry, including SMEs, with high-quality ICT standards in a timely manner to ensure competitiveness in the global market while responding to societal expectations;
- To increase the quality, coherence and consistency of ICT standards, and
- To provide active support to the implementation of ICT standards.

ebusiness testing provides the necessary means to achieve these goals, as it contributes to solving quality issues in standards development and addresses implementation issues which currently hamper the adoption of ebusiness standards. Consequently, ebusiness testing needs to be a cornerstone of EU ICT standardization policy.

3.3 ebusiness Specifications

The term ebusiness refers to doing business electronically and involves a broad scope of electronically-enabled activities among different individuals and organizations (including Business-to-Business, Business-to-Consumer and Business-to-Public Sector). Organizations are increasingly using information and communication technologies to link their business processes and heavily rely on electronic information exchange with their business partners. Numerous standards have been developed, and some are still under development, to address the various layers in the ebusiness interoperability stack [5][6] (Table 3-1):

1. Transport and Communication (Messaging) Layer: How do organizations communicate electronically? This layer addresses technical interoperability. Relevant specifications cover the range from transport and communication layer protocols like HTTP to higher-level messaging protocols such as the Simple Object Access Protocol (SOAP) or ebxml Messaging. Furthermore, security, reliability and other quality-of-service protocols and extensions over the transport and communication protocols are also considered in this layer.

2. Business Document Layer: What type of information do the ebusiness systems exchange?

[5] CEN ISSS: ebusiness ROADMAP addressing key ebusiness standards issues.
[6] Legner, C.; Vogel, T. (2008): Leveraging Web Services for Implementing Vertical Industry Standards: A Model for Service-Based Interoperability, in: Electronic Markets, 18, 1, 2008, pp.

This layer addresses semantic interoperability and specifies the form and content of business documents which are exchanged electronically. Specifications may relate to:
- Document structure, i.e. the definition of the document syntax (e.g. XML), the naming and design rules (e.g. rules for a generic business document structure, as specified by the OAGIS BOD architecture) and the assembly of the document (e.g. rules for the assembly of business documents, as defined by the OAGIS BOD architecture);
- Document semantics, i.e. the definition of documents and fields (e.g. an XML document definition) and their meaning, including references to external code lists, taxonomies and vocabularies (UN/CEFACT Core Component Library, UBL Component Library), and
- Business rules that define restrictions or constraints among data element values.

3. Business Process Layer: How do the organizations interact? Business processes address organizational interoperability. Specifications at this level describe how business processes are organized across organizational boundaries. The business process layer, either presented in a formal business process specification standard such as the ebxml Business Process Specification Schema (BPSS) or with an informal workflow definition like flowcharts or interaction diagrams, provides a message choreography, exception flows (error handling) and other business rules for the ebusiness application roles participating in the process.

In addition to these layers, an ebusiness Specification may rely on profiles which define cross-layer dependencies and further restrictions on the single layers.

Table 3-1: ebusiness Specifications

1. Business Process Layer: describes the different actors, their roles and interactions, as well as the information flow
   Cross-organizational process flow
      - Actors and roles. Machine-readable representations: text descriptions of roles, UML use case diagrams
      - Activities and interactions. Machine-readable representations: flow charts, text, UML activity diagrams, UN/CEFACT UMM diagrams, BPSS, (BPEL)
   Message choreography
      - Sequence of messages. Machine-readable representations: flow charts, UML sequence diagrams, BPMN diagrams, WS-Choreography Language, BPSS

2. Business Document Layer: addresses the structure and content of business documents which are exchanged electronically
   Content / Semantics
      - Document definition (XML message (XSD), EDIFACT document). Machine-readable representations: XML schema (XSD), EDI
      - Dictionary / vocabulary (external code lists, UN/CEFACT Core Component Library (CCL), RosettaNet Business Dictionary, UBL Component Library, HR-XML Core Components, other dictionaries). Machine-readable representations: data models, ontologies
      - Consistency (consistency rules for cross-message validation). Machine-readable representations: rules, data models
   Structure / Syntax
      - Document assembly: UN/CEFACT Core Component Message Assembly (CCMA), OAGIS BOD Architecture
      - Naming & design rules (NDR): UN/CEFACT XML NDR, OAGIS NDR
      - Header definition: UN/CEFACT Standardized Business Document Header (SBDH)
      - Document syntax: XML, EDI guidelines

3. Transport and Communication Layer: covers the range from transport and communication to higher-level messaging protocols
   - Messaging protocols: ebxml Messaging (ebms), SOAP, EDIFACT ANSI, RosettaNet Implementation Framework (RNIF)
   - Transport & communication: HTTP, TCP/IP, OFTP, FTP, X.400
   - Others (security, etc.): WS-Security, WS-Reliability, WS-Policy
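The "Consistency" row of Table 3-1 refers to business rules that constrain data element values across a document, which schema validation alone cannot express. A hypothetical example of such a rule, written as an executable check purely for illustration:

```python
# Hypothetical document-layer business rule: the invoice total must equal
# the sum of its line amounts (a constraint an XML schema cannot express).
def check_invoice_total(invoice: dict) -> list[str]:
    expected = sum(line["amount"] for line in invoice["lines"])
    if invoice["total"] != expected:
        return [f"total {invoice['total']} != sum of lines {expected}"]
    return []

print(check_invoice_total(
    {"total": 30, "lines": [{"amount": 10}, {"amount": 20}]}))  # -> []
```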

3.4 ebusiness Testing - Conformance and Interoperability Testing

From a general perspective, two types of testing are relevant in the context of ebusiness:
- Conformance testing involves checking whether ebusiness implementations conform to the ebusiness Specifications, so that they can interoperate with other conformant systems as prescribed by the specification.
- Interoperability testing is testing to ensure that ebusiness implementations will actually be able to intercommunicate.

Experience shows that only through conformance and interoperability testing can correct information exchange among ebusiness implementations be guaranteed and software implementations be certified. Conformance testing is no substitute for interoperability testing, and vice versa.

Experience also shows that the type and quality of the ebusiness Specifications impact whether conformance and interoperability testing can easily be performed. If ebusiness Specifications comprise substantial text descriptions, with some flow charts or diagrams, these narrative or semi-formal representations often leave many degrees of freedom for interpretation to the users. The effort to prepare test scripts and test cases is then much higher than in the case of an ebusiness Specification which comprises machine-readable representations, such as XML schemas, code lists, data models or formal representations (e.g. in the Web Services Description Language, or in ebxml Business Process (ebbp); some examples of machine-readable specifications are depicted in the right-hand entries of Table 3-1).

As of today, the existing testing tools, test suites and testing committees individually address a specific standard or one of the above layers. However, conformance and interoperability testing require integrated testing frameworks which do not hard-code a specific standard at any layer (because different communities may use different standards) and are capable of handling testing activities at all layers of the interoperability stack.

Testing Context

ebusiness testing is performed in different contexts, with different business rationales and stakeholders:

(1) Standardization, initiated by a standards development organization (SDO) or an industry consortium (Figure 3-1): An SDO or industry consortium develops an ebusiness Specification (a standard or profile) and deploys it to the community of users, software vendors etc. In this case, testing occurs during standard development (in order to test conformance with other specifications, such as Naming and Design Rules) for quality assurance of the developed ebusiness Specifications. Testing also occurs during standard deployment to ensure the quality and the interoperability of the implementations. Testing may lead to certification of software or of productive implementations.

Figure 3-1: Testing Context - Standardization

(2) Onboarding of new business partners, initiated by a user company (Figure 3-2): In this case, a company defines ebusiness Specifications and imposes their implementation on all business partners. Testing is performed as part of the so-called onboarding process of partners.

Figure 3-2: Testing Scenario for Onboarding

Table 3-2 illustrates the efficiency gains resulting from automated interoperability testing compared to manual interoperability testing. It is based on experience from testing ebms 2.0 implementations conducted among eac (ebxml Asian Committee) members.

Table 3-2: Comparison of Manual vs. Automated Testing (Source: KorBIT)

- Average time to process a test case (including debugging): manual 0.5 hour; automated 5 minutes
- Number of test cases used per paired parties: 25
- Average testing time per paired parties: manual 0.5 hour * 25 = 12.5 hours; automated 5 minutes * 25 ≈ 2 hours
- Total testing time: manual 12.5 hours * 20 = 250 hours; automated 2 hours * 20 = 40 hours
- Resulting efficiency gain: automated testing needs only 16% of the man-hours of manual testing

Assumptions:
1. SUT: ebms 2.0 implementations.
2. Number of SUTs: 5.
3. The manual testing was conducted among eac (ebxml Asian Committee) members. The number of pairs is 5 * 4 = 20, since each party within a pair may be either sender or receiver, depending on which party initiates the transaction first.
4. The software engineers from each party conducted testing remotely.
5. The number of test cases used is 25.

(The table's arithmetic is reproduced in a short sketch at the end of this chapter.)

3.5 Benefits of a Global ebusiness Interoperability Test Bed

To summarize, without ebusiness testing the potential of standard setting is not fully exploited, and the widespread adoption of ebusiness standards will not be possible. Hence, the rationale for GITB can be summarized as follows:
(1) Efficient deployment of resources in ebusiness implementation projects (fewer resources will be drained by low-quality, conflicting or fragmented standards),
(2) Higher quality of ebusiness standards and mitigation of systemic risks in the ebusiness community, and
(3) Improvement of the ebusiness standards development and diffusion process. More attention to testing, visibility of outcomes and feedback from testing to industry consortia and SDOs will imply increased attention to the quality of standards and their implementation, and to a crisp boundary between commons (standards as public resources) and proprietary assets.
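The arithmetic behind Table 3-2, reproduced as a quick check (illustrative only; all figures are taken from the table and its assumptions):

```python
# Reproducing the arithmetic of Table 3-2: 5 ebms 2.0 SUTs, 20 ordered
# sender/receiver pairs, 25 test cases per pair.
pairs, cases = 5 * 4, 25

manual_total = 0.5 * cases * pairs          # 0.5 h per case -> 250 hours
automated_total = (5 / 60) * cases * pairs  # 5 min per case -> ~41.7 hours

print(manual_total, automated_total, f"{automated_total / manual_total:.0%}")
# The table rounds a pair's automated time (125 minutes) down to 2 hours,
# giving 40 hours in total and the quoted 16% of the manual man-hours.
```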

4 GITB Architecture Vision: Alternative Architecture Solutions and General Directions for a Testing Framework

This section introduces the objectives and principles for GITB, as well as the architecture vision and the general directions underlying the testing framework. These objectives and principles have been considered appropriate by the partners of the GITB project, based on:

(1) Extensive experience in designing, developing and using conformance and interoperability Test Beds (e.g. design, development and operation of the NIST B2B Test Bed, KorBIT, SRDC Ltd's TestBATN, the WS-I test suites and others) that are used today by various communities. These pre-existing works have been analyzed and described in Phase 1.

(2) A detailed analysis of the ebusiness use cases and requirements from Phase 1, augmented by further operational requirements gathered from the selected set of three use cases in Phase 2 of the project.

4.1 Objectives and Principles

The following objectives and principles should be met by the GITB architecture, the underlying testing framework and the Test Beds:

ebusiness testing requirements: In view of the increasing number of ebusiness Specifications that are implemented and adopted at a global level, testing has to address all interoperability layers (i.e. business processes and choreography, business documents, transport and communication protocols) as well as profiles of them.

Testing Anywhere, Anytime: Interoperability and conformance testing should not be restricted in time and place. Software vendors and end-users should be able to test their implementations over the Web anytime, anywhere and with any parties willing to do so. Interoperability testing is expected to be repeated on a regular basis, as B2B networks and systems evolve continuously due to upgrades of ebusiness Specifications, changing business communities, and changing business requirements.

Reduction of Time Spent in Testing: Considering the number of test cases needed to cover the conformance or interoperability testing requirements of ebusiness Specifications, the time spent by participants during the testing process should be significantly reduced.

Ease of Design and Use: The Test Bed will aim at a low cost of entry for its users and hence provide a graphical environment where a test designer can assemble reusable test cases for conformance and interoperability testing.

Independence of Test Bed design from the ebusiness Specifications: Hard-coded test logic in one-off Test Bed implementations is not desirable, due to opacity, maintenance difficulties, and non-reusable skills and platforms. The Test Bed design(s) have to be independent from the ebusiness Specifications to be tested for.

Modularity: Current ebusiness Specifications specify a variety of messaging protocols, business document formats and choreographies. In order to support all of these and test them, the Test Bed should be adaptable and modular. Therefore, it is necessary to define interfaces for several layers and to facilitate plug-in modules, supporting different protocols or formats, that implement the specified interfaces.

Reuse of existing Test Beds and Test Suites: A service approach allows for reuse and leverage of existing Test Beds (legacy or not) and Test Suites.

Flexibility in architecture: The testing framework should allow for flexibility in Test Bed architecture designs. It may be instantiated, e.g., as a centralized Test Bed or as a distributed Test Bed using a service-oriented approach.

Distributed testing: A (distributed) test suite engineering platform allows for creating test suites from existing components, assembling and deriving test cases from existing ones, and reusing similar design patterns. Standardized and innovative testing methodologies will ensure the successful development of testing of comprehensive ebusiness Specifications and profiles.

Full Automation of the Test Process: Partly automating the test process, by providing different tools for certain layers of the interoperability stack and doing the rest manually, results in labor-intensive, error-prone and costly test processes. Automation of testing also implies non-interference with the system in its native state of execution. GITB aims to provide a holistic approach to ebusiness interoperability testing by integrating configuration management and other preliminary test steps into the testing process.

4.2 Synthesis of Architecture Solutions

In Phase 1, two types of architectural solutions were proposed. The first is the Single Test Bed Architecture with a single Test Description Language (TDL) and Execution Model, where only one central Test Bed is envisaged. The second is the Network of Multiple Test Beds and Test Services, which allows the coordination of multiple Test Beds through predefined interfaces. While these architectures represent two technically sound approaches, neither of the two identified alternatives is satisfying on its own from a practical viewpoint; rather, a combination of both is needed:

- The Single Test Bed Architecture with a single Test Description Language (TDL) and Execution Model would exclude legacy Test Beds that do not comply with its detailed internal design. Extensive reengineering would be required before an interoperable solution is achieved. GITB aims at promoting reuse and leveraging existing Testing Capabilities and legacy Test Beds.

- On the other hand, the Network of Multiple Test Beds and Test Services solution has some limits and scalability issues in the long run if too many aspects of the internals of a Test Bed remain unspecified. Standardizing on component interfaces and on test artifacts, in addition to services, is essential in order to assist and encourage future Test Bed implementers to design reusable components and portable test artifacts. Not doing so would in the long term thwart the scalability of test repositories, test portability and reusability, and the ease of future development. GITB aims at defining some level of standardization in order to facilitate scalability, portability and interoperability of Testing Capabilities and Test Beds.

Basically, these architectures can be combined and synthesized as follows: from the Network of Multiple Test Beds and Test Services perspective, a Single Test Bed Architecture with a single Test Description Language (TDL) and an Execution Model is just another Test Bed in the network of Test Beds. Therefore, if the Test Service Interface and Test Reporting Format

features are embedded into the Single Test Bed Architecture with a single Test Description Language (TDL) and an Execution Model, these architectures become harmonized. This is clarified in more detail in Figure 4-1.

Figure 4-1: Harmonization of the Architectures

As shown in Figure 4-1, if the Test Service Interface and the Test Reporting Format are supported by the Single Test Bed with a single TDL and Execution Model architecture style, it can be used in any network having these interfaces (a sketch of such a service facade is given at the end of this chapter). Therefore, both of these architectural approaches are supported by the proposed GITB Testing Framework.

4.3 Implications for GITB: Focus on a Testing Framework

A Testing Framework is the main object of this design work. Such a framework does not impose a particular Test Bed architecture, but defines several aspects of Test Beds by specifying:
- Functional components across ebusiness Test Beds,
- A standardized design for test artifacts,
- Test services to be made available to test participants as well as to Test Bed components,
- Methodologies and best practices for designing test artifacts, using test services and assembling functional components into Test Beds.

From this testing framework, some variants of Test Bed architectures (e.g. centralized or distributed) are allowed and presented. The objectives in focusing on the definition of a Testing Framework, as opposed to defining a specific Test Bed design, are:
- To promote reuse of functional components across ebusiness Test Beds while allowing variability in Test Bed architectural options,
- To allow for the portability and reuse of test artifacts across Test Beds by defining some level of standardization of these, and by facilitating their archival and discovery,
- To ensure the use of common design concepts across Test Beds, thus promoting a common understanding across ebusiness communities, and the same governance options,

- To define a general methodology and best practices related to all of the above, so that a common set of skills in designing tests and operating them may be shared and applied across ebusiness disciplines.

Part II: GITB Testing Framework and Architecture (Target Group: Test Bed Experts and Architects)

The GITB is designed as a possibly distributed system of Test Beds and test services. What is proposed is a Testing Framework that enables various Test Bed architectures by focusing on the modularity and reusability of certain Testing Capabilities. In doing so, the proposed framework enables the coordination of multiple collaborating Test Beds in a network of testing resources, offering test services for ebusiness specifications that can be used either directly by test participants, or by other Test Beds.

Chapter 5 provides an overview of the GITB Testing Framework and its main elements. It also introduces testing roles and the rules for GITB-compliant Test Beds. Section 6 first presents the Test Artifacts, which constitute the information model of the GITB Framework. Then, Section 7 introduces the Test Services that operate on these Test Artifacts and that Test Beds should provide for service-based remote access and coordination. That section defines the test services that are provided by Test Beds, for use either by (a) other Test Beds, (b) end-user applications (for integration with a testing environment), or (c) human control, e.g. from a Web browser. Finally, Section 8 describes the main functions and related internal components of a Test Bed, both to operate independently and to cooperate with other Test Beds.

5 GITB Testing Framework

5.1 Overview

GITB is a Testing Framework that allows for a network of Test Beds to share testing resources and capabilities by means of services, yet also recommends an internal Test Bed design that promotes modularity and reuse. Figure 5-1 illustrates an overview of the GITB Testing Framework. At a high level, test coordination and reuse of Testing Capabilities across Test Beds may then be achieved with few additional constraints on their internal design, so that legacy Test Bed architectures may be adapted and integrated as services in a distributed network of testing resources. For further portability and reuse, the Testing Framework defines a modular architecture based on standard Test Bed component interfaces that allow for reusability of certain Testing Capabilities, and an extensible plug-in design. A third tenet of interoperability and reuse across Test Beds is an information model that standardizes, at appropriate levels, the test artifacts to be processed (test cases, test suites, test reports, test configurations, etc.).

The GITB Testing Framework comprises architecture, methodology and guidelines for assisting in the creation, use and coordination of Test Beds, including:

- Test Artifacts design and standards (Section 6). These test artifacts are all kinds of documents processed by a Test Bed:
  (a) test logic documents (test suite definitions, document test assertions),
  (b) test configuration documents (parameters and message bindings for test suites, configuration of messaging adapters), and
  (c) test output documents (test logs and test reports).

- Test Services definitions and interfaces (Section 7). These services are about managing the above test artifacts (design, deploy, archive, search) as well as controlling the major Test Bed functions (test execution and coordination).
- Test Bed Components and their integration (Section 8). These components are functionally defined. They are of three kinds:
  (a) Core Test Bed platform components providing basic features, integration and coordination support, to be found in any GITB-compliant Test Bed,
  (b) Testing Capability components, which directly enable the processing of test suites (e.g. a test suite engine, a document validator) and related tasks (e.g. send/receive messages),
  (c) User-facing components, through which the users interact with the Test Bed for various functions (e.g. test suite design, test execution).
- Test Registry and Repository for managing, archiving and sharing various Test Resources (Section 9). This component is not considered as part of a Test Bed, as it is a Test Resource that can be independently deployed, managed and accessed. It supports the archiving, publishing and sharing of various test artifacts (e.g. test suites to be reused, test reports to be published). It also provides for storing and sharing Testing Capability components to be downloaded when assembling or upgrading a Test Bed (e.g. the latest version of a test suite engine or of a document validator).

Figure 5-1: Overview of the GITB Testing Framework

In this proposed architecture, the GITB Test Bed is perceived by its users (either persons with specific roles or other Test Beds) as a set of Test Services.

The Test Bed itself allows for plug-in Testing Capabilities (for example, test suite engines, specialized validation components, message adapters, etc.). These Testing Capabilities can be supported either by existing (legacy) Test Beds, by remote services or by future test components to be developed. The Test Bed is also a platform on which various Test Suites or Document Assertions can be deployed, i.e. it is not tied to a particular ebusiness standard and its test suites.

The proposed architecture enables the coordination of several Test Beds specialized for the testing of different ebusiness specifications, or for different testing procedures. Some of these Test Beds will be developed according to GITB recommendations, while others are legacy Test Beds that have been augmented with GITB-compliant Service interfaces. Both types of Test Beds can then be integrated in the same network by providing access via similar Service interfaces. These Service interfaces can either be directly accessed by users, e.g. a Test Manager accessing the test services from a public interface (the Web, for instance), or they can be accessed by other Test Beds, e.g. when a Test Suite executing on a Test Bed needs to delegate some document validation to another, specialized Test Bed.

The Testing Capabilities (either provided by local components or by remote services) support the conformance and interoperability testing of any ebusiness Specification. For example, the purpose of a Document Validation capability is to validate a given document according to a set of syntactic or semantic restrictions specified in an ebusiness Specification. Such a capability can be implemented as a local, pluggable (and reusable) component, or as a remote service from another Test Bed. Similarly, a Messaging Adapter capability aims to communicate with the SUTs based on specified transport and communication protocols, and to provide some level of messaging validation. Such a capability can be provided as a component that has been downloaded from a common repository for reuse and local integration, or could also be provided as a remote service (e.g. from a Test Bed or test agent specialized in providing various messaging protocols). Additional Testing Capabilities or services may be added to validate conformance to a specified message choreography and business rules. For each such capability, a common interface will be defined so that any test service provider can implement a test component or service specific to a certain standard and plug the Testing Capability into the GITB Test Bed. In GITB phase 3, further capability types beyond messaging and document validation may be identified, and the architecture may be extended accordingly by the same approach.

This architecture promotes the reusability of testing resources and capabilities among different domains and different standards. As shown in Figure 5-2, a test designer developing a test case for a certain ebusiness profile or standard may need a test service (e.g. the profile may state that in a certain transaction the communication should be performed via ebXML messaging, so ebXML communication with the SUT is needed) which may already have been developed and published by other test designers and test service providers working for another domain or standard. In this way, the GITB Testing Framework leverages the existing, distributed test services related to ebusiness testing and allows users to discover and access them via a test registry and repository.
5.2 Testing Roles within the Testing Framework

The following roles, which generally correspond to different categories of Test Bed users and providers of testing facilities, are identified and supported by the Testing Framework:

- Test Designer: this role involves all tasks related to the creation of a Test Suite or of its parts (test cases, document assertion sets, configuration artifacts). The Test Designer may also be responsible for the creation of the set of Test Assertions from which Test Suites/Cases or Document Assertion Sets will be derived. S/he must have a good understanding of the ebusiness domain and specification(s) addressed by the test suite. S/he must also understand the testing conditions and constraints under which the test suite and Test Bed will be used, and the variability that the test suite must offer with respect to its reuse. The Test Designer is expected to be familiar with the Testing Framework methodology and best practices.

- Test Participant: the owner or operator of an SUT, typically the end-user, an integrator or a software vendor. This role defines the Verification Scope and Testing Requirements. It is generally held by someone responsible for an ebusiness implementation and having business domain expertise.
- Test Manager (or Test Facilitator): a role responsible for executing Test Suites or for facilitating their execution, including related organizational tasks such as coordination with Test Participants. The Test Manager is an expert in Test Suites and in the logistics involved in running tests. S/he generally uses the Test Bed on behalf of the Test Participants, or assists the Test Participant in using the Test Bed, e.g. for configuring and deploying a Test Suite before execution, and for searching/discovering the appropriate test suite in the Test Repository. S/he is also familiar with the Test Suite logic and the related ebusiness domain. Test Participants may act as Test Managers, if they are knowledgeable in testing.
- Test Bed Provider: this role is about operating the Test Bed itself as a server or an application service. It may also extend to the actual development and evolution of the Test Bed from Testing Framework resources and components (as obtained from the Test Repository). The Test Bed Provider is responsible for keeping the Test Bed functionally operational, and represents the Test Bed owning party in any contractual relationship with users, i.e. all other roles.

5.3 GITB-compliant Test Beds

This report describes two ways a Test Bed may comply with the GITB framework. Both types of Test Beds can cooperate in a network of testing resources.

- GITB-framework compliant Test Beds: these Test Beds are designed and implemented by following the architecture and principles described in the GITB Testing Framework, i.e. they consume and produce Test Artifacts, they implement the Test Services defined in this report, and they follow the Test Bed Component architecture described in this report. The qualification as GITB-framework compliant in this report does not imply adherence to precise compliance criteria, but indicates a design based on the recommendations of this report.
- GITB-service compliant Test Beds: these are legacy Test Beds that are not designed according to the GITB Testing Framework, but that are modified so as to provide Service interfaces and to produce Test Report artifacts complying with the GITB recommendations.

Both GITB-framework compliant Test Beds and GITB-service compliant Test Beds can then cooperate in a network, where the Testing Capabilities of the latter (legacy GITB-service compliant Test Beds) can be reused by the former. The networking of both kinds of Test Beds is illustrated in Figure 5-2:

Figure 5-2: Service-based Networking of Test Beds

6 Test Artifacts

This section specifies the Test Artifacts (data items) that are manipulated by Test Services and Test Bed components. For each major artifact type, an abstract design or model and a concrete representation (generally XML) are provided separately. The major test artifact types are:

- Test Logic Artifacts: they represent test case and test suite logic related to the ebusiness Specifications / profiles to be tested for, for example: document schemas and rules (for business document standards), messaging protocol, or business process choreography. In general, these test artifacts depend on the ebusiness standards of reference or on some user-specific profiles of these. These artifacts are configurable in the sense that they may abstract parts of test execution, e.g. abstract away the messaging protocol, a particular version of a business document, or some user-specific data.
- Test Configuration Artifacts: such test artifacts are used to configure test logic artifacts and related test processing components, e.g. a messaging adapter for a particular messaging protocol. They represent complementary data that are necessary for a Test Suite to be executed in a particular test environment or a particular business context, e.g. specific user business documents, a specific messaging protocol, specific end-user information and security settings.

- Test Output Artifacts: these are dynamically produced by a Test Bed as the result of executing a Test Suite (or other test process). They represent information of all kinds about the SUT behavior. They include test reports and test execution traces such as message captures and execution logs.

6.1 General Design/Representation Principles and Best Practices

The level of standardization of the Test Artifacts may vary from one type of artifact to another. These levels are:

- Level 1: standardization of a general wrapper or header to the artifact (metadata standardization). For example, at this level of standardization two Test Suites may differ in their TDL (Test Description Language) and in every other aspect, except for their header, which has the same structure and semantics. Although different Test Suite engines will be needed to execute these Test Suites, their headers allow for archiving and searching these Test Suites in a consistent way, using the same tools and query capabilities.
- Level 2: Level 1 plus standardization of external references or interfaces to other artifacts. For example, at this level of standardization two Test Suites may differ in their TDL, but will share a similar header structure and also similar ways to reference complementary resources, e.g. document validation resources (Document Assertion Sets) that are needed by a Test Case in the Test Suite and are to be processed by external Testing Capabilities. Although different Test Suite engines (inside the same Test Bed or in different Test Beds) will be needed to execute these Test Suites, their headers allow for the same benefits as in Level 1. In addition, the Test Bed will be able to resolve these external references in the same way, invoking the same external resources (such as Document Validators or Messaging Adapters) for each Test Suite.
- Level 3: Level 1 plus Level 2 plus whole-content standardization (e.g. a detailed XML schema reflecting the entire structure of the artifact). For example, at this level of standardization two Test Suites will be written using the same TDL, in addition to using a similar header and external references. They can then be processed by the same Test Suite engine inside a Test Bed.

Each of the above levels of standardization represents a different trade-off between (a) flexibility in the content representation of each artifact type and its extensibility, and (b) the ability to process artifacts of the same type in a similar way across all instances of Test Beds. An instance of the Testing Framework may define and require a minimum level of standardization as acceptable for a given artifact, while recommending a higher level. For example, Level 2 may be required in order for a Test Report to comply with this Testing Framework, yet Level 3 may be defined and recommended, though not mandatory, so that users may benefit from additional tools and processing capabilities associated with this higher level. An instance of the Testing Framework may also require different levels of standardization for different Test Artifacts, e.g. it may tolerate Test Suites with different TDLs (Level 2) but require that these Test Suites produce Test Reports that use the same XML schema (Level 3).

6.2 Standard Test Artifact Header and Reference Stubs

Test Artifact Headers

A Test Artifact header is a general data structure that applies to all types of artifacts and serves as a general identification and search information item, as shown in Table 6-1. Level 1 of test artifact standardization consists of requiring a standard header for all test artifacts, regardless of their type. Examples of Test Artifact headers for existing ebusiness specifications are given in Table 6-2 and Table 6-3.

Table 6-1: Test Artifact Header
- artifactid: A unique artifact ID. It is recommended that it be a URI.
- artifactname: A name for this artifact.
- version: A version or revision identifier.
- authors: One or more authors for this artifact.
- origdate: Origination date of the artifact.
- modifdate: Last modification date of the artifact.
- description: Description of the artifact.
- Properties: A list of properties for this artifact. A property is a name/value pair.
- Keywords: A list of keywords for this artifact, in order to facilitate search.

Table 6-2: Example 1: A Test Artifact header for a set of assertions for verifying some profile (P1) of HL7 V3 documents
- artifactid: HL7V3-P1
- artifactname: HL7V3-P1-DocumentProfileAssertions
- version: 1.0
- authors: John Doe, Mary Simpson
- origdate: T10:10:03-07:00
- modifdate: T10:10:03-07:00
- description: Document Assertions for Profile 1 of HL7 messages
- Properties: Type=DAS, TDL=Schematron, Specification=HL7, Profile=P1
- Keywords: HL7 Conformance Profiles

Table 6-3: Example 2: A Test Artifact header for a validating Test Suite for WS-I Basic Profile 2.0
- artifactid: WS-I_BP2.0
- artifactname: ValidatingTestSuite_WS-I_BasicProfile2.0
- version: 1.0
- authors: Jacques Durand, Tom Rutt
- origdate: T10:10:03-07:00
- modifdate: T10:10:03-07:00
- description: Test Suite for WS-I Basic Profile 2.0
- Properties: Type=TestSuite, Mode=validating, TDL=Tamelizer, Specification=WS-I-BP2.0
- Keywords: WebServices, SOAP, HTTP, WSDL
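
Since the concrete representation of the header is left out of scope of this report, the following is a purely illustrative XML rendering of the header of Table 6-2 (a minimal sketch; all element names are assumptions, not normative, and the truncated date values are omitted):

<testArtifactHeader>
  <artifactId>HL7V3-P1</artifactId>
  <artifactName>HL7V3-P1-DocumentProfileAssertions</artifactName>
  <version>1.0</version>
  <authors>
    <author>John Doe</author>
    <author>Mary Simpson</author>
  </authors>
  <description>Document Assertions for Profile 1 of HL7 messages</description>
  <properties>
    <property name="Type" value="DAS"/>
    <property name="TDL" value="Schematron"/>
    <property name="Specification" value="HL7"/>
    <property name="Profile" value="P1"/>
  </properties>
  <keywords>HL7 Conformance Profiles</keywords>
</testArtifactHeader>

A header of this kind can be prepended to any artifact type, which is what makes Level 1 archiving and searching uniform across Test Beds.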

Level 1 of test artifact standardization requires that all Test Artifacts use such a header structure, the concrete representation of which (e.g. an XML schema) is out of scope of this report.

External Reference Stubs

An External Reference Stub (ERS) is a short data structure that can be found in any Test Artifact at various locations. An ERS is of either one of two types, as it allows for either one of two features:

(1) Artifact ERS: allows a Test Artifact to refer to another (external) Test Artifact in a standard way (e.g. a Test Case referring to a Document Assertion Set).
(2) Control ERS: contains an instruction for the processor of the Test Artifact that contains the ERS (e.g. a breakpoint during the execution of a Test Suite, or a specific operation to perform).

An Artifact ERS (or A-ERS) has the structure shown in Table 6-4.

Table 6-4: Artifact ERS (A-ERS)
- ERS-type: Artifact
- ArtifactType: The type of artifact being referenced, e.g. DAS, TestCase, TestSuite.
- SymbolicName: In case the referenced artifact is to be dynamically determined at processing or configuration time, the symbolic name will be bound to a real artifact at that time.
- Id: A unique artifact ID for the referenced artifact (URI).
- Name: Name of the referenced artifact.
- Version: Version or revision identifier for the referenced artifact (typically complements the name).
- Instructions: Additional information useful when processing this reference, e.g.:
  - include(): the referenced artifact must be considered as part of this artifact, meaning it will be processed automatically when processing this artifact (if a DAS or a Test Suite, its TDL must be compatible with the referenced artifact);
  - associate(): the referenced artifact is associated somehow with this artifact, e.g. some naming dependency. Useful when editing the artifact; no other processing semantics;
  - process(): requires the processing of the referenced artifact by a default processor appropriate for it (indicated by its TDL), or by a specific processor given as argument.
- Context: May provide some contextual data or parameters for the processing of this reference.
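
As an illustration, an A-ERS pointing to a Document Assertion Set could be rendered in XML as follows (a minimal, non-normative sketch; element names and the identifier are invented for illustration):

<externalReferenceStub type="Artifact">
  <artifactType>DAS</artifactType>
  <symbolicName>HL7-doc</symbolicName>        <!-- bound to a real DAS at configuration time -->
  <id>urn:example:gitb:das:HL7V3-P1</id>
  <name>HL7V3-P1-DocumentProfileAssertions</name>
  <version>1.0</version>
  <instructions>process()</instructions>
  <context>documentUnderTest/body</context>   <!-- optional: subset of the document to validate -->
</externalReferenceStub>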

A Control ERS (or C-ERS) has the structure shown in Table 6-5.

Table 6-5: Control ERS (C-ERS)
- ERS-type: Control
- Operation: Operation of the artifact processor to be invoked. Typically, if the processor is a plug-in capability (e.g. a Test Suite engine), this is an operation of the Core Platform interface (see Section 7).
- Context: Path indicating parameter values to be passed to the operation.
- Object: Input (Test Item(s)) to be passed to the operation, if any.

Level 2 of test artifact standardization requires that all external references (both A-ERS and C-ERS) in Test Artifacts be expressed using ERSs, the concrete representation of which (e.g. an XML schema) is out of scope of this report.

6.3 Test Configuration Artifacts

Configuration Artifacts related to ebusiness Communication

Messaging Adapter Configuration Artifacts

- adapterconfig: this artifact represents the configuration data necessary for a messaging adapter to participate in the execution of an actual test suite. The nature of this configuration data may vary with the adapter type, e.g. a WSDL file will configure a Web Service client endpoint or proxy, while an ebXML CPA may be part of an ebMS endpoint configuration. Such configuration also includes various artifacts involved in peripheral messaging services, e.g. digital certificates for message security.
- messagingconnection: this is a configuration item that enables the actual sending or receiving of messages, by (a) referring to an adapterconfig artifact, (b) referring to a particular messaging adapter component, and (c) providing additional endpoint information, e.g. a destination URL. A messagingconnection abstracts the details of a messaging connection from a test suite.
- messagebinding: this configuration item binds an abstract message, as defined in a generic test case, to the actual messaging details necessary for execution. It includes: (a) a reference to a messagingconnection, (b) a reference to one or more payloadfile(s), and (c) additional data necessary to generate a message header, which may depend on the particular message adapter in use, e.g. business information items such as party identification and role, or service and action for an ebMS message.

NOTE: the above three artifacts may sometimes be merged or partially merged in their representation, depending on the messaging protocol in use. For example, message binding and connection will usually be represented in an ebXML CPA document with ebMS, while message binding and adapter configuration (and sometimes the URL information found in the connection) would be in a WSDL file for Web services.
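
To make these artifacts more concrete, the following minimal XML sketches show how the three artifacts could relate to one another (purely illustrative; all identifiers, element names and values are assumptions):

<adapterConfig id="ebms2-default">
  <adapterType>ebMS-2.0</adapterType>
  <configDocument>cpa-template.xml</configDocument>    <!-- e.g. an ebXML CPA -->
  <certificate>test-partner.cer</certificate>          <!-- message security material -->
</adapterConfig>

<messagingConnection id="conn-sut1">
  <adapterConfigRef>ebms2-default</adapterConfigRef>      <!-- (a) adapter configuration -->
  <adapterComponent>ebMS2-Adapter</adapterComponent>      <!-- (b) messaging adapter component -->
  <endpointURL>http://sut1.example.org/msh</endpointURL>  <!-- (c) endpoint information -->
</messagingConnection>

<messageBinding id="order-msg-binding">
  <connectionRef>conn-sut1</connectionRef>
  <payloadFileRef>sample-order.xml</payloadFileRef>
  <headerData service="Ordering" action="SubmitOrder"
              fromParty="BuyerX" toParty="SellerY"/>
</messageBinding>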

Design and Representation Artifacts

These are configuration artifacts related to Test Suites:

- testsuiteconfig: this is the root configuration data artifact necessary for instantiating an executable test suite, in case the test suite has parameters or other abstractions such as messaging abstraction. It includes:
  a) values for global variables in a test suite (e.g. timeout limits for B2B transactions in the test suite, concrete role identifiers for participants of an interoperability test suite, reference data such as SLA values, general quantitative criteria or thresholds for defining a successful outcome, etc.),
  b) the list of messagebinding artifacts associated with each test case in the test suite,
  c) other values (e.g. for test case parameters) associated with each test case, if any.
- payloadfile: this is an artifact that represents a concrete document or part of it. It may represent a sample message content or a reference document with business significance. It is referred to by a messagebinding artifact; alternatively, in case the payload file is part of the test logic itself and common to all users of a test suite, it may be an integral part of a test case (i.e. a value for a corresponding test case parameter) or part of a Document Assertion Set, e.g. as reference material for the verification to be performed.
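
A testsuiteconfig tying these elements together might look as follows in XML (an illustrative sketch only; it reuses the hypothetical order-msg-binding identifier from the previous example):

<testSuiteConfig testSuiteRef="urn:example:gitb:testsuite:ordering-conformance">
  <globalVariables>
    <variable name="transactionTimeout" value="300"/>   <!-- a) e.g. a timeout in seconds -->
    <variable name="buyerRoleId" value="SUT-1"/>        <!-- a) concrete role identifier -->
  </globalVariables>
  <testCaseConfig testCaseRef="tc-submit-order">
    <messageBindingRef>order-msg-binding</messageBindingRef>  <!-- b) binding per test case -->
    <parameter name="maxOrderLines" value="100"/>             <!-- c) test case parameter -->
  </testCaseConfig>
</testSuiteConfig>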

6.4 Test Logic Artifacts

Document Assertion Set

A Document Assertion Set (DAS) is a package of artifacts used to validate a Test Item, typically a business document instance or a message instance. Such a package may include:

- schema(s) or template(s),
- semantic or integrity rules,
- codelists,
- document base format(s) (e.g. XML, EDI or other) and related naming and design rules.

A DAS will typically contain a "manifest" pointing at all data resources needed to validate a document (schemas, rules, codelists, etc.). A DAS or parts of it may be machine-processable in order to carry out the actual document validation, or may only be manually processable:

- Processable DAS: its parts are machine-readable (e.g. schema, rules), so the verification can be automated. A DAS can be partially processable.
- Manual DAS: its parts are not machine-readable (e.g. text guidelines), so the verification cannot be automated. A DAS can be partially manual.

A DAS will typically be consumed by one or more processor(s) in order to carry out the actual document validation.

External Reference Stub (ERS) elements (in case of standardization Level 2 / 3):

Reference to another DAS (in case some validation is delegated to another DAS and its processor):

Table 6-6: ERS to a Document Assertion Set
- ERS-type: Artifact
- ArtifactType: DAS
- SymbolicName: In case the referenced DAS is to be dynamically resolved at configuration time, the symbolic name will be bound to a real DAS at that time.
- Id: A unique artifact ID for the referenced artifact (URI).
- Name: Name of the referenced DAS.
- Version: Version or revision identifier for the referenced DAS (typically complements the name).
- ReferenceInstructions: Typically either include or process (if the TDLs are different). May specify additional information useful when processing this referenced DAS, e.g. may identify a subset of the DAS to be considered, or a subset of the document under test to be processed by the referenced DAS.

Callback to the Test Bed Core Platform interface (see Section 8): in case some callback to the Test Bed is needed, this element should exist.

Table 6-7: Control ERS for Invoking an Operation of the Test Control Manager
- ERS-type: Control
- Operation: Operation of the Core Platform interface to be invoked (by the Document Validator Testing Capability; see Section 8.3.2).
- Context: Path indicating parameter values to be passed to the operation.
- Object: Input (Test Item(s)) to be passed to the operation, if any.

Test Case

A Test Case (or Test Case artifact) is an executable test unit that corresponds to a particular testing requirement as identified in an ebusiness Specification, and verifies, or participates in the verification of, SUT(s) or related Test Items with regard to this specification requirement. It may consist of multiple test steps, each containing a particular testing operation. It may invoke one or more Document Validators (in case the test case involves document verification). A Test Case generally addresses one of the following, depending on the specification target:

1. the validation of a single business document,
2. the validation of a single business transaction,
3. the verification of a message for conformance to a particular B2B protocol.

However, there is no specific rule other than being a manageable test execution unit that represents an acceptable unit of reporting in a test report. In case it is necessary to provide fine-grained test reports, a Test Case may be more fine-grained than the above, e.g. focus on a single specification feature.

Core (or Configurable) Test Case: generally has some parameters and bindings left uninstantiated, so that it can be adapted to different contexts, users and combinations with other standards.

External Reference Stub elements (in case of standardization Level 2 / 3): the same ERS elements as for a DAS must be supported, plus the following.

Reference to a Messaging Adapter, i.e. to a messagebinding (in case some message needs to be sent or waited for). NOTE: if no messagebinding artifact exists, a Messaging Adapter plug-in can be directly invoked instead (in which case the ERS is a C-ERS).

Table 6-8: Reference (an Artifact ERS) to a Messaging Adapter
- ERS-type: Artifact
- ArtifactType: messagebinding
- SymbolicName: In case the message binding is to be dynamically assigned at configuration time, the logical name will be bound to a real messagebinding at that time.
- Id: A unique artifact ID for the referenced messagebinding (URI).
- Name: Name of the referenced messagebinding.
- Version: Version of the referenced messagebinding.
- Instructions: process()
- Context: The message action expected, send() or receive(), and a path identifying the received document (in case of receive) or the document to be sent (in case of send).

Reference to another Test Suite (in case some suite processing is delegated, this element is necessary):

Table 6-9: Reference (an Artifact ERS) to another Test Suite
- ERS-type: Artifact
- ArtifactType: TestSuite
- SymbolicName: In case the invoked Test Suite is to be dynamically assigned at configuration time, the logical name will be bound to a real Test Suite at that time.
- Id: A unique artifact ID for the invoked Test Suite (URI).
- Name: Name of the invoked Test Suite.
- Version: Version or revision identifier for the invoked Test Suite.
- Instructions: include() or process()
- Context: Path indicating parameter values to be passed to the invoked Test Suite engine.
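
Putting these elements together, a configurable Test Case in some hypothetical TDL could embed its ERS elements as follows (entirely illustrative; the enclosing TDL syntax shown here is not defined by this report):

<testCase id="tc-submit-order">
  <step name="send-order">
    <!-- A-ERS to a messagebinding (Table 6-8): send the order document -->
    <ers type="Artifact" artifactType="messagebinding">
      <symbolicName>order-msg-binding</symbolicName>
      <instructions>process()</instructions>
      <context action="send" documentPath="payloads/order"/>
    </ers>
  </step>
  <step name="validate-response">
    <!-- A-ERS to a DAS (Table 6-6): validate the document received from the SUT -->
    <ers type="Artifact" artifactType="DAS">
      <symbolicName>order-response-das</symbolicName>
      <instructions>process()</instructions>
      <context documentPath="payloads/orderResponse"/>
    </ers>
  </step>
</testCase>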

Test Suite

A Test Suite (or Test Suite artifact) defines a workflow of Test Case executions and/or Document Validator executions, with the intent of verifying one or more SUTs against one or more ebusiness Specifications. It is a complete set of artifacts that is executable by a test engine. It may result from configuring a Core (or Configurable) Test Suite with user-specific data or with Test Configuration Artifacts. The output of a Test Suite execution may be a Test Report, or just a Test Execution Log. The following categories of Test Suites are distinguished:

- Core (or Configurable) Test Suite: generally has some parameters and bindings left uninstantiated, so that it can be adapted to different contexts, users and combinations with other standards. When properly configured and instantiated, it becomes an executable Test Suite.
- Conformance Test Suite: intended for verifying that an SUT conforms to a specification.
- Interoperability Test Suite: intended for verifying that two or more SUTs can interoperate as expected according to a specification.
- Interacting Test Suite: a Test Suite that drives SUTs or interacts with SUTs during execution, typically to produce a set of Test Items (or a Test Execution Log). An interacting test suite may or may not also be "validating".
- Validating Test Suite: a Test Suite that takes as input either a Test Execution Log or independent Test Items, verifies them and produces a Test Report about them. Unless it is also "interacting", such a test suite does not interact with any SUT and only analyzes artifacts.

External Reference Stub elements (in case of standardization Level 2 / 3): the same ERSs as for a Test Case are supported, with in addition:

Breakpoint (in case control needs to be given back to the Test Control Manager of the Test Bed; see Section 8):

Table 6-10: A Control ERS for introducing a Breakpoint
- ERS-type: Control
- Operation: Breakpoint
- Context: Path indicating execution context values to be passed to the Core Platform.
- Object: Execution status to be communicated to the Test Bed Core Platform (if statically determined).
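
For instance, a breakpoint C-ERS inserted between two test steps might be rendered as follows (an illustrative assumption, consistent with the earlier ERS sketches):

<ers type="Control">
  <operation>Breakpoint</operation>
  <context>execution/currentTestCase</context>   <!-- execution context passed to the Core Platform -->
  <object>WAITING_FOR_PARTICIPANT</object>       <!-- statically determined execution status -->
</ers>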

Test Assertion

A test assertion is a testable or measurable expression, usually in plain text or in a semi-formal representation, for evaluating the adherence of an implementation (or part of it) to a normative statement in a specification. A test assertion must explicitly refer to the normative statement(s) it addresses in the specification. Test assertions are generally not executable, but provide a starting point for writing a conformance test suite or an interoperability test suite for a specification. The actual verification of the SUT or of Test Item(s) is carried out by the test case(s) derived from the test assertions. OASIS has published guidelines for writing test assertions, along with an XML vocabulary for representing them; it is recommended to use the OASIS methodology. A Test Assertion clearly identifies:

- the normative source: the requirement of the specification that is to be verified;
- a test target: either a Test Item or part of it, or an SUT implementation or part of it;
- a predicate: the condition indicative of fulfillment of the specification requirement by the target.
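
For illustration only, a test assertion could be represented along the following lines (a simplified sketch loosely inspired by the OASIS test assertion vocabulary; element names, the specification name and the requirement identifier are invented):

<testAssertion id="TA-ORD-001">
  <normativeSource>
    <!-- the specification requirement being addressed -->
    <ref document="ExampleOrderingSpec-1.0" section="4.2" requirement="R12"/>
  </normativeSource>
  <target type="businessDocument">Order</target>
  <predicate>
    The Order document contains exactly one BuyerParty element
    carrying a valid party identifier.
  </predicate>
</testAssertion>

A test case derived from this assertion would then implement the predicate as an executable check, e.g. as a Schematron rule inside a DAS.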

Figure 6-1 shows the categories of Test Artifacts and their relationships.

Figure 6-1: Categories of Test Artifacts and Their Relationships

6.5 Test Output Artifacts

Test Execution Log

This artifact represents a trace of the message exchange between a Test Bed and one or more SUTs, a trace of an exchange between SUTs, or a trace of other observable behavior that results from SUT activity. It is a collection of Test Items, e.g. a message capture. Such an artifact may be the product of executing an Interacting Test Suite, or simply some monitoring output. It is intended as input to further verification steps in a testing process, e.g. input to a Validating Test Suite.

Test Report

This artifact is generated as the result of verifying the behavior or output of one or more SUT(s), or of verifying Test Items such as business documents. It makes an assessment about conformance to one or more standards or profiles, or about interoperability within the context of these standards or profiles. It is generally intended for human readers (although possibly after some rendering, e.g. HTML rendering in a browser or after a translation from XML to HTML). A Test Report should contain all the details of the verification items:

- It should identify the Test Items, i.e. the test targets (as defined in Test Assertions), such as items in a Test Execution Log or items dynamically produced by SUT implementations.
- For each test target, it should indicate the assertions and/or Test Cases it has been verified against, and also the specification requirement associated with each assertion / Test Case.
- It should provide a reverse view, i.e. for each Test Assertion, all the target Test Items that have been evaluated against it.

External Reference Stub (ERS) elements (in case of standardization Level 2 / 3, these elements are necessary):

Reference to the generating DAS (if the report is produced by a Document Validator, this element should exist):

Table 6-11: Reference to the Generating DAS
- ERS-type: Artifact
- ArtifactType: DAS
- Id: DAS artifact ID
- Name: DAS name
- Instructions: associate() (just associates the originating DAS with this Test Report, without other processing semantics for this reference)

Reference to the generating Test Suite (if the report is produced by a Test Suite engine):

Table 6-12: Reference to the Generating Test Suite
- ERS-type: Artifact
- ArtifactType: TestSuite
- Id: TestSuite artifact ID
- Name: TestSuite name
- Instructions: associate() (just associates the originating Test Suite with this Test Report, without other processing semantics for this reference)

Reference to the Test Items under test (if the report is about a Test Artifact such as a Test Execution Log):

Table 6-13: Reference to the Test Items Under Test
- ERS-type: Artifact
- ArtifactType: TestItem or TestExecutionLog
- Id: TestItem or TestExecutionLog artifact ID
- Name: TestItem or TestExecutionLog name
- Instructions: associate() (just associates this Test Report with the artifacts under test, without other processing semantics for this reference)
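
A Test Report skeleton carrying such back-references could look as follows (a non-normative XML sketch reusing the hypothetical identifiers from the previous examples):

<testReport id="tr-2011-001">
  <!-- the standard Test Artifact header (Level 1) would appear here -->
  <ers type="Artifact">
    <artifactType>TestSuite</artifactType>
    <id>urn:example:gitb:testsuite:ordering-conformance</id>
    <name>OrderingConformanceTestSuite</name>
    <instructions>associate()</instructions>
  </ers>
  <verification testItem="capture-001" testCase="tc-submit-order" requirement="R12">
    <result>PASS</result>
  </verification>
  <verification testItem="capture-002" testCase="tc-submit-order" requirement="R12">
    <result>FAIL</result>
    <detail>BuyerParty element missing</detail>
  </verification>
</testReport>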

6.6 GITB-Compliance of Test Artifacts

In order to be GITB-framework compliant, Test Artifacts must be standardized at least at the following levels:

Table 6-14: GITB Compliance of Test Artifacts
- DAS: Level 2
- Test Case: Level 2
- Test Suite: Level 2
- Test Report: Level 2
- All other Test Artifacts: Level 1

In order to be GITB-service compliant, a legacy Test Bed is expected to generate Test Report artifacts that are compliant with the recommendations of this report. In particular, the Test Report must be standardized at Level 1 at least, i.e. contain an artifact header similar to those of a GITB-framework compliant Test Bed.

7 Test Services

This section introduces the generic Test Services that are defined by the Testing Framework, for use either by (a) other framework components (e.g. in a Test Bed), (b) end-user B2B applications (for integration in a testing environment), or (c) human control, e.g. from a Web browser. A more detailed description of the abstract interfaces (sets of operations) is given in Appendix I.

7.1 Test Services Functional Areas

Some large entities of the Testing Framework, such as Test Beds and the Test Registry and Repository, will be usable as Services. Such entities will publish one or more Service interfaces (e.g. as a Web service) that can be remotely accessed either by authorized users (e.g. via a Web browser, or via an editor) or by another Test Bed. These Services are called General Test Services, as each of their interfaces provides a coherent set of operations in a particular functional domain (e.g. Test Suite design, or Test Artifact repository and archiving). Four major functional areas in the Testing Framework lend themselves to service-based access, as General Test Services:

(1) Test Design services: intended to support the persistence of Test Artifacts during the design phase. These services do not pretend to cover every fine-grained operation involved in the design of test artifacts; detailed editing or design operations are to be supported by specialized tools (e.g. editors).
(2) Test Deployment services: intended for deploying Test Suites, Test Cases and Document Assertion Sets, and for configuring components such as a Messaging Adapter, so that a Test Bed is ready for test execution.
(3) Test Execution services: intended to control all operations of a Test Bed related to the execution of a Test Suite. They include the control of the test engine and of the test suite execution steps, as well as the coordination of participants and other collaborative tasks.
(4) Test Repository services: intended for all operations related to the GITB Test Registry and Repository, for two purposes:
  - as a publishing service for sharing test resources (test artifacts, information about Test Beds and test services) across Test Beds and participants; it supports discovery and search of test artifacts, and of related user evaluations and comments;
  - as a long-term archival site for all test artifacts.

These services are presented in detail in Appendix I: Test Service Interfaces. It should be noted that the Test Services defined in Appendix I are provided individually. They can be used by the Test Bed Components, or by other Test Beds to support coordination. More detailed information on how to use these services in order to establish Test Bed coordination is given in Appendix II; the definitions provided there can also be used to implement a GITB-compliant Test Agent. Appendix II also describes in more detail the signatures of the test services, their syntaxes and possible implementation technologies.

Figure 7-1 shows the General Test Service interfaces and the various roles involved in using them, and illustrates how these services are supported either by a Test Bed or by a Test Repository. NOTE: although the figure shows services supported by a single Test Bed, these services are intended to be remotely accessed and therefore enable cooperation between Test Beds. This is for example the case when a test suite is distributed across Test Beds, or uses validation resources deployed on a different (secondary) Test Bed, i.e. one acting as a Test Agent, as illustrated in the Testing Scenarios (Section 9). In such a case, a test suite execution will invoke the Test Execution Services of the secondary Test Bed.

Figure 7-1: Testing Roles and General Test Services

7.2 Mapping Testing Roles to Test Services

The roles identified in Section 5.2 map to Test Services as follows:

- Test Designer:
  - Supporting Services: Test Design services, Test Deployment services, Test Repository services.
  - Collaborating roles: ebusiness domain expert, Test Participant.
- Test Participant:
  - Supporting Services: Test Execution services.
  - Collaborating roles: Test Designer, Test Manager.

- Test Manager/Facilitator:
  - Supporting Services: Test Deployment services, Test Execution services, Test Repository services.
  - Collaborating roles: Test Participant, Test Bed Operator.
- Test Bed Provider:
  - Supporting Services: Test Repository services.
  - Collaborating roles: Test Manager, Test Participant.

The Test Services relate to the Test Artifacts as illustrated in Figure 7-2.

Figure 7-2: Test Services and related Test Artifacts

7.3 GITB-Service compliance of Test Beds

In order to be GITB-service compliant, a legacy Test Bed is expected to support at least a subset of the Test Execution Services described in this report. In particular, either the Test Execution Control services or the Document Validation services must be supported in the following way:

In case the Test Bed is able to process test suites, the subset of Test Execution Control operations that includes the following operations must be supported:
- Start a Test Suite Execution
- Stop a Test Suite Execution
- Resume a Test Suite Execution
- Get Status of a Test Suite Execution

In case the Test Bed is able to perform document validation, the subset of Document Validation operations that includes the following operations must be supported:
- Validate a Document
- Get Status of a Document Validation
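
Since such services will typically be exposed as Web services, a legacy Test Bed could, for example, expose the Validate a Document operation as a SOAP call along the following lines (a non-normative sketch; the gitb namespace and element names are invented for illustration):

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:gitb="urn:example:gitb:test-services">
  <soap:Body>
    <gitb:ValidateDocument>
      <gitb:dasId>urn:example:gitb:das:HL7V3-P1</gitb:dasId>
      <gitb:document contentType="text/xml">
        <!-- the document under test, inline or by reference -->
      </gitb:document>
    </gitb:ValidateDocument>
  </soap:Body>
</soap:Envelope>

The response would carry either a Test Report (standardized at Level 1 at least, as required above) or a validation reference to be polled via the Get Status of a Document Validation operation.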

8 Test Bed Components

This section describes the components of a Test Bed that complies with the GITB framework. Figure 8-1 gives an overview of the architecture of such a Test Bed.

8.1 Overview of the Test Bed Components

The components of a Test Bed can be roughly grouped into three categories:

(1) Core Platform components. These are components internal to the Test Bed that fulfill general core functions, independent of any particular test suite, B2B protocol or TDL.
(2) User-facing components. These are editors and management consoles. Such components can evolve, be added or be replaced easily, as they connect to the Test Bed mostly through its Service interfaces. They may reside partially on the client side (e.g. in a browser).
(3) Testing Capability components. These are plug-ins that more directly enable test execution, and that may depend on the nature of the tests to be performed and/or on B2B protocols. They can be replaced or added on demand by Test Bed Operators.

Figure 8-1: Overview of the Test Bed Architecture

Test Bed Core Platform: the term Test Bed Core Platform identifies the part of a Test Bed that provides the basic infrastructure for adding the Testing Capability components ((3) above) and that also allows the connection of (either local or remote) User-facing components ((2) above). This Core Platform infrastructure only includes the Core Platform components ((1) above).

Operational Test Bed Platform: once Testing Capability and User-facing components have been integrated on the Test Bed Core Platform, the result is an Operational Test Bed Platform. Such a platform only needs to host the deployment of appropriate Test Artifacts in order to be able to provide test services concerning various B2B specifications. Once these Test Artifacts have been deployed on an Operational Test Bed Platform, the resulting resource is an operational Test Bed.

8.2 User-facing Components

User-facing components can evolve, be added or be replaced easily, as they connect to the Test Bed mostly through its Service interfaces, and they may reside partially on the client side (e.g. in a browser). As shown in Figure 8-1, the User-facing components and functions of the Test Bed are as follows:

- Test Monitoring and Management GUI: all users access the system through this Web portal, which provides a graphical user interface (GUI). Depending on the user type, appropriate screens are displayed. For example, when the Test Operator (the role that consists of managing and operating a Test Bed) enters the system, interfaces for managing users, configuring repositories, and managing document validators and messaging adapters are displayed. On the other hand, when a Test Manager enters the system, the available Test Suites/Cases and the interface to execute/configure a Test Suite/Case are displayed. The Test Managers also monitor the execution of the test suites/cases through this GUI.

- Test Suite/Case Editor: this component allows a Test Designer to develop executable test suites/cases. The editor generally supports the use of a Test Description Language and is able to store editing outputs to the local Test Artifact Persistence Manager (TAPM) or to the GITB Test Registry/Repository. The test suites/cases generated by the Test Designer generally need further configuration in order to be executable by the Test Suite Engine; for example, the real IP addresses to which the test messages will be sent have not yet been determined. Therefore, these test suites/cases should be further configured, i.e. instantiated, by the Test Suite/Case Instantiator.
- Test Suite/Case Instantiator: this tool allows the Test Managers, who are in charge of executing a test suite (SUT Operators can also be Test Managers), to instantiate test suites/cases by entering the necessary configuration parameters, such as the IP addresses and port numbers of SUTs, the WSDL addresses, etc. As a result of using this tool, an executable test suite/case instance is generated and stored to the local Test Artifact Persistence Manager (TAPM) or to the GITB Test Registry/Repository.

8.3 Testing Capability Components

These components are test plug-ins which more directly enable test execution, and which may depend on the nature of the tests to be performed and/or on B2B protocols. Several instances of each Testing Capability component type may be part of the same Operational Test Bed Platform. For example, several Messaging Adapters (e.g. one for ebMS V2, one for AS2, one for Web services) could be deployed on the same Test Bed. Similarly, several Document Validators can be deployed (e.g. an XML schema validator, a Schematron engine, a Tamelizer engine). Test Suite Engines are also considered as plug-in components, and more than one could be deployed on a Test Bed. These plug-in components can be replaced, added or removed by Test Bed Operators without much effort or disturbance; e.g. they should not require code recompiling or rebuilding of the Test Bed application: their addition or removal should be a Test Bed platform configuration operation.

Core Internal APIs: in order to qualify as a Testing Capability for a GITB Test Bed Core Platform, these components must implement standardized internal APIs, so that all instances of the same type of plug-in (e.g. all Messaging Adapters) can be controlled in the same way by the Core Platform. These are called Core Plug-in APIs. Conversely, when these plug-in components need to initiate some operation in the Core Platform, they invoke a standardized API implemented by the Core Platform. For example, a Test Suite engine willing to invoke a Document Validator to validate a document will do so by invoking a Core Platform operation (which in turn will redirect the request to the appropriate Document Validator plug-in). These APIs are called Core Platform APIs. Figure 8-2 illustrates the role of these Core Internal APIs in coordinating the plug-in components with Core Platform functions:

Figure 8-2: Coordination of Plug-in Components via Core Interfaces

For example, assume a Test Suite HL7-interop is executed by a Test Suite Engine TDL1, able to execute any test suite written in the Test Description Language TDL1. In the Test Suite, some Test Case performs a document validation identified as HL7-doc, for which there is a Document Assertion Set deployed on the Test Bed. This document validation is delegated to a Document Validator able to run HL7-doc assertions, here a Schematron engine (the Document Assertion Set specifies the type of execution it is intended for, here Schematron). The symbolic name (HL7-doc) identifying the required document testing resource can either be hard-coded in the definition of the test suite HL7-interop, or defined in a configuration artifact for this test suite, to be associated with the test suite when deployed on the Test Bed. A separate Document Validator component has been plugged into the Test Bed and declared as a Schematron Testing Capability. At run-time, the following sequence of operations will occur:

1. The Test Control Manager component starts the execution of the Test Suite HL7-interop by invoking the Test Suite engine TDL1 (as HL7-interop has declared its language to be TDL1).
2. As a Test Case in the test suite HL7-interop needs to perform some functional invocation that requires the HL7-doc capability, the TDL1 engine calls back the Test Control Manager with a request for HL7-doc validation over a particular document D1.
3. The Test Control Manager component then identifies a Testing Capability component that is able to process the HL7-doc Document Assertion Set, here a Schematron engine, and invokes this plug-in. NOTE: it is also possible that such a plug-in is not present, in which case the Testing Capability should have been configured in the Test Bed as a remote Test Agent that can be invoked instead.
4. The Test Control Manager component returns the validation result to the Test Suite engine TDL1. The engine then proceeds to execute the next Test Case of HL7-interop.
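
The call-back in step 2 could, purely for illustration, be serialized as follows if the Core Platform API were given an XML representation (all names are assumptions; the API itself is defined abstractly in the sections below):

<validateDocumentRequest>
  <dasIdentifier>HL7-doc</dasIdentifier>   <!-- symbolic name resolved by the Test Control Manager -->
  <document ref="D1"/>                     <!-- the document captured by the current test step -->
</validateDocumentRequest>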

The different types of Testing Capability Components are defined below, along with an abstract definition of the Core APIs they must support or invoke.

8.3.1 Messaging Adapter

Several Messaging Adapters may exist in a Test Bed. The communication with an SUT is realized with messaging adapters (specialized for messaging protocol stacks such as ebXML Messaging, Web Services with SOAP or REST, AS2, and the underlying transports: SMTP, HTTP, etc.). The Messaging Adapters send/receive messages to/from one or more SUT(s) and also generally provide a first level of validation at the message level. The GITB framework allows for any messaging protocol, as long as the necessary Messaging Adapters are implemented and deployed to the system based on predefined interfaces. The Messaging Adapter obtains the document to be sent from the Test Suite engine, packages it as a message and sends it to the SUT. After the Messaging Adapter receives a message from an SUT, it extracts the document content from the message and delivers it to the Test Suite engine or to a Document Validator.

Core plug-in interface

These operations are:
- initiated by the Test Control Manager component;
- supported by the Messaging Adapter component.

SendMessage (messaging connection, payload file) -> status
Send a message over a messaging connection to a remote SUT or Test Agent.

GetMessage (messaging connection) -> payload file / status
Receive a message (wait for the message) over a messaging connection.

Configure (adapter config) -> status
Configure the messaging adapter for a particular connection and messaging mode.
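
Since a Messaging Adapter capability may also be offered as a remote service (see Section 5.1), its plug-in operations could, for instance, be described by a WSDL portType such as the following (a non-normative sketch; the target namespace and message names are invented):

<wsdl:portType name="MessagingAdapter"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="urn:example:gitb:messaging-adapter">
  <wsdl:operation name="SendMessage">
    <wsdl:input message="tns:SendMessageRequest"/>    <!-- messaging connection + payload file -->
    <wsdl:output message="tns:SendMessageResponse"/>  <!-- status -->
  </wsdl:operation>
  <wsdl:operation name="GetMessage">
    <wsdl:input message="tns:GetMessageRequest"/>     <!-- messaging connection -->
    <wsdl:output message="tns:GetMessageResponse"/>   <!-- payload file / status -->
  </wsdl:operation>
  <wsdl:operation name="Configure">
    <wsdl:input message="tns:ConfigureRequest"/>      <!-- adapter config -->
    <wsdl:output message="tns:ConfigureResponse"/>    <!-- status -->
  </wsdl:operation>
</wsdl:portType>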

Core platform interface

These operations are:
- initiated by the Messaging Adapter component;
- supported by the Test Control Manager component.

ReceiveMessage (messaging connection, payload file) -> status
Give a received payload back to the Test Control Manager component. This is the asynchronous (or call-back) version of the GetMessage operation. The Test Suite configuration will indicate which one of the message reception techniques is to be used.

8.3.2 Document Validator

One or more Document Validators may be deployed in a Test Bed. A Document Validator is responsible for validating the content of the documents retrieved from the Messaging Adapters, in terms of both structure and semantics. Structure refers to the format of the document (e.g. the order of the elements), while semantics refers to the relations among the elements (e.g. business rules) and the control of coded elements not specified in the schema documents. From an XML perspective, XSD schemas specify the structure of a document, and XPath-based rules (e.g. Schematron, TAML-X [TAML] from OASIS, or XSL files) specify the semantics of the document. As with messaging adapters, the GITB framework supports any type of business document and related validation, as long as the necessary document validators are implemented and registered to the system via the predefined validator interface.

Core plug-in interface

These operations are:
- initiated by the Test Control Manager component;
- supported by the Document Validator plug-in component.

ValidateDocument (document, DAS identifier) -> validation-reference or test report
Validate a document against a particular DAS. It returns a validation reference in case the validation is not immediate.

GetResult (validation-reference) -> test report
Get the result of a previous validation.

Configure (DAS identifier, operation-mode) -> status
Configure the Document Validator with a specific DAS and a mode of operation.

Core platform interface

These operations are:
- initiated by the Document Validator plug-in component;
- supported by the Test Control Manager component.

ReceiveResult (validation-reference, test report)
Gives a test result back to the Test Control Manager component. This is the asynchronous (or call-back) version of the GetResult operation. The Validator configuration will indicate which one of the test result passing techniques is to be used.

ValidateDocument (document, DAS identifier) -> validation-reference or test report
Validate a document against a particular DAS. The presence of this call in this interface means that a Document Validator can delegate some validation to another Validator, via a call-back to the Test Control Manager. It returns a validation reference in case the validation is not immediate.

8.3.3 Test Suite Engine

The Test Suite Engine drives the execution of a test suite/case instance by orchestrating other Test Bed components. When executing, the Test Suite Engine:
- parses the test suite/case in its Test Description Language and interprets the related operations (or test steps),
- communicates with Messaging Adapters for the sending/receiving of messages (also for the message-level validation),
- communicates with Document Validators for the validation of the documents extracted from the messages,
- records the details (validation failures, successes) of the steps,
- informs the Test Monitoring and Management GUI for the monitoring of the test execution.

Core plug-in interface

These operations are:
- initiated by the Test Control Manager component;
- supported by the Test Suite engine plug-in component.

StartTestSuite (TSuite identifier) -> status + suiteExecution-handle
Initiate a Test Suite execution. It returns a test suite execution handle and a status.

StopTestSuite (suiteExecution-handle) -> status
Interrupts a Test Suite execution. It returns a status.

ResumeTestSuite (suiteExecution-handle) -> status
Resumes a Test Suite execution. It returns a status.

EndTestSuite (suiteExecution-handle) -> status
Terminates a Test Suite execution. It returns a status.

GetTestSuiteResult (suiteExecution-handle) -> test report
Get the result of a test suite execution.

ConfigureTestSuite (Test Suite identifier, operation-mode) -> status
Configure the Test Suite engine with a specific test suite and a mode of operation.

Core platform interface

These operations are:
- initiated by the Test Suite engine plug-in component;
- supported by the Test Control Manager component.

ReceiveTestSuiteResult (suiteExecution-handle, test report)
Gives a test result back to the Test Control Manager component. This is the asynchronous (or call-back) version of the GetTestSuiteResult operation. The Engine configuration will indicate which one of the test result passing techniques is to be used.

ValidateDocument (document, DAS identifier) -> validation-reference or test report
Validate a document against a particular DAS. The presence of this call in this interface means that a Test Suite engine can delegate some validation to a Document Validator, via a call-back to the Test Control Manager. It returns a validation reference in case the validation is not immediate.

StartTestSuite (TSuite identifier) -> status + suiteExecution-handle
Initiate a Test Suite execution. It returns a test suite execution handle and a status. The presence of this call in this interface means that a Test Suite engine can delegate some secondary Test Suite execution to another Test Suite engine.

DoBreakPoint () -> status + suiteExecution-handle
Interrupt the Test Suite execution from the engine itself, based on breakpoints introduced in the test suite for returning control to the Test Bed (i.e. the Test Control Manager).

8.4 Test Bed Core Platform Components

These are components internal to the Test Bed that fulfill general core functions, independent of any particular test suite, B2B protocol or TDL.

8.4.1 Test Bed Platform Functional Components

8.4 Test Bed Core Platform Components

These are components internal to the Test Bed that fulfill general core functions independent from any particular test suite, B2B protocol or TDL.

Test Bed Platform Functional Components

Test Control Manager: This manager component has control of the interaction with the test processing capabilities (Test Suite Engine(s), Document Validators, Messaging Adapters). For example, when a test suite invokes a document validation, the Test Control Manager mediates the invocation from the Test Suite Engine to the Document Validator. In other words, the Test Control Manager abstracts the lower-level Document Validator and/or Messaging Adapter invocation details from the Test Suite Engine, which may only be aware of the aliases/names of these components as specified in the test case definitions. The component also marshals inputs and outputs to these processors, e.g. it may assemble or enforce correct formatting of the test report or logs produced by a test suite processor (e.g. adding header data). After the test execution completes, it invokes the Test Artifact Persistence Manager to store the test outputs. This component supports Test Execution Services.

Test Operation Manager: The Test Managers interact with the Test Operation Manager component in order to perform user-centric management tasks such as managing user accounts, coordinating users (notifications, workflow associated with a test suite), providing a general console for Test Bed control and user-level control, and keeping track of a test suite execution status. This component supports Test Execution Services, and is controlled by user-facing components.

Test Deployment Manager: This component ensures that the Test Bed is ready for test suite execution. This involves loading the test suite and related resources (document assertion sets, other test suites if used) from the TAPM, configuring and activating other components such as Messaging Adapters and Document Validators, and managing connections with SUTs and remote Test Agents. If a remote Service has to be used during test execution, the Test Deployment Manager ensures that it is available.

Test Artifact Persistence Manager (TAPM): This repository contains the test suites/cases as well as test suite/case instances. Furthermore, it stores the detailed test reports for the executed test suites/cases.

Test Bed Platform Interface Components

These interfaces allow a Test Bed to access remote services (when the interface is locally available as a "Client"), or allow the Test Bed to publish some of its functions as "Services" to other Test Beds or Web front-ends.

Test Agent Interfaces / API: The GITB Testing Framework allows for the cooperation of several Test Beds specialized for different domains/subdomains. In such an environment, a Test Bed may need the Testing Capabilities of other Test Beds. Therefore, a GITB-compliant Test Bed is designed to have the necessary interfaces to communicate with other GITB-compliant Test Beds, so that a test suite execution may involve several Test Beds collaboratively. In such an execution, the driving Test Bed is called the Test Suite Driver, and the other Test Beds (providing their Testing Capabilities to the Test Suite Driver) are called Test Agents. The Test Agent Interface (or API) Client and Service enable this invocation mechanism: the Test Agent Interface Client allows for invoking remote Test Agents, while the Test Agent Service allows a Test Bed to act as a Test Agent for other Test Beds. Test Agent Services include two major services described in Section 6: Test Deployment Services and Test Execution Services.

Registry/Repository Client: This client communicates with the GITB Registry and Repository, which advertises Test Services (and the Test Beds that support them) that are remotely available, and which archives and publishes test artifacts globally.

Table 8-1 shows which testing services are provided or supported (even partially) by each Test Bed component.

Table 8-1: Test Bed Components and Services

    Test Bed Component                                      Test Services supported
    Test Monitoring and Management GUI                      Test Execution Services, Test Design Services
    Test Suite/Case Editor                                  Test Design Services
    Test Suite/Case Instantiator                            Test Design Services
    Messaging Adapters                                      Test Execution Services, Test Deployment Services
    Document Validators                                     Test Execution Services
    Test Suite Engine                                       Test Execution Services
    Test Control Manager [component]                        Test Execution Services
    Test Operation Manager [component]                      Test Execution Services
    Test Deployment Manager [component]                     Test Deployment Services
    Test Artifact Persistence Manager (TAPM) [component]    Test Repository Services
    Test Agent Interfaces / API                             Test Execution Services, Test Deployment Services
    Registry/Repository Client                              Test Repository Services
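As a concrete illustration of the mediation role of the Test Control Manager described above, the following hypothetical Java sketch shows the manager resolving a validator alias from a test case definition, forwarding a delegated validation, and marshalling the resulting report. The registry, the helper names and the header handling are assumptions of the example, not definitions of this CWA.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of the Test Control Manager mediating a validation
    // call from a running Test Suite Engine to a Document Validator.
    class TestControlManager {
        interface DocumentValidator { String validate(byte[] document, String dasId); }
        interface ReportStore { void storeReport(String report); } // stands in for the TAPM

        private final Map<String, DocumentValidator> validatorsByAlias = new HashMap<>();
        private final ReportStore tapm;

        TestControlManager(ReportStore tapm) { this.tapm = tapm; }

        void registerValidator(String alias, DocumentValidator v) {
            validatorsByAlias.put(alias, v);
        }

        // The engine only knows the validator's alias from the test case
        // definition; the manager resolves the alias and forwards the call.
        String validateDocument(String validatorAlias, byte[] document, String dasId) {
            DocumentValidator validator = validatorsByAlias.get(validatorAlias);
            String report = validator.validate(document, dasId);
            String finalReport = "GITB-REPORT-HEADER\n" + report; // enforce the standard report header
            tapm.storeReport(finalReport);                        // persist the test output
            return finalReport;
        }
    }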

9 GITB Test Registry and Repository (TRR)

This section describes the GITB Test Registry and Repository (TRR) adopted in the GITB Framework.

9.1 Overview of the GITB TRR

The GITB Test Registry and Repository is used for managing, archiving and sharing various Test Resources, such as Test Artifacts and Testing Capability Components. These resources can be shared and reused to implement Test Beds and Services. This component is not considered part of a Test Bed, as Test Resources need to be independently deployed, managed and accessed. The contents stored in the GITB TRR are summarized in Table 9-1.

Table 9-1: GITB TRR Contents

Test Artifacts: The TRR contains references to, or the actual, Test Artifacts:
- Test Logic Artifacts: Document Assertion Set, Test Case, Test Suite, Test Assertion;
- Test Configuration Artifacts: Adapter Configuration, Messaging Connection, Message Binding, Test Suite Configuration, Payload File.

Testing Capability Components: The TRR contains references to the actual Testing Capability Components (the components and their APIs, metadata and interfaces), as well as the metadata, interfaces and client APIs for interacting with Testing Capability Components in Remote Test Beds.

To efficiently manage Test Artifacts and Testing Capability Components, the GITB TRR should provide lifecycle services, metadata registration, version control, and indexing/querying for them. To this end, the GITB TRR has four types of models:

Meta model: The meta-information for a test resource, such as user information, copyright, contacts and version.

Test object model: The structure of the Test Artifacts and Testing Capability Components. This can be represented in the form of a test process graph in which nodes represent Test Artifacts and Testing Capability Components, while edges represent the relationships among nodes.

Version model: Dimensions of evolution, such as revisions and variants. It provides operations for retrieving old versions and constructing new versions.

Query model: The structure of query requests and responses.
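The four models could be captured, for instance, by a data sketch along the following lines. This is a hypothetical Java rendering; all field and type names are assumptions of the example rather than normative definitions.

    import java.util.List;

    // Illustrative data sketch of the four GITB TRR models.
    class TrrModels {
        // Meta model: descriptive information about a registered test resource.
        record Meta(String owner, String copyright, String contact, String version) {}

        // Test object model: a node in the test process graph (a Test Artifact
        // or Testing Capability Component) and its relationships (edges).
        record TestObject(String id, String kind, Meta meta, List<Edge> edges) {}
        record Edge(String relation, String targetId) {} // e.g. "uses", "derivedFrom"

        // Version model: revisions and variants of a test object.
        record Version(String objectId, String revision, String variant,
                       String predecessorRevision) {}

        // Query model: the structure of a query request and its response.
        record Query(String keyword, String kind, String versionConstraint) {}
        record QueryResult(List<TestObject> matches) {}
    }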

9.2 Architecture of the GITB TRR

The GITB TRR is based on the ebXML RIM specification (Registry Information Model, ISO 15000-3). It also conforms to the ebXML RS specification (Registry Services, ISO 15000-4). Figure 9-1 shows the architecture of the GITB TRR.

Figure 9-1: GITB Test Registry and Repository Architecture

The GITB TRR consists of the following components and interfaces:

Knowledge Base
- TRR Registry: A registry that contains the references to GITB Test Artifacts and Testing Capability Components.
- TRR Repository: A repository that stores the actual Test Artifacts and Testing Capability Components, such as a document itself or a library of APIs.

Core functions
- Test Artifacts Manager: Enables the Test Artifacts to be managed in terms of creation, approval, query, update, deprecation and removal. The metadata and the contents (e.g., Test Cases, Test Assertions) of a Test Artifact are stored in the TRR Registry and the TRR Repository, respectively.
- Testing Capability Components Manager: Enables the Testing Capability Components to be managed in terms of creation, approval, query, update, deprecation and removal. The metadata and the contents (e.g., Testing Capability Components and their APIs) are stored in the TRR Registry and the TRR Repository, respectively.
- Remote Testing Capability Components Manager: Enables the Remote Testing Capability Components to be managed in terms of creation, approval, query, update, deprecation and removal. The metadata (e.g., the service location information) of the Remote Testing Capability Components is stored in the TRR Registry. If a client API is needed and provided to interface with the Remote Testing Capability Components, that client API is stored in the TRR Repository.

Interfaces
- GITB TRR Services:
  - Messaging Interface: Provides a remote interface based on standard messaging protocols, such as ebXML MS or SOAP, for the interaction with the GITB TRR Client in the GITB Test Bed. It receives a service request from the GITB TRR Client and relays it to the Test Artifacts Manager, the Testing Capability Components Manager, or the Remote Testing Capability Components Manager.
  - Web Interface: Provides a web-based user interface to publish, search and download the Test Artifacts, the Testing Capability Components, and the Remote Testing Capability Components, for users who do not want to use the GITB Test Bed interface.
- GITB TRR Client: Located in the GITB Test Bed; it communicates with the GITB TRR and the GITB Test Deployment Manager. A user who wants to use the GITB Test Bed interface can publish, search and download the Test Artifacts, the Testing Capability Components and the Remote Testing Capability Components through this client.

9.3 Usage Scenarios

GITB TRR for the Test Artifacts

Figure 9-2 shows how the GITB TRR is used for registration and storage of information about reusable Test Artifacts.

(1) The Domain Expert registers particular Test Artifacts in the GITB registry and stores their contents in the repository,
(2) the Repository Client searches for the relevant Test Artifacts, and
(3) the Test Designer deploys the discovered Test Artifacts to the Test Suite.

Figure 9-2: GITB TRR usage for the Test Artifacts

GITB TRR for the Testing Capability Components

Figure 9-3 shows how the GITB TRR is used for registration and storage of information about the reusable Testing Capability Components used to coordinate and implement the Test Bed.

(1) The Test Designer registers the references of specific Testing Capability Components in the GITB registry and stores their API code in the repository,
(2) the Test Designer searches for the relevant Testing Capability Components with the help of the Repository Client,
(3) the Test Designer deploys the discovered Testing Capability Components to the GITB-compliant Test Bed, and
(4) the Test Manager invokes the deployed Testing Capability Components during testing.

Figure 9-3: GITB TRR usage for the Testing Capability Components

GITB TRR for the Remote Testing Capability Components

Figure 9-4 shows how Testing Capability Components can be provided as services via Remote Test Beds. In this scenario, the GITB Test Repository should provide the references to the service location of the Remote Test Bed.

(1) The Test Designer registers Remote Testing Capability Components in the GITB registry,
(2) the Test Designer searches for the relevant services with the help of the Repository Client, and
(3) the Test Agent Client calls the discovered Service remotely in the external Test Bed during testing.

Figure 9-4: GITB Test Repository for the Remote Testing Capability Components
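Taken together, the three usage scenarios suggest a small client surface. The following hypothetical Java sketch summarizes it; the method names are assumptions of the example, as this CWA specifies the TRR in terms of the ebXML RIM/RS specifications rather than a concrete client API.

    import java.util.List;

    // Hypothetical GITB TRR Client covering the three usage scenarios:
    // register a resource, search for it, then deploy or invoke it.
    interface TrrClient {
        // Register metadata in the TRR Registry and store the content
        // (artifact or component API) in the TRR Repository.
        String register(String kind, byte[] content, String metadataXml);

        // Discover relevant resources.
        List<String> search(String kind, String keyword);

        // Download an artifact or component for reuse in a Test Suite or Test Bed.
        byte[] download(String resourceId);

        // Resolve the service location of a Remote Testing Capability
        // Component hosted by an external Test Bed.
        String resolveServiceLocation(String remoteComponentId);
    }

    class TrrClientUsage {
        static void example(TrrClient trr) {
            String id = trr.register("TestCase", new byte[0], "<meta version='1.0'/>");
            List<String> hits = trr.search("TestCase", "HL7 CDA");
            byte[] artifact = trr.download(hits.isEmpty() ? id : hits.get(0));
            // The artifact would now be added to a Test Suite via the editor.
        }
    }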

10 Testing Methodologies and the Scenarios

This chapter describes the testing methodologies and testing scenarios through interaction/sequence diagrams. Finally, their correspondence to the test services is presented in order to cross-check and validate the architecture.

Testing Methodologies

This section describes the testing methodology adopted in the GITB Framework. The methodology is aligned with common testing practice in real-life testing environments.

Using Test Assertions

Ideally, a set of Test Assertions (see definition in Section 2.1.5) has been defined for an ebusiness Specification before a Test Suite and Test Cases are developed. Test Assertions provide a way to bridge the narrative of an ebusiness Specification and the test cases for verifying conformance (or interoperability). Test Assertions help to interpret the specification statements from a testing viewpoint. Test Cases should then be derived from such Test Assertions, as illustrated in Figure 10-1.

Figure 10-1: The Role of Test Assertions

Test Assertions provide a starting point for writing conformance and interoperability test suites. They simplify the distribution of the test development effort between different groups: often, test designers are not experts in the specification to be tested, and need guidance. By interpreting specification statements in terms of testing terms and conditions, Test Assertions improve confidence in the resulting Test Suite and provide the basis for coverage analysis (estimating the extent to which the specification is tested). OASIS has developed Test Assertions Guidelines (TAG) that can be used to help develop Test Assertions.
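For illustration, a single test assertion might be captured as follows. The record fields loosely mirror the OASIS guidelines (identifier, target, normative source, predicate, prescription level), and the concrete values are invented for the example.

    // Illustrative sketch of a test assertion, loosely mirroring the OASIS
    // Test Assertions Guidelines. The example values are invented.
    record TestAssertion(String id, String target, String normativeSource,
                         String predicate, String prescriptionLevel) {}

    class TestAssertionExample {
        static final TestAssertion TA = new TestAssertion(
            "TA-042",
            "Business document: invoice",       // what is being tested
            "Spec X, section 4.2: 'the invoice MUST carry a currency code'",
            "the currency code is present and taken from ISO 4217",
            "MANDATORY");                       // a MUST statement => mandatory
    }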

Standalone Document Validation

Document validation (also sometimes called instance or conformance/unit testing) is a particular form of conformance testing which verifies an artifact (e.g., an HL7 V2 message) against the rules defined in the specification. This form of testing does not directly involve an SUT, but rather a testing artifact (Test Item) that was produced by the SUT. Examples of such testing include validating a Clinical Document Architecture (CDA) document instance against the CDA general rules and document type rules, and validating an HL7 V2 message instance against an HL7 V2 conformance profile.

Figure 10-2: Workflow of a standalone document validation

In standalone document validation, the document under test is obtained somehow by a Test Participant, who directly submits the document to the Test Bed and gets the Test Report back from it. This document validation is thus disconnected from any SUT communication or larger Test Suite execution, as the Test Participant directly controls all inputs to the Test Bed.

SUT-Interactive Conformance Testing

Conformance testing is defined as verifying an artifact (e.g., an HL7 V2 message) against the rules defined in the specification. Interactive conformance testing involves direct interaction between the Test Bed and the SUT, combined with dynamic validation of SUT outputs (document validation). The document validation is usually delegated by the Test Suite Engine to a Document Validator.

Figure 10-3: Sample workflow in interactive conformance testing

In such interactive conformance testing, the Test Participant (or Test Manager) only needs to interact with the Test Bed for controlling the overall execution and getting the final report.
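The delegation step shown in Figure 10-3 corresponds to the ValidateDocument call-back defined for the Test Suite Engine core platform interface (Chapter 8). A minimal, non-normative Java sketch of one such step, with all interfaces assumed for the example:

    // Minimal sketch of the validation delegation in interactive conformance
    // testing: the engine extracts the business document from a received
    // message and delegates its validation via the Test Control Manager.
    class InteractiveConformanceStep {
        interface MessagingAdapter {
            byte[] receiveMessage();                 // SUT sends a message
            byte[] extractDocument(byte[] message);  // message-level handling by the adapter
        }
        interface ControlManager {
            String validateDocument(byte[] document, String dasId);
        }

        static String runOneStep(MessagingAdapter adapter, ControlManager manager,
                                 String dasId) {
            byte[] message = adapter.receiveMessage();
            byte[] document = adapter.extractDocument(message);
            return manager.validateDocument(document, dasId); // document-level validation delegated
        }
    }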

Interoperability Testing

Interoperability is defined as the ability of two SUTs to interact with each other in compliance with the specification. This interaction usually involves data artifacts (e.g. messages) produced by one SUT and consumed by the other. Interoperability testing (see definition in Section 2.1.2) can be conducted in different modes:

(1) Passive interoperability testing: in this mode, the SUTs are not controlled by the Test Suite, i.e. by a Test Bed. The SUTs interact on their own or under regular business activity. The interoperability Test Suite only verifies captured traffic: it is a validating Test Suite.

(2) Directly driven interoperability testing: in this mode, the interoperability Test Suite actively drives one or more SUTs in order to cause them to interact: it is an interacting Test Suite. In addition, the Test Suite (or another one, in the case of two-phase testing, see next section) verifies the captured traffic.

(3) Indirectly driven interoperability testing: in this mode, the SUTs are controlled, though not directly by the Test Bed. Instead, the interoperability Test Suite interacts, via a different channel, with an entity controlling the SUT, e.g. it sends an e-mail to a Test Participant asking her to initiate some message sending from or to the SUT. In addition, the Test Suite (or another one, in the case of two-phase testing, see next section) verifies the captured traffic.

Ideally, the message capture should not interfere with the way the SUTs interoperate as they would under real business conditions. The three most common ways to capture message traffic between SUTs are:

a) Using a man-in-the-middle system operating and re-routing messages at transport level (e.g. an HTTP proxy or a TCP intermediary). This is typically the least intrusive approach, although it imposes restrictive conditions (the messages and sessions must not be encrypted).

b) Instrumenting one of the SUTs so that message capture is performed at the endpoint, e.g. on the message handler of the SUT. Later on, this message capture can be consolidated into a Test Execution Log.

c) Configuring the sending SUT(s) so that they duplicate the messages sent and forward a copy to a Monitoring component or directly to the Test Bed.

Figure 10-4: Basic Interoperability Testing

Two-phase Testing

In two-phase testing, the SUT interaction is decoupled from the actual validation. Phase 1 (SUT interaction) is driven by an interacting Test Suite that produces a Test Execution Log. Phase 2 (validation of the SUT execution log) is driven by a separate validating Test Suite that produces the final Test Report.

Figure 10-5: Two-phase testing

Two-phase testing may apply to both conformance and interoperability testing. It is an alternative to the use of a single interacting/validating Test Suite, as in the interactive conformance testing above. The transition from Phase 1 to Phase 2 may be automated or human-controlled (as in standalone document validation). The two phases may take place at very different locations and times, possibly involving different Test Beds. There are many reasons why such a decoupling may be useful:

- Different test designers and skills may be involved in designing a test suite that interacts with SUTs and a test suite that only does validation (e.g. the latter requires expertise in the ebusiness standard to be tested for, while the former may require good knowledge of the SUTs and their messaging mode).
- The user (Test Participant or Test Manager) may want to delay submitting the execution logs for analysis for various reasons: validating test suites or test services may not yet be available, or the validating test suite is better processed once a complete log is obtained, for easier message correlation and faster processing.
- The user may decide later to run her/his execution log through a different validating test suite, or through several of these. For example, the user may want to re-analyze old logs to make sure that her/his SUT is already compliant with a new profile or a new version.
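A minimal sketch of the two-phase decoupling follows, assuming simple stand-in types for the Test Execution Log and the Test Report; the conformance check itself is a placeholder for real document validation against a Document Assertion Set.

    import java.util.ArrayList;
    import java.util.List;

    // Minimal sketch of two-phase testing: phase 1 only captures traffic into
    // a Test Execution Log; phase 2 validates the log into a Test Report.
    class TwoPhaseTesting {
        record CapturedMessage(String from, String to, String payload) {}

        // Phase 1: an interacting test suite records SUT traffic, no validation.
        static List<CapturedMessage> phaseOneCapture(List<CapturedMessage> observedTraffic) {
            return new ArrayList<>(observedTraffic); // the Test Execution Log
        }

        // Phase 2: a separate validating test suite processes the complete log,
        // possibly much later, elsewhere, or against a different profile.
        static String phaseTwoValidate(List<CapturedMessage> executionLog) {
            long invalid = executionLog.stream().filter(m -> !conforms(m)).count();
            return invalid == 0 ? "PASS" : "FAIL: " + invalid + " non-conforming message(s)";
        }

        // Stand-in for real document validation against a Document Assertion Set.
        static boolean conforms(CapturedMessage m) {
            return m.payload() != null && !m.payload().isBlank();
        }
    }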

Proposed Testing Practices for SUTs

If possible, first perform document/message instance testing: Document/message instance testing eliminates the problems within a single document; the structure and the business rules are checked. After passing document/message instance testing, the System Under Test can be assured that it can generate valid documents/messages.

Always perform conformance testing: Document/message instance testing can assure that an SUT can generate valid documents/messages. However, it cannot guarantee that the SUT can send/receive these messages/documents as defined in the standard. Therefore, through conformance tests, an SUT is tested to check whether it can send/receive messages in the order defined by the standard. In conformance tests, all the other roles with which the SUT communicates according to the specific standard are simulated by the testing applications or Test Beds. Therefore, the SUT is expected to behave as if it were in real-life settings. The business rules that should be applied across documents are also controlled in conformance tests.

Perform interoperability testing after conformance testing, if possible, and design interoperability test suites so that they are not redundant with tests already done during conformance testing: Sometimes fatal errors are found during interoperability testing. If so, the test suites must be designed in such a way that those fatal errors are detected in conformance testing. Through interoperability tests, more than one SUT is tested, and their ability to act in real-life settings is verified. In the certification process, passing conformance testing is sufficient most of the time. However, through interoperability testing, interoperability with other real-life SUTs is tested.

Testing Scenarios

In the following sections, the methodology adopted by the GITB Framework is described through testing scenarios. These scenarios are based on the testing types mentioned above, and the step-by-step interaction of the components and GITB users is explained in detail. The scenarios are representative of the process most users would go through when undertaking conformance and interoperability testing with the testing framework described in this document. They represent a step-wise approach to testing comprehensive ebusiness scenarios and will be instantiated for each use case in Chapter 12 (automotive industry), Chapter 13 (healthcare) and Chapter 14 (public procurement).

Overview of the Testing Scenarios

The testing scenarios start with document validation (Scenario 1), which corresponds to the document/message instance testing described above. They go on to verify business document and business process choreography conformance (Scenario 2), corresponding to conformance testing, and finally extend to interoperability testing (Scenario 3).

Each scenario comprises all phases of the testing activity:
- design and development of the test suite,
- deployment on a Test Bed,
- operation of the test suite, test execution and related collaborative activities,
- publishing and discovery of the test suite,
- maintenance and upgrades of the test suite.

More specifically, the three scenarios are:

Testing Scenario 1: Creation, deployment and execution of a Validating Conformance Test Suite. This Validating Test Suite evaluates conformance of business documents against a business document specification or a profile of it (e.g. HL7). Test resources for this ebusiness Specification (test cases and document assertion sets) are expected to be available to the test designer. The test suite will not interact directly with any SUT, but will verify business documents submitted either by an end-user (e.g. via a Web browser) or by a Test Bed component.

Testing Scenario 2: Creation, deployment and execution of an [SUT] Interacting Conformance Test Suite. This Interacting Test Suite uses the Validating Test Suite (from Testing Scenario 1) for verifying conformance of an SUT against ebusiness Specifications and profiles that specify business documents and business process choreography. In this scenario, the reuse of the Validating Test Suite (Testing Scenario 1) may involve a different Test Bed, via its Test Execution Service. The existing test cases that handle SUT interaction for the messaging protocol of interest (either for sending or receiving messages) are reused to accommodate the B2B profile.

Testing Scenario 3: Creation, deployment and execution of an [SUT] Interacting Interoperability Test Suite. This Interacting Test Suite is intended for interoperability testing. Unlike Testing Scenario 2, the test suite will interact with two or more SUTs to verify that these SUTs interoperate properly in the most common ebusiness scenarios, while complying with the ebusiness Specifications and profiles that were already tested for each SUT separately in Testing Scenario 2. This new interacting interoperability test suite will also make use of the Validating Test Suite (from Testing Scenario 1) for verifying the business documents exchanged.

Testing Scenario 1: Creation, Deployment and Execution of a Validating Conformance Test Suite

The activities and Test Bed components involved in this scenario are illustrated in Figure 10-6:

Figure 10-6: Testing Scenario 1

Table 10-1 decomposes the scenario into a flow of activities, identifying how the Testing Framework supports these activities by means of Test Services, Test Artifacts and Test Bed Components.

Table 10-1: Decomposition of Testing Scenario 1

Activity: Identify Test Scope. This step consists of clearly defining the scope of the conformance testing. It identifies the ebusiness Specifications to be tested, more specifically the business documents and the Profile to be verified for these, as well as the tests to be performed.
Test Services: Test Repository Services.
Test Artifacts: Test Assertions.
Test Bed Components: Test Artifacts Repository.

Activity: Search for existing Test Cases. This step consists of researching existing test cases for these business documents, so that the Test Designer has a starting point to define test cases for this particular Profile.
Test Services: Test Repository Service - browse Document Assertion Sets related to the B2B specification of interest in related archives; select Test Cases involving such assertions (if any).
Test Artifacts: Document Assertion Sets, Test Cases.
Test Bed Components: Test Artifacts Repository.

Activity: Designing new Test Cases. This step consists of creating or deriving new Profile test cases from those existing for this type of business documents, for addressing the new Profile.
Test Services: Test Design Service - create and modify Document Assertion Sets (related to the new Profile); create or duplicate (validating) Test Cases making use of such assertion sets.
Test Artifacts: Document Assertion Sets, Test Cases.
Test Bed Components: Test Suite/Case Editor; Test Artifact Persistence Manager (TAPM) (the design work is persisted in the TAPM).

Activity: Design a new Validating Test Suite. This step consists of packaging all Profile Test Cases into a new Validating Test Suite.
Test Services: Test Design Service - create a new (validating) Test Suite; add the previous (validating) Test Cases to the Test Suite.
Test Artifacts: Test Suite, Test Cases.
Test Bed Components: Test Suite/Case Editor.

Activity: Deployment of the new Validating Test Suite. In this step, the new Validating Test Suite is deployed on a Test Agent.
Test Services: Test Deployment Service - configure and deploy the new (validating) Test Suite, i.e. upload it on the Test Bed component that will execute it, and make it available to other Test Beds or to end-users via a Web browser.
Test Artifacts: Test Suite, Test Suite Configuration.
Test Bed Components: Test Agent (as the deployment target, more particularly its Test Deployment Manager); Test Artifact Persistence Manager (TAPM); Web Agent Driver (if Web access is needed on that Test Agent).

Activity: Registration of the new Validating Test Suite. In this step, the new test suite is registered and archived in the Test Repository.
Test Services: Test Repository Service - archive the new (validating) Test Suite, for future discovery.
Test Artifacts: Test Suite.
Test Bed Components: Test Repository (registration/archiving).

Activity: Quality Assurance on the new Test Suite. Beta users test the new Validating Test Suite. They access its references from the Test Repository, and use the Test Agent where the suite is deployed.
Test Services: Test Repository Service - get details on the new test suite, along with sample documents. Test Execution Services - run a (validating) Test Suite over an uploaded document; get the Test Report.
Test Artifacts: Test Suite, Payload Files (business documents under test), Test Report.
Test Bed Components: Test Repository; Test Agent; Web Agent Driver (for driving the Test Agent via a Web browser).

Activity: Issue Filing and Tracking. Early users file issues about the new Validating Test Suite.
Test Services: Test Execution Services - report an issue and get the status of issues for the test suite.
Test Artifacts: Payload Files (business documents under test), Test Report.
Test Bed Components: Test Issue Tracker.

Testing Scenario 2: Creation, Deployment and Execution of an [SUT] Interacting Conformance Test Suite

The activities and Test Bed components involved in this scenario are illustrated in Figure 10-7. In a first option, all testing capabilities are supported by the same Test Bed:

Figure 10-7: Testing Scenario 2 (Option "Single Test Bed")

In the second option, the main test suite of a Test Bed delegates some document validation tasks to another Test Bed, as shown in Figure 10-8.

Figure 10-8: Testing Scenario 2 (Option "Distributed")

Table 10-2 decomposes the scenario into a flow of activities, identifying how the Testing Framework supports these activities by means of Test Services, Test Artifacts and Test Bed Components. Note that the Validating Test Suite used in this scenario as a resource for the Interacting Test Suite is assumed to be available and deployed, e.g. as a result of executing the previous Testing Scenario 1.

Table 10-2: Decomposition of Testing Scenario 2

Activity: Test Case Prototype Design. Basic SUT-interacting Test Case patterns are identified and downloaded for reuse. For example: (a) test case with user-request trigger: the test case sends an e-mail to the SUT operator to ask her/him to activate the SUT to send a particular business document; (b) test case with interaction-based trigger: the test case sends a message to the SUT to cause the sending of the business document as a response.
Test Services: Test Repository Services. Test Design Service - create (or duplicate) Test Case prototypes.
Test Artifacts: Test Case. (NOTE: these test cases address ebusiness document standards and/or their choreography in related business transactions.)
Test Bed Components: Test Artifacts Repository; Test Suite/Case Editor.

Activity: Development of the Interacting Test Suite. This step consists of deriving appropriate Interacting Test Cases from the previous prototypes.
Test Services: Test Design Service - update (copies of) Test Case prototypes; create a Test Suite (add Test Cases); configure the Test Suite (with an existing Message Binding).
Test Artifacts: Test Suite, Test Case, Message Binding, Test Suite Configuration.
Test Bed Components: Test Suite/Case Editor; Test Artifact Persistence Manager (TAPM) (the design work is persisted in the TAPM).

Activity: Deployment of the new Interacting Test Suite. In this step, the new Interacting Test Suite is deployed on a Test Suite Engine.
Test Services: Test Deployment Service - deploy the new (interacting) Test Suite (i.e. upload it on the Test Bed that will execute it).
Test Artifacts: Test Suite.
Test Bed Components: Test Suite Engine (more particularly its Test Deployment Manager); Test Artifact Persistence Manager (TAPM).

Activity: Registration of the new Interacting Test Suite. In this step, the new test suite is registered and archived in the Test Repository.
Test Services: Test Repository Service - archive the new (interacting) Test Suite, for future discovery.
Test Artifacts: Test Suite.
Test Bed Components: Test Repository (registration/archiving).

Activity: Quality Assurance on the new Test Suite. Beta users test the new Interacting Test Suite. They access its references from the Test Repository, and use the Test Bed (Test Suite Engine) where the suite is deployed.
Test Services: Test Repository Service - get details on the new test suite. Test Execution Services - run the Test Suite; get the Test Report.
Test Artifacts: Test Suite, Test Report.
Test Bed Components: Test Repository; Test Suite Engine.

Activity: Scheduling and Preparing for a Test Run. In this step, the Test Suite is configured for a particular test session. Users are invited to register and to pre-test some of their documents for conformance, e.g. using the related Validating Test Suite (see Scenario 1) as a separate service.
Test Services: Test Deployment Services (for late configuration) - configure the Test Suite (business actors). Test Execution Services - register user(s); run the (validating) Test Suite from Scenario 1 used by this Test Suite, for pre-testing purposes over users' documents; get the Test Report.
Test Artifacts: Test Suite, Test Suite Configuration, Payload Files (business documents under test), Test Report (for pre-tests).
Test Bed Components: Test Suite Engine (more particularly its Test Deployment Manager and TAPM); Test Operation Manager (a component that provides all administrative functions: user registration and collaboration, test run planning and workflow).

Activity: Doing an Interaction Test Run. In this step, the Interacting Test Suite is run against an SUT.
Test Services: Test Execution Services - start the (SUT-interacting) Test Suite (this test suite will invoke a Test Agent, deployed in Scenario 1, to verify documents); stop/start the test suite as needed after each exchange; get the execution status at any time; get the Test Report.
Test Artifacts: Test Suite, Test Case, Test Report.
Test Bed Components: Test Suite Engine; Test Operation Manager.

Activity: Logging, archiving and download of the Test Report. In this step, the resulting Test Report is archived and retrieved.
Test Services: Test Repository Service - archive the Test Report; archive the Execution Log (if available); browse Test Reports.
Test Artifacts: Test Execution Log, Test Report.
Test Bed Components: Test Repository; Test Suite Engine (more particularly its TAPM, where the reports are first generated).

Testing Scenario 3: Creation, Deployment and Execution of an [SUT] Interacting Interoperability Test Suite

This Interoperability Test Suite will verify that SUTs are able to interoperate with each other, and that they do so while complying with the ebusiness Specifications addressed by the test suite: here, ebusiness document profiles and the related process choreography. The activities and Test Bed components involved in this scenario are illustrated in Figure 10-9.

Figure 10-9: Testing Scenario 3

Table 10-3 decomposes the scenario into a flow of activities, identifying how the Testing Framework supports these activities by means of Test Services, Test Artifacts and Test Bed Components. In this Testing Scenario, the SUTs involved are assumed to have been previously tested for conformance (e.g. by undergoing Testing Scenario 2).

Table 10-3: Decomposition of Testing Scenario 3

Activity: Test Case Prototype Design. This step is similar to Scenario 2, except that the Test Case prototypes are more complex, as they must control the exchange between two or more SUTs. Either one of two major options can be chosen: (1) the outcome of the Test Suite is an Execution Log (not a Test Report): the Test Cases only do message capture (no validation) and add to an Execution Log, to be validated separately later (possibly by a different Validating Test Suite); (2) the outcome of the Test Suite is a Test Report: the Test Cases do both message capture and validation.
Test Services: Test Repository Services. Test Design Service - create (or duplicate) Test Case prototypes.
Test Artifacts: Test Case. (NOTE: these test cases address ebusiness document standards and/or their choreography in related business transactions.)
Test Bed Components: Test Artifacts Repository; Test Suite/Case Editor.

Activity: Development of the Interacting Test Suite. This step consists of deriving appropriate Interacting Test Cases from the previous prototypes.
Test Services: Test Design Service - update (copies of) Test Case prototypes; create a Test Suite (add Test Cases); configure the Test Suite (with an existing Message Binding).
Test Artifacts: Test Suite, Test Case, Message Binding, Test Suite Configuration.
Test Bed Components: Test Suite/Case Editor; Test Artifact Persistence Manager (TAPM) (the design work is persisted in the TAPM).

Activity: Deployment of the new Interacting Test Suite. In this step, the new Interacting Test Suite is deployed on a Test Suite Engine.
Test Services: Test Deployment Service - deploy the new (interacting) Test Suite (i.e. upload it on the Test Bed that will execute it).
Test Artifacts: Test Suite.
Test Bed Components: Test Suite Engine (more particularly its Test Deployment Manager); Test Artifact Persistence Manager (TAPM).

Activity: Registration of the new Interacting Test Suite. In this step, the new test suite is registered and archived in the Test Repository.
Test Services: Test Repository Service - archive the new (interacting) Test Suite, for future discovery.
Test Artifacts: Test Suite.
Test Bed Components: Test Repository (registration/archiving).

Activity: Scheduling and Preparing for a Test Run. In this step, the Test Suite is configured for a particular test session, i.e. it is instantiated for specific SUT addresses and participants, and specific run-time parameters. Users are invited to register their SUTs.
Test Services: Test Deployment Services (for late configuration) - configure the Test Suite (business actors). Test Execution Services - register user(s).
Test Artifacts: Test Suite, Test Suite Configuration.
Test Bed Components: Test Suite Engine (more particularly its Test Deployment Manager and TAPM); Test Operation Manager.

Activity: Doing an Interaction Test Run. In this step, the Interacting Test Suite coordinates the B2B choreography between the SUTs.
Test Services: Test Execution Services - start the (SUT-interacting) Test Suite; start/stop/resume the test suite as needed after each step; get the Test Report.
Test Artifacts: Test Suite, Test Case, Test Report.
Test Bed Components: Test Suite Engine; Test Operation Manager.

Activity: Logging, archiving and download of the Test Report. In this step, the resulting Test Report is archived and retrieved.
Test Services: Test Repository Service - archive the Test Report; archive the Execution Log (if available); browse Test Reports.
Test Artifacts: Test Execution Log, Test Report.
Test Bed Components: Test Repository; Test Suite Engine (more particularly its TAPM, where the reports are first generated).

Interactions among Test Bed Components for Enabling Testing Scenarios

The following subsections describe the interactions among the internal components of the GITB architecture in the generic testing scenarios.

Testing Scenario 1

Figure 10-10 shows the interactions among the internal components of a GITB Test Bed in the test design phase of Testing Scenario 1. First, the user logs in to the system through the Test Monitoring and Management GUI. After that, s/he queries both the local Test Artifact Persistence Manager and the GITB Test Registry/Repository to check whether similar test cases have been designed before. Accordingly, the test designer creates/updates the test case, adds the test case to a test suite and deploys the test suite through the Test Suite/Case Editor. Finally, the test suite/case is stored in the local Test Artifact Persistence Manager.

Figure 10-10: Testing Scenario 1 - Test Case Design Phase
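Condensed into code, the design phase of Figure 10-10 might look as follows; every interface here is an assumption standing in for the corresponding component (TAPM, TRR Client, Test Suite/Case Editor), not a defined API.

    import java.util.List;

    // Condensed sketch of the Testing Scenario 1 design phase (Figure 10-10).
    class DesignPhase {
        interface Tapm { List<String> findSimilarTestCases(String keyword); void persist(String suite); }
        interface TrrClient { List<String> search(String kind, String keyword); }
        interface Editor {
            String createOrUpdateTestCase(String basedOn);
            String addToSuite(String testCase);
            void deploy(String suite);
        }

        static void design(Tapm tapm, TrrClient trr, Editor editor, String topic) {
            var local = tapm.findSimilarTestCases(topic);   // check the local TAPM first
            var global = trr.search("TestCase", topic);     // then the GITB TRR
            String basis = !local.isEmpty() ? local.get(0)
                         : (!global.isEmpty() ? global.get(0) : null);
            String testCase = editor.createOrUpdateTestCase(basis); // reuse if anything was found
            String suite = editor.addToSuite(testCase);
            editor.deploy(suite);
            tapm.persist(suite);  // the design work is persisted in the TAPM
        }
    }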

Figure 10-11 depicts the sequence flow in the test case execution phase of Testing Scenario 1. In this case, the SUT Operator logs in to the system through the Test Monitoring and Management GUI and selects the appropriate test suites/cases. After that, the user selects the document type against which the documents will be validated. Then, the SUT Operator uploads his/her documents to the system and initiates the validation. In the background, the Test Suite Engine executes the test case, whereby the appropriate Document Validator is selected and run. Finally, the Test Suite Engine gets the validation report from the Document Validator and delivers the result to the Test Monitoring and Management GUI, where it is displayed to the SUT Operator.

Figure 10-11: Testing Scenario 1 - Test Case Execution Phase

Testing Scenario 2

Figure 10-12 presents the interaction diagram for the test case design phase of Testing Scenario 2, where a conformance test case is created. As in Testing Scenario 1, after logging in to the system, the test designer first queries both the local Test Artifact Persistence Manager and the GITB Test Registry/Repository to check whether similar test cases have been designed before. Accordingly, the test designer creates/updates the test case. During test case creation, the test designer obtains the list of available Messaging Adapters and Document Validators and assigns the suitable ones to the test case. Afterwards, the user adds the test case to a test suite and deploys the test suite through the Test Suite/Case Editor. Finally, the test suite/case is stored in the local Test Artifact Persistence Manager.

Figure 10-12: Testing Scenario 2/3 - Test Case Design Phase

As noted previously, the test cases are templates: they do not contain technical details such as the IP address of the sender or receiver. Therefore, they have to be configured or instantiated by the SUT Operators before execution. Figure 10-13 shows the interactions among the components in the instantiation phase. The SUT Operator (or another user, called the Test Manager) logs in to the system and selects a test suite/case for configuration. After that, the user instantiates/configures the test case using the Test Suite/Case Instantiator and stores the resulting instance in the Test Artifact Persistence Manager.

Figure 10-13: Testing Scenario 2/3 - Test Case Instantiation/Configuration Phase

In the execution of Testing Scenario 2, it is assumed that the Validating Test Suite developed in Testing Scenario 1 is deployed on a Test Agent, and that this Test Agent exposes its validation capability through its Test Agent Interfaces/API component. In other words, the documents will be validated by an external GITB-compliant Test Agent. Figure 10-14 shows the interactions. After logging in to the system and retrieving the test case instance, the SUT Operator starts the execution of the test case instance. The Test Suite Engine initiates the necessary Messaging Adapter for message sending or retrieval (assume that in the first step a message will be received from the SUT). The SUT Operator signals his/her SUT to send sample messages to the GITB system. In return, the Messaging Adapter gets the message, performs message-level validation, extracts the document from the message and delivers it to the Test Suite Engine. After that, the Test Suite Engine (according to the test case instance) sends this document to the Test Agent Interfaces/API component of the Test Agent of Testing Scenario 1 for document-level validation. After receiving the validation report, the Test Suite Engine adds the report to the Test Artifact Persistence Manager and displays it to the SUT Operator through the Test Monitoring and Management GUI. Finally, it sends an acknowledgement message to the SUT through the suitable Messaging Adapter.

Figure 10-14: Testing Scenario 2 - Test Case Execution Phase
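Condensed into code, the execution sequence of Figure 10-14 might look as follows; each interface is an assumption standing in for the corresponding component.

    // Condensed sketch of the Testing Scenario 2 execution sequence
    // (Figure 10-14). Names and signatures are assumptions for illustration.
    class Scenario2Execution {
        interface MessagingAdapter {
            byte[] receiveAndValidateMessage();   // message-level validation included
            byte[] extractDocument(byte[] msg);
            void sendAck();
        }
        interface RemoteTestAgent { String validateDocument(byte[] document); } // Test Agent Interfaces/API
        interface Tapm { void storeReport(String report); }
        interface Gui { void display(String report); }

        static void execute(MessagingAdapter adapter, RemoteTestAgent agent,
                            Tapm tapm, Gui gui) {
            byte[] message = adapter.receiveAndValidateMessage(); // SUT operator triggers the SUT to send
            byte[] document = adapter.extractDocument(message);
            String report = agent.validateDocument(document);    // delegated to the external Test Agent
            tapm.storeReport(report);                             // persist the validation report
            gui.display(report);                                  // show it to the SUT operator
            adapter.sendAck();                                    // acknowledge to the SUT
        }
    }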

Testing Scenario 3

In Testing Scenario 3, where the interoperability of two SUTs is tested, the test design phase and the instantiation phase are the same as in Testing Scenario 2. In this case, the instantiation is performed by both SUT Operators. The execution phase is displayed in Figure 10-15. The GITB Test Bed acts as a proxy in interoperability testing: the SUTs send/receive their messages to/from the GITB Test Bed. As in the previous scenarios, the execution starts with the SUT Operators logging in to the system. Note that in interoperability testing both SUT Operators are usually logged in simultaneously; however, they may also log in to the system at different times. In this scenario, assume that both of them are logged in. The SUT Operator first selects the test case instance to be executed. After the execution starts, the Test Suite Engine initiates the necessary Messaging Adapter to retrieve the message from the first SUT. The SUT Operator signals his/her SUT to send a sample message, and in return the Messaging Adapter receives the message, performs message-level validation, extracts the document and delivers it to the Test Suite Engine. After that, the Test Suite Engine (according to the test case instance) sends this document to the Test Agent Interfaces/API component of the Test Agent of Scenario 1 for document-level validation. After receiving the validation report, the Test Suite Engine adds the report to the Test Artifact Persistence Manager and displays it to the SUT Operator through the Test Monitoring and Management GUI. Then, it forwards the original message to the receiving SUT through the suitable Messaging Adapter. The sending of the acknowledgement proceeds in the same way as the sending of the initial message: at this point, the SUT Operator of the receiving SUT drives the test execution. In other words, the other SUT Operator logs in to the system and makes his/her SUT send an acknowledgement. The sequence of actions shown in Figure 10-15 is then applied to the acknowledgement.

Figure 10-15: Testing Scenario 3 - Test Case Execution Phase

Part III: GITB Application and Validation based on Use Cases from the Automotive Industry, Healthcare and Public Procurement (Target Group: Test Bed Users, ebusiness Experts, SDOs)

Part III of this report describes testing scenarios that are based on the previously selected business Use Cases from three industries. The objective is to validate the GITB Testing Framework against real-world testing use cases. To this end, the ebusiness scenarios defined for each Use Case are translated into Testing Requirements and Testing Solutions, which are described in terms of the GITB Testing Framework. The general approach is first presented in Chapter 11, before its application to the use cases is presented:

- in the automotive industry, long distance supply chains (Chapter 12);
- in ehealth, Health Level Seven (HL7) v3 scenarios, including scenarios for cross-border sharing of Electronic Healthcare Records (EHRs) using the HL7 Clinical Document Architecture (CDA) (Chapter 13);
- in public procurement, the PEPPOL and CEN/BII eprocurement scenarios (Chapter 14).

11 Applying GITB in Use Cases

11.1 Approach

Figure 11-1 outlines a step-wise approach for translating ebusiness scenarios into testing requirements and creating testing solutions based on the GITB Framework. The starting point is the business user's need for testing one or more ebusiness Scenarios. Business users define the relevant set of ebusiness Specifications as well as the actors involved in the ebusiness interactions, with their roles. From the ebusiness Scenarios, the people testing the business scenarios (typically business users responsible for implementation, or the integrators or software vendors working with the business users) elaborate two types of testing requirements. On the one hand, they analyze what to test, by deriving the exact verification scope from the ebusiness Specifications. On the other hand, they determine how to test, by specifying the testing environment with its operational requirements. From the testing requirements, the test designers and test managers can set up the appropriate Test Services and Test Artifacts supporting the Testing Scenarios.

Figure 11-1: Applying GITB in Use Cases

11.2 Deriving Testing Requirements from ebusiness Scenarios

From the ebusiness Scenarios, business users define the relevant set of ebusiness Specifications as well as the actors. Two types of testing requirements have to be taken into account prior to designing a testing solution: the verification scope («What to test?»), which can be derived from the ebusiness Specifications, and the operational requirements of the testing environment («How to test?»), which have to be met by an appropriate testing solution.

Verification Scope («Test Patterns»)

When implementing ebusiness Scenarios, business users define the relevant set of ebusiness Specifications, which need to be analyzed for deriving the verification scope. The following sections describe the verification scope («test patterns») for state-of-the-art ebusiness scenarios, by referring to the different layers of ebusiness: the business process, business document and messaging layers.

At the Business Document Layer, the verification scope may comprise structural and semantic validations as well as cross-layer validations with the Messaging Layer (cf. Table 11-1).

Table 11-1: Verification Scope («Test Patterns») for Business Document Layer Validation

Structural validation
- Document syntax and structure: testing whether messages conform to the message definitions, e.g. as defined by EDIFACT or XML document schemas (XSD)
- Data types: e.g. as defined by the UN/CEFACT Core Data Type Catalogue (CDT Catalogue)

Semantic validation
- Document assembly: testing whether messages conform to naming and design rules, e.g. as defined by OAGi or UN/CEFACT Core Components Message Assembly (CCMA)
- Mandatory / optional fields: testing whether all mandatory fields are correctly filled, e.g. as defined by the content definition (e.g. XSD)
- Vocabulary and code list verification: testing whether data fields comply with defined vocabularies, code lists (e.g. DUNS, ISO, UNECE, ...) or core components (e.g. UN/CEFACT CCL, ...)
- Business document header definitions: testing whether document headers are correct, e.g. as defined by the UN/CEFACT Standardized Business Document Header (SBDH) or the OAGi BOD's application area
- Business rules: testing of business rules, e.g. as specified by Schematron

QoS and other validations
- Equivalent business document versions: testing whether «equivalent» versions of the same document could be used for the same transaction
- Equivalent version syntax: testing whether «equivalent» document syntaxes could be used for the same transaction, e.g. different implementations of syntax-neutral business document specifications
- Consistency of message header and business document content: testing whether the message header and the business document content are aligned

At the Messaging Layer, the verification scope comprises structural validation, such as testing messaging protocols, validating message headers and testing the discovery of endpoints. It may also comprise validations for QoS and other validations (cf. Table 11-2).

Table 11-2: Verification Scope («Test Patterns») for Messaging Layer Validation

Structural validation
- Messaging protocol: testing transport- and communication-level protocols, e.g. as defined by ebXML Messaging (ebMS), SOAP, EDIFACT, ANSI, the RosettaNet Implementation Framework (RNIF) or the Minimal Lower Layer Message Transport protocol (MLLP)
- Message header: testing whether the message header is valid
- Addressing: testing the discovery of endpoints

Quality of service (QoS) validation
- Security: testing security protocols, e.g. as defined by WS-Security
- Other QoS: testing QoS, e.g. as defined by WS-Policy

Others
- Equivalent messaging styles / formats / versions: testing «equivalent» messaging styles, formats or versions that could be used for the same transaction

At the Business Process Layer, structural validation comprises testing message sequences and process choreography, the correct interpretation of roles, as well as timing conditions. In addition, cross-layer validations (or profile validation) are performed with the Business Document and Messaging Layers (cf. Table 11-3).

Table 11-3: Verification Scope («Test Patterns») for Business Process Layer Validation

Structural validation
- Sequence of messages / choreography: testing the correct sequence of messages, e.g. as defined by sequence diagrams; testing process choreographies which are informally or formally defined
- Roles: testing the different roles within a business process, e.g. senders and receivers of messages
- Timing conditions: testing the timing conditions in business transactions, e.g. as defined by triggering events or reaction times

Cross-layer validation / profile validation
- Data consistency across business documents: testing data relationships across different messages, e.g. as defined by a common information model
- Restrictions on the business document format and content: testing syntactic and semantic restrictions on the business document format and content
- Restrictions on the message header: testing restrictions on the message header and the consistent use of the conversation ID
- Restrictions on transport protocols: testing restrictions on, and the correct use of, transport protocols
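As a concrete instance of structural validation at the business document layer ("document syntax and structure" in Table 11-1), an XML business document can be checked against its XML schema with the standard Java validation API; the file names below are placeholders.

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    // Checking an XML instance against its XSD with the standard Java
    // validation API. File names are placeholders.
    public class DocumentSyntaxCheck {
        public static void main(String[] args) throws Exception {
            SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("Invoice.xsd"));
            Validator validator = schema.newValidator();
            try {
                validator.validate(new StreamSource(new File("invoice-instance.xml")));
                System.out.println("Document conforms to the schema");
            } catch (org.xml.sax.SAXException e) {
                System.out.println("Structural validation failed: " + e.getMessage());
            }
        }
    }

Semantic checks such as business rules would typically be layered on top of this, e.g. with Schematron, as noted in Table 11-1.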

Operational Requirements («Testing Environment»)

The testing environment determines operational testing requirements that have to be met by an appropriate testing solution:

The testing context (cf. Section 3.4.2) denotes the situation in which testing is performed:
- during standard development, for quality assurance of the developed ebusiness Specifications;
- during implementation of new, or upgrade of existing, ebusiness endpoints (onboarding of new partners);
- during ebusiness operations.

Testing integration in the business environment: Several possibilities exist with regard to integration in the business environment.
- The testing system is the in-production system: the user wants to test in the in-production system under exact business conditions (with the same firewall, security and ebusiness gateway setup).
- The testing system is a non-production system: the user does not want to disturb the currently deployed in-production system, but wants to test a system that is configured differently from the current production system.
- No integration in the business environment: in this case, testing is not integrated with the business environment at all, but is done manually.

Testing location:
- On-premise testing: in this case, end-users do not want to access a remote server to test their own ebusiness endpoints. Instead, they download and install a test server, along with automated test suites. On-premise testing avoids external access to the in-production system and reconfiguration of the firewall. It provides the convenience of local control of the test environment. It requires that end-users have the IT expertise to do testing on site.
- Remote testing: in this case, the end-user does not have to handle any test equipment locally, e.g. because of the IT overhead of doing so, or because the SUT is to be tested exactly in its production context (not in an off-production test harness). Testing may be controlled by the user (remotely) or operated by a third party.
- Combination of remote and on-premise testing: a combined approach is appropriate if end-users want to decouple test execution from test analysis. As an example, test driving is local, on user premises, whereas test analysis relies on remote services.
- Testing workshops: in this case, a testing workshop is organized with different test participants.

Testing topology:
- direct connection of systems (point-to-point),
- mediation via a business hub,
- mediation via a testing hub.

11.3 Deriving Test Scenarios and Solutions

The GITB methodology for creating testing solutions for ebusiness Specifications relies on the step-wise approach presented in Section 11.1. Ideally, the Testing Scenarios are performed sequentially, starting with standalone document validation (Testing Scenario 1), going on to interactive conformance testing (Testing Scenario 2), and finishing with interoperability testing (Testing Scenario 3). Figure 11-2 describes how the three testing scenarios differ in terms of verification scope and integration in the business environment.

[Figure 11-2 plots the three testing scenarios against the verification scope layers (business document, messaging and business process layers, with their structural, semantic, QoS and cross-layer/profile validations) and against the degree of SUT interaction: manual testing with no interaction with an SUT (Test Scenario 1), interaction with an SUT (Test Scenario 2), and interactions between SUTs (Test Scenario 3).]

Figure 11-2: Testing scenarios, requirements and integration in business environment

To set up the test bed and realize the testing scenarios, test designers and test managers will search for existing components and artifacts using the TRR. If no existing resources are available, they will have to create the necessary Testing Capability Components and artifacts. If a GITB-compliant test bed is available, it will provide the non-core components and be used as the testing platform.

12 Use Case 1: Long Distance Supply Chains in the Automotive Industry

In this section, the business and testing requirements for the Material Off-Shore Sourcing (MOSS) use case are defined:

1. ebusiness Scenarios: this section takes the business user's point of view.
2. Testing Requirements - what and how to test? This section takes the perspective of the people testing the business scenarios, typically business users responsible for implementation, or the integrators or software vendors working with the business users.
3. Testing Solutions - how to use and apply the GITB Testing Framework? This section takes the perspective of the software architect responsible for utilizing the GITB Framework to set up the appropriate Test Services and Test Artifacts to support the Testing Scenarios.

ebusiness Scenarios

The Materials Off-Shore Sourcing (MOSS) project is designed to provide enhanced electronic communication between trading partners, including OEMs, Tier 1 and sub-tier suppliers, transport providers, logistics and related service providers, throughout the intercontinental supply chain. Information showing the need and interest for this project was gathered from 210 participants as a result of a survey designed by the Automotive Industry Action Group (AIAG) and AMR Research. Survey results are included in various sections of this business case.

The project includes the analysis of the existing business processes and technical processes utilized to move goods from the foreign supplier to the domestic ship-to party, and only covers those aspects of manufacturing affected by shipping. It does not include sales and marketing, order management, replenishment, those parts of manufacturing not directly affected by shipping, or accounts receivable.

Of the 210 survey respondents, the following summarizations were made:
1. 91% still use manual procedures to correct shipments, communicate status and visibility.
2. ...% of all shipments experience delays due to inaccurate or incomplete data.
3. ...% believe standardizing the exchange of information will reduce disruptions in the supply chain.
4. ...% believe improvement in long distance supply chains is needed.

Recurring problems involving the extensive use of paper documents, e-mails and faxes to effect these complex material movements cause compliance problems, data quality deficiencies and visibility deficiencies, all resulting in avoidable delays and the expenditure of additional resources for problem resolution. Currently, the information used by trading partners to effect the intercontinental movement of goods is a cobbled-together, paper-based system. Pockets of EDI exist, but they depend upon proprietary message implementation guides, which make them unable to pass data to the next trading partner. As a result, trading partners resort to paper, e-mail, fax and phone. Figure 12-1 is an example of the complexities of the automotive long distance supply chain problem.

[Figure 12-1: Complexities of Long Distance Supply Chain. The figure summarizes the inbound process flow from a foreign supplier to the USA: the milestone reporting codes (e.g. PAS Picked up at Supplier, DOS Departed on Ship, DTD Delivered to Destination), the messages and documents exchanged at each step (e.g. ORDERS/850, DELFOR/830, DELJIT/862, DESPATCH/856, IFTMBF/300, customs manifests), the triggering events from "Supplier goods ready for shipping" to "Goods received by Ship To Party", and the actors involved (Supplier, Material Manager, truckers, consolidation center, freight forwarder, port authorities, customs, carriers, deconsolidation center, ship-to party).]

MOSS addresses these problems by identifying the 22 message sets and the data mapping at the data element level to ensure interoperability. In a proof of concept exercise conducted in early 2009, MOSS demonstrated that 92% of all the data used in the supply chain is known upfront via the order, forecast, shipping schedule and general shipping instructions. Only 8% of the data is inserted in process, e.g. container number, seal number, vessel name, a change in quantity, etc. And 100% of all data, once input, is reusable; there should be zero rekeying. For example, a Supplier need only add a single data element to issue a complete and timely advance shipping notification (ASN). The messages used are primarily EDIFACT EDI messages. Additionally, an eInvoice and ePacklist are used; they are based on early versions of UNeDocs defined by the United Nations Centre for Trade Facilitation and Electronic Business (UN/CEFACT). All recommendations are based upon the UNTDED/ISO 7372 United Nations Trade Data Element Dictionary.

There are two principal problems that make providing an efficient supply chain difficult.
The first problem is that the channel through which information about the supply chain flows, that is, the paper, phone calls, faxes and EDI messages, is error prone and inefficient. The second problem is that establishing that channel is costly. There are good reasons why one supply chain differs from another, but there are no good reasons for how difficult it is to establish one. Making the arrangements by which parties communicate with each other, and identifying what they need to communicate, is extremely difficult.

The first problem is solved, in part, by defining the information requirements of the EDI messages driving the process not only within each message, but also between messages. MOSS has defined

these through the MOSS Data Matrix developed in conjunction with NIST. Figure 12-2 is an overview of the various actors and the messages and documents they receive and create.

[Figure 12-2: Long Distance Supply Chain Activity Diagram. For each actor (Supplier, 3PL, Freight Forwarder, Ocean Carrier, Customs Broker, Importer) the figure lists the messages and documents received and created, e.g. ORDERS, DELFOR, DELJIT, DESADV, INVOIC, eDoc Invoice, 204 Load Tender, Request for Transport, IFTMIN, IFTMBF, IFTMBC, IFTMCS, CAMIR, CUSCAR/309, Customs Entry, RECADV.]

The second problem is solved architecturally. Supply chains that grow over long periods of time, out of basic necessity and the nearest convenient communication medium (typically paper documents and the telephone), miss opportunities for optimization across steps of the process. Instead of a clean break, each new requirement is addressed by adding an additional notify party, an additional document transfer, a phone call, etc., until things become quite complex, invisible and costly. See Illustration 1 in Figure 12-3.

[Figure 12-3: MOSS Architecture]

As things are in Illustration 1, there are many ad-hoc information channels, and parties get notified second hand. Illustration 2 represents the MOSS approach: a publish-and-subscribe architecture whereby common information is shared and known by all parties, and milestone events trigger messages sent to the appropriate parties.

The ebusiness Specifications in the MOSS Scenario to be tested are summarized in Table 12-1.

Table 12-1: ebusiness Specifications in the MOSS Scenario

  Business Process:
    Business process documentation: sequence diagram, process descriptions.
    Message choreography: n/a.
  Business Documents:
    Business documents: 22 EDI messages (XML messages planned).
    Attribute values: n/a.
  Transport and Communication Protocols:
    EDIFACT, ANSI, WS-I.
  Profiles:
    Partner-specific message profiles (e.g. varying definitions for the values of a subset of codes, mandatory use of optional fields, ...).

12.2 Testing Requirements: What and How to Test?

Verification Scope (What to Test?)

Long distance supply chains, or trade lanes, are typically implemented by one customer and one or more of their suppliers, including their intermediate actors. Each trade lane conducts a current state analysis and a future state trade lane design. Figure 12-4 is an example of a high level trade lane design. Each trade lane design should include:

- All or a portion of the MOSS recommendations
- Message variations to the MOSS recommendations
- Relationships between the actors, events, and messages received and created, and the associated workflow
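The following Python sketch is illustrative only: it shows one way such a trade lane design could be captured as a data structure for later test generation. All class and field names are assumptions of this sketch, not part of the MOSS recommendations or the GITB specifications.

    from dataclasses import dataclass, field

    @dataclass
    class TradeLaneStep:
        event: str                   # triggering event, e.g. "Goods ready for shipping"
        requesting_actor: str        # e.g. "Supplier"
        responding_actor: str        # e.g. "Freight Forwarder"
        message: str                 # e.g. "DESADV"
        variations: list = field(default_factory=list)   # deviations from MOSS

    @dataclass
    class TradeLaneDesign:
        name: str                                         # trade lane identifier
        steps: list = field(default_factory=list)         # TradeLaneSteps in workflow order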

[Figure 12-4: High Level Trade Lane Design Example]

Upon completion of a future state trade lane design, participants will choose the recommended MOSS guidelines and enter them into GITB. The guidelines include:

- Actors, events and workflow
- Message schemas, including required variations
- Data relationships between messages, to achieve 92% reuse

The goal of GITB is to verify message conformance and content against a trade lane design. This is accomplished through initial conformance testing and trade lane simulation. During initial conformance testing, actors and their solution providers will verify message instance conformance to the required schemas for their associated events. Testing can be done with one or more messages and will typically be done by all trade lane participants independently. During a trade lane simulation, or interoperability testing, actors and their solution providers will provide message instances from the beginning of the trade lane to the end. Message schema, workflow and data relationships are verified. The trade lane simulation is done dependently, in sequence, according to the trade lane design and may be monitored by a third party.

To further illustrate the activities contained in a trade lane design, several transaction diagram examples follow. Each transaction diagram shows the message(s) exchanged between the actors

and the triggering events to show when a transaction is initiated. The transaction diagrams are not all of those needed for a trade lane design, but are intended to show an example of the workflow of the events.

Figure 12-5 is the Prepare Purchase Order transaction diagram. A purchase order is issued from the customer to the supplier to establish master order information, such as shipping instructions. It is updated regularly, perhaps once a quarter. It is important to note that the information contained in the purchase order is used throughout the trade lane to verify information contained in other messages.

[Figure 12-5: Prepare Purchase Order]

Figure 12-6 is the Prepare Forecast/Schedule transaction diagram. Note that this activity is depicted in the trade lane design of Figure 12-4. This is an operational activity that communicates the goods, quantity and schedule for delivery. A forecast (DELFOR) is sent every week and contains a planning horizon of up to 3 months. A schedule (DELJIT) is sent starting 1 month before the expected delivery and contains a daily delivery schedule.

[Figure 12-6: Prepare Forecast/Schedule]

Figure 12-7 is the Prepare Invoice transaction diagram. The invoice is a supplier's request to be paid for goods or services supplied under conditions agreed upon by the supplier and customer. It is a signal from the supplier that the goods are prepared for shipping, which is the trigger.

[Figure 12-7: Prepare Invoice]

Figure 12-8 is the Prepare Booking Request transaction diagram. This contains two activities: the request for booking the container on the vessel (the IFTMBF message), and confirmation of that request (the IFTMBC message). The triggering event is a Transportation Request by the supplier.

[Figure 12-8: Prepare Booking Request]

Figure 12-9 is the Prepare Shipping Instructions transaction diagram. The IFTMIN message conveys the shipping instructions that enable the ocean carrier to produce the bill of lading. The triggering event is that the ocean carrier has acknowledged the booking request with a booking confirmation.

[Figure 12-9: Prepare Shipping Instructions]

Figure 12-10 is the Prepare Advanced Ship Notice transaction diagram. The advanced ship notice is a widely employed means for a shipper to communicate to other stakeholders that a shipment has been made. The triggering event is that the goods are loaded onto the transport and the ocean carrier has acknowledged the booking request with a booking confirmation.

[Figure 12-10: Prepare Advanced Ship Notice]

A primary success criterion for MOSS is that data content is reused across messages. The 92% reuse metric cited in the introduction was achieved by not rekeying information for messages as the information flowed through the trade lane. Table 12-2 shows how the data content of the Advanced Ship Notice (DESADV message) was derived from the content of earlier messages. The 1st column shows the DESADV data element and the 2nd column shows the messages where that content originated and must be identical. A sketch of how such a cross-message check could be automated follows the table.

Table 12-2: Advanced Ship Notice Data Verification Example (Partial)

  Data Element                   Origin
  Ship from Party                DELFOR/DELJIT
  Ship to Party                  DELFOR/DELJIT
  Shipment ID Number             INVOIC
  Buyer Party                    ORDERS
  Seller Party                   ORDERS
  Order Number                   ORDERS
  Pack List Reference Number     INVOIC
  Container Stuffing Location    Transportation Req.
  Bill of Lading Number          IFTMBC
  Main Carriage Vessel Name      IFTMBF
  Dispatched Quantity            Introduced here
  Actual Shipped DTM             Introduced here
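As an illustration only, the following Python sketch automates the check that Table 12-2 describes: each DESADV data element must match the value in the message where it originated. It assumes messages have already been parsed into flat dictionaries keyed by data element name; the origin-message keys (e.g. "TRANSREQ" for the Transportation Request) are naming assumptions of this sketch.

    # Origin map taken from Table 12-2 (elements "introduced here" need no check).
    ASN_ORIGINS = {
        "Ship from Party": ["DELFOR", "DELJIT"],
        "Ship to Party": ["DELFOR", "DELJIT"],
        "Shipment ID Number": ["INVOIC"],
        "Buyer Party": ["ORDERS"],
        "Seller Party": ["ORDERS"],
        "Order Number": ["ORDERS"],
        "Pack List Reference Number": ["INVOIC"],
        "Container Stuffing Location": ["TRANSREQ"],
        "Bill of Lading Number": ["IFTMBC"],
        "Main Carriage Vessel Name": ["IFTMBF"],
    }

    def verify_asn_reuse(desadv: dict, earlier: dict) -> list:
        """Return a failure report for DESADV fields that differ from their origin."""
        failures = []
        for element, origins in ASN_ORIGINS.items():
            expected = {earlier[m][element]
                        for m in origins if element in earlier.get(m, {})}
            if expected and desadv.get(element) not in expected:
                failures.append("%s: DESADV value %r does not match origin %s"
                                % (element, desadv.get(element), "/".join(origins)))
        return failures

For example, verify_asn_reuse(desadv, {"ORDERS": {...}, "INVOIC": {...}}) would flag a DESADV whose Order Number differs from the one carried in the ORDERS message.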

Testing Environment (How to Test?)

The testing environment for a MOSS solution has the following operational requirements:

- Testing Context (i.e. the moment of testing): Testing typically occurs during three different time periods:
  o Future State Design: While the trade lane documents are being created, conformance tests on the documents will be done by the schema designers to ensure that the resultant design is complete and consistent.
  o Conformance Testing: After the trade lane design is complete, all trade lane participants will independently and asynchronously upload documents from their test systems to verify schema structural integrity.
  o Trade Lane Simulation: Once all trade lane participants have completed conformance testing, an actual simulation of the trade lane is run from the beginning of the trade lane to the end. This is done synchronously and may be controlled by a third party.
- Testing Integration: The testing for a MOSS solution, no matter in which of the above described contexts, will occur in a non-production test system. Additionally, it may be difficult to perform tests directly against the non-production system itself. To send and retrieve messages, proxy addresses may be needed that are outside the non-production system's firewall or in a DMZ.
- Testing Location: In a MOSS solution, or any North American automotive solution for that matter, remote testing will typically be the preferred means. Testing participants will access a remote server to satisfy the testing requirements, as they generally prefer not to download and manage test software.
- Testing Topology: The preferred style of testing is mediation via a testing hub. As mentioned previously, access to the test systems would be done through a proxy; no direct communication between systems would be allowed. Given that the testing would additionally be done remotely, this leads to a testing hub mediation topology. A minimal sketch of such a mediating hub follows.
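As an illustration of the testing hub mediation topology, the sketch below relays messages between parties while recording them for later validation. It assumes plain HTTP transport and hypothetical endpoint URLs; a real MOSS hub would use the configured EDI messaging protocols and proper security.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib import request, error

    # Hypothetical routing table: hub path -> SUT proxy address (outside the firewall / DMZ).
    SUT_ENDPOINTS = {"/supplier": "http://sut-supplier.example/inbox"}
    captured = []  # every relayed message is recorded for later validation

    class HubHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            captured.append((self.path, body))        # record for the test bed
            target = SUT_ENDPOINTS.get(self.path)
            if target:                                # forward to the real SUT
                try:
                    request.urlopen(request.Request(target, data=body, method="POST"))
                except error.URLError:
                    pass                              # in a sketch, ignore delivery errors
            self.send_response(202)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), HubHandler).serve_forever()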

12.3 Testing Scenarios: How to Use GITB?

Scenario 1: Creating, Deploying and Executing a Validating Test Suite for Conformance to a MOSS Message

As introduced earlier in this section, the goal of GITB is to verify message(s) and contents against a trade lane design. This section describes how an actor, or SUT Operator, in a trade lane verifies the messages that they create. This activity is done asynchronously and independently by all the actors participating in a trade lane design. The typical interaction between the trade lane actor and GITB will be that the actor uploads a sample instance message to the GITB Test Bed and the validation results are then presented to the actor via a web browser.

Design Phase

- Step 1: Identify Test Scope:
The test designer should define a target for testing according to the MOSS business scenarios, as shown in Table 12-3. Testing does not directly involve an SUT, but rather test artifacts produced by the SUT.

Table 12-3: Sample Business Scenarios and Target Messages

  Business Scenario                                  SUT                       Target
  Customer sends ORDER to Supplier                   Customer system           ORDER message
  Customer sends DELFOR to Supplier                  Customer system           DELFOR message
  Customer sends DELJIT to Supplier                  Customer system           DELJIT message
  Supplier sends TRANSREQUEST to Freight Forwarder   Supplier system           TRANSREQUEST message
  Supplier sends INVOIC to Customs Broker            Supplier system           INVOIC message
  Freight Forwarder sends IFTMBF to Ocean Carrier    Freight Forwarder system  IFTMBF message
  Ocean Carrier sends IFTMBC to Freight Forwarder    Ocean Carrier system      IFTMBC message
  Freight Forwarder sends IFTMIN to Ocean Carrier    Freight Forwarder system  IFTMIN message
  Supplier sends DESADV to Freight Forwarder         Supplier system           DESADV message

- Step 2: Search for existing Test Cases for the Message Types involved:
The test designer queries the Test Artifacts repository with the message name, actor names and/or message type name. In return, the test designer discovers a test case and its associated test assertions.

- Step 3: Extension of Test Cases for a new Profile:

The test designer may realize that the discovered test case does not reflect the specific requirements of the MOSS use case. For this purpose, the test designer implements a new rule script file (e.g. Schematron, JESS or XPath) containing the specific rules for the new test assertions. The test designer then updates the discovered test case for reuse.

- Step 4: Packaging of all Profile Test Cases into a new Validating Test Suite:
The test designer assembles all the Test Cases into a "validating" Test Suite, which will be invoked when documents need validating.

- Step 5: Deployment of the new Validating Test Suite on a Test Agent:
The test designer deploys the Test Suite on a Test Agent.

- Step 6: Registration (archiving) of the new Validating Test Suite:
After some preliminary quality assurance, the test designer registers the new test suite in the Test Repository, so that it can be archived, discovered later, and re-deployed if necessary.

- Step 7: Early Users test the new Validating Test Suite:
An SUT Operator (e.g. a solution vendor providing IT services to a participant in MOSS) connects an SUT to the Test Agent via a web browser or tool kit, selects the test suite and then uploads his/her MOSS-related sample document for testing.

- Step 8: Early Users file some Issues about the new Validating Test Suite:
The operator may discover some defects in the Test Case itself. The operator then files the issues so that the Test Case can be updated.

Execution Phase

- Step 1: Implement User System:
The test user should implement their system according to their role in the MOSS business scenario. For example, the actor Customer should implement a system that can generate ORDER, DELFOR and DELJIT messages based on the MOSS specification.

- Step 2: Prepare Test:
In the MOSS case, AIAG or another organization will provide the Test Bed. The test user connects their system to the Test Bed according to the Test Bed guidelines and downloads a manual for test preparation and operation. The test user should prepare all the messages needed to pass all the required test cases.

- Step 3: Execute Test:
In this testing scenario, a test artifact produced by the SUT is the target for testing. The test user can execute a verification process for the messages generated by their SUT.

- Step 4: Inspect Test Result:
In the MOSS case, the test user wants test results to be immediately available, so the Test Bed needs to show the test results immediately through the web browser. If the test fails, the test user can detect the source of failure by using the result analyzer in the Test Bed.

- Step 5: Update the System:
In case of failure, the test user updates the SUT based on the reasons for failure.

- Step 6: Re-Test:
The test user can test the updated SUT again. Steps 5 and 6 should be repeated until all the test cases pass.

Scenario 2: Creating, Deploying and Executing an Interacting Test Suite for Conformance to a MOSS Message

This section describes how an actor, or SUT Operator, in a trade lane verifies the messages that they create. This activity is done asynchronously and independently by all the actors participating in a trade lane design. The difference in Scenario 2 is that instead of interacting through a web browser, the SUT Operator will specify physical endpoints to send and receive messages; the SUT and GITB interact directly. These endpoints are typically outside an SUT Operator's firewall, so proxy addresses may be needed outside the test system's firewall. Another difference is that, as the test system posts the message to a physical endpoint, the message envelope needs to be included and verified by the GITB.

Design Phase

- Step 1: Basic Test Case Design:
The test designer should construct test cases for the interactions of the MOSS use case, as shown in Table 12-4. The Test Suite Driver plays the role of the message receiver and verifies the received messages. It will reuse the validation capability of the Test Agent of Scenario 1 when its interacting test suite is executed. For example, in interaction I1, the Test Suite Driver plays the role of the Supplier and verifies the ORDERS message sent by the Customer.

Table 12-4: Test Cases for Interactions

  Interaction  Requesting Actor    Message        Responding Actor
  I1           Customer            ORDERS         Test Bed
  I2           Customer            DELFOR         Test Bed
  I3           Customer            DELJIT         Test Bed
  I4           Supplier            TRANSREQUEST   Test Bed
  I5           Supplier            INVOIC         Test Bed
  I6           Freight Forwarder   IFTMBF         Test Bed
  I7           Ocean Carrier       IFTMBC         Test Bed
  I8           Freight Forwarder   IFTMIN         Test Bed
  I9           Supplier            DESADV         Test Bed

- Step 2: Development of the Interacting Test Suite:
The test designer chooses a test case template (for example, a test case with an interaction-based trigger) and edits the template in order to use the Test Agent of Scenario 1 for message validation. Furthermore, the test designer assigns an EDI message binding to the test case. The test designer then assembles an interacting test suite and archives it in the Test Repository.

- Step 3: Deployment of the new Interacting Test Suite on a Test Suite Driver:
The test designer deploys the Test Suite on a Test Suite Driver using the Test Deployment Manager.

- Step 4: Scheduling and Preparing for a Test Run:
The Test Manager informs the SUT Operator of the availability of conformance testing for the MOSS Specification.

Execution Phase

- Step 1: Prepare Test:
In order to test the interaction between the test user and the Test Bed, the test user needs the endpoint information of the Test Bed. The test user configures the endpoint information in the messaging engine and then checks the status of the bindings between the message generation system and the messaging engine.

- Step 2: Execute Test:
The test user sends its "MOSS Document" message to the Test Bed through the EDI protocol. The Test Bed receives the message and stores it in temporary storage. The test user executes the verification process via the web browser; the Test Bed validates the message and generates the test results. A minimal sketch of this submit-and-poll interaction is shown after these steps.

- Step 3: Inspect Test Results:
After verification, the Test Bed shows the test results immediately through the web browser. If the test fails, the test user can detect the reason by using the result analyzer in the Test Bed.

- Step 4: Update the System:
In case of failure, the test user updates the SUT based on the reasons for failure.

- Step 5: Re-Test:
The test user can test the updated SUT again. Steps 4 and 5 should be repeated until all the test cases pass.
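The following sketch illustrates this execution phase under simplifying assumptions: it submits a message over plain HTTP and polls for the report, whereas a real SUT would use the configured EDI messaging protocol and the endpoints announced by the Test Bed. The URLs and the run-identifier convention are hypothetical.

    import time
    import urllib.request

    TEST_BED_INBOX = "http://testbed.example/moss/inbox"          # hypothetical
    TEST_BED_REPORT = "http://testbed.example/moss/report/%s"     # hypothetical

    def submit_and_poll(edifact_payload: bytes) -> str:
        """Send one MOSS message to the Test Bed and wait for its test report."""
        req = urllib.request.Request(
            TEST_BED_INBOX, data=edifact_payload,
            headers={"Content-Type": "application/EDIFACT"}, method="POST")
        with urllib.request.urlopen(req) as resp:
            run_id = resp.read().decode().strip()   # assume the Test Bed returns a run id
        while True:                                 # poll until the report is available
            with urllib.request.urlopen(TEST_BED_REPORT % run_id) as resp:
                report = resp.read().decode()
            if report:
                return report
            time.sleep(5)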

Scenario 3: Creating, Deploying and Operating an SUT Interacting Test Suite for Interoperability between several SUTs based on a MOSS Trade Lane Design

This section describes how an entire trade lane design will be verified. The difference in Scenario 3 is that multiple systems are involved and the GITB is the proxy. Another difference is that the workflow implied by the trade lane is executed: the interaction consists of the synchronous events that occur from the beginning of the trade lane to the end.

Design Phase

- Step 1: Basic Test Case Design:
The test designer should consider the whole trade lane simulation for the MOSS use case and construct test cases for its transactions and interactions, as shown in Table 12-5.

Table 12-5: Test Cases for Transactions and Interactions

  Transaction  Interaction  Requesting Actor    Message        Responding Actor
  T1           I1           Customer            ORDERS         Supplier
  T2           I2           Customer            DELFOR         Supplier
  T3           I3           Customer            DELJIT         Supplier
  T4           I4           Supplier            TRANSREQUEST   Freight Forwarder
  T5           I5           Supplier            INVOIC         Customs Broker
  T6           I6           Freight Forwarder   IFTMBF         Ocean Carrier
               I7           Ocean Carrier       IFTMBC         Freight Forwarder
  T7           I8           Freight Forwarder   IFTMIN         Ocean Carrier
  T8           I9           Supplier            DESADV         Freight Forwarder

The test designer should also construct the relations between messages for the MOSS use case, as shown in Table 12-6. A minimal sketch of how the choreography of these transactions could be checked follows the table.

Table 12-6: Message Relations

  Interaction  Message relation
  I2           Purchase Order (ORDERS) information must be verified on the Forecast (DELFOR)
  I3           Purchase Order (ORDERS) information must be verified on the Schedule (DELJIT)
  I4           DELFOR/DELJIT information must be verified on the Transportation Request
  I5           Purchase Order (ORDERS), Delivery (DELFOR/DELJIT), and Transportation Request information must be verified on the Invoice (INVOIC)
  I6           Purchase Order (ORDERS), Delivery (DELFOR/DELJIT), and Transportation Request information must be verified on the Booking Request (IFTMBF)
  I7           Purchase Order (ORDERS), Delivery (DELFOR/DELJIT), Transportation Request, and Booking Request (IFTMBF) information must be verified on the Booking Confirmation (IFTMBC)
  I8           Purchase Order (ORDERS), Delivery (DELFOR/DELJIT), Transportation Request, Invoice (INVOIC), and Booking Request (IFTMBF) information must be verified on the Shipping Instructions (IFTMIN)
  I9           Purchase Order (ORDERS), Delivery (DELFOR/DELJIT), Transportation Request, Invoice (INVOIC), Booking Request (IFTMBF), and Booking Confirmation (IFTMBC) information must be verified on the Advanced Ship Notice (DESADV)
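As an illustration only, the sketch below checks that the sequence of messages observed during a trade lane simulation follows the order of Table 12-5 and that no message is missing. It assumes the interaction log has been reduced to a list of message names.

    # Expected trade lane order, from Table 12-5 (interactions I1..I9).
    EXPECTED_ORDER = ["ORDERS", "DELFOR", "DELJIT", "TRANSREQUEST",
                      "INVOIC", "IFTMBF", "IFTMBC", "IFTMIN", "DESADV"]

    def check_choreography(observed: list) -> list:
        """Return deviations of the observed message sequence from the design."""
        errors = []
        last_index = -1
        for msg in observed:
            if msg not in EXPECTED_ORDER:
                errors.append("unexpected message %s" % msg)
                continue
            index = EXPECTED_ORDER.index(msg)
            if index < last_index:
                errors.append("%s arrived out of order" % msg)
            last_index = max(last_index, index)
        missing = [m for m in EXPECTED_ORDER if m not in observed]
        if missing:
            errors.append("missing messages: %s" % ", ".join(missing))
        return errors

The cross-message relations of Table 12-6 can be checked in the same spirit as the data reuse sketch given for Table 12-2.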

- Step 2: Development of the Interacting Test Suite:
As in Scenario 2, an EDI message binding is assigned to the test case, which is then assembled into an Interacting Test Suite and archived in the Test Repository.

- Step 3: Deployment of the new Interacting Test Suite on a Test Suite Driver:
The test designer deploys the Test Suite on a Test Suite Driver using the Test Deployment Manager.

- Step 4: Scheduling and Preparing for a Test Run:
The Test Manager informs the SUT Operators of the availability of interoperability testing for the MOSS Specification.

Execution Phase

- Step 1: Prepare Test:
In order to test the interactions between one test user (SUT1) and another test user (SUT2), the test user needs the endpoint information of SUT2. The test user configures the endpoint information of SUT2 in the messaging engine and then checks the status of the bindings between the message generation system and the messaging engine.

- Step 2: Execute Test:
The test user sends its "MOSS Document" message to SUT2 through the EDI protocol. SUT2 receives and processes the message. The message exchanged between the two SUTs is reported to the Test Bed, and the test user executes the verification process via the web browser. The Test Bed validates the message and analyzes the behavior of the two SUTs by using the interaction logs. Finally, the Test Bed generates the test results.

- Step 3: Inspect Test Result:
The Test Bed shows the test results immediately through the web browser. If the test fails, the test user can detect the reason by using the result analyzer in the Test Bed. By doing so, the test user can find which SUT caused the failure.

- Step 4: Update the System:

In case of failure, the test user updates the SUT based on the reasons for failure.

- Step 5: Re-Test:
The test user can test the updated SUT again. Steps 4 and 5 should be repeated until all the test cases pass.

13 Use Case 2: Health Level Seven (HL7) v3 Scenarios

In this section, business and testing requirements for the HL7 use case will be defined:

1. ebusiness Scenarios: This section takes the business user's point of view.
2. Testing Requirements: What and how to test? This section takes the perspective of the people testing the business scenarios, typically business users responsible for implementation or the integrators or software vendors working with the business users.
3. Testing Solutions: How to use and apply the GITB Testing Framework? This section takes the perspective of the software architect responsible for utilizing the GITB framework to set up the appropriate Test Services and Test Artifacts to support the Testing Scenarios.

13.1 ebusiness Scenarios: HL7 Storyboards and IHE Interoperability Profiles

This section describes the common processes used in the eHealth domain. Both of the initiatives Integrating the Healthcare Enterprise (IHE) and Health Level Seven (HL7) define business processes in their corresponding specifications: IHE calls them Interoperability Profiles and HL7 calls its specifications storyboards. Both reflect processes run in real life, and both define the actors, messages and protocols used in these processes. In this version of the document, the testing requirements and GITB usage scenarios of the HL7 domain are described. Furthermore, an IHE usage scenario is based on the assumption that there are multiple GITB-compliant Test Beds collaborating to test IHE's Scheduled Workflow profile. In subsequent versions, the necessary IHE testing requirements and further usage scenarios will be included.

Conformance to the HL7 v3 standard is based on the Application Role concept. In HL7, an application role is an abstraction that defines a portion of the messaging behavior of an information system. The Application Role concept should not, however, be interpreted as the specification of the complete behavior of a specific healthcare application; it describes only the messaging interface of the application for a specific messaging behavior. A healthcare information system may claim conformance to several or many application roles.

The structure and potential behavior of messages from a sending application role to a receiving application role are defined by Interactions. The Message Type is simply the definition of the structure and semantics (the meaning of each message element) of the information transfer for a specific messaging. HL7 has a formal methodology for developing message types starting from its Reference Information Model (HL7 RIM) by using the HL7 Vocabulary and HL7 Data Types. However, in the scope of conformance and interoperability testing, we can concentrate on the final product: the Message Type (or Hierarchical Message Description) definitions, which define a unique set of

constraints for the message payload of an interaction. As noted previously, in Turkey the message payload contains CDA documents.

Trigger Event is another crucial concept in HL7 for conformance and interoperability. It is the explicit set of conditions that initiate the transfer of information between application roles. HL7 defines one or more trigger events for each HL7 interaction. In terms of conformance, the application roles are expected to initiate the corresponding interaction when a trigger event occurs.

Furthermore, HL7 defines Storyboards, which describe real-life scenarios among two or more application roles. The storyboards are an informative part of the standard and are described by a sequence diagram and narrative text. Although they are not related to conformance, they illustrate real-life scenarios and the choreography among application roles.

Figure 13-1 illustrates a storyboard defined in the HL7 v3 standard usage in Turkey. Basically, there are five interactions. First, the NHIS Client (Hospital Information System) sends a notification message containing information about a clinical event such as an examination. On the NHIS side, structural validation is performed first (XSD control) and, if successful, the message is inserted into a message queue for further verification (business rule Schematron) and processing. In return, an application response on this validation, containing a ticket number, is returned to the user. After a while, the NHIS client queries the NHIS with a query interaction about the status of the initial message, until it gets a success response. If there is still a failure in the message, the NHIS client corrects its errors and resends the message through the notification interaction. Furthermore, the NHIS clients are allowed to update or delete a previous record with update or delete interactions.

[Figure 13-1: The Storyboard used in Turkey. The figure shows the NHIS Client (application role MCCI_AR000002TR01) and the NHIS (MCCI_AR000001TR01) exchanging the interactions Notification MCCI_IN000001TR01, Update MCCI_IN000003TR01 and Delete MCCI_IN000004TR01 (each answered by Application Response MCCI_IN000002TR01), and Query QUQI_IN000001TR01 (answered by Application Response QUQI_IN000002TR01).]

In summary, a system that wants to make a conformance claim to the HL7 v3 standard should first identify the application roles to which it claims conformance. For these application roles, the HL7 v3 specification directly states:

- The trigger events the system shall recognize,

- The messages that the system shall send in response to trigger events or other messages,
- The data content of these messages.

The specification also states the messages that a system conforming to the application role shall receive and process. A system with such a conformance claim should be tested for each application role specified in the claim. This requires the grouping of testing requirements and test cases per application role. For each application role, there will be many interactions where the role is either on the receiver or the sender side. Therefore, a conformance test case can be designed for each interaction that the application role should support. However, there will also be complex cases where interactions are related (state transitions); for those situations there will be further testing requirements and test cases.

The ebusiness Specifications in the HL7 Scenario to be tested are summarized in Table 13-1.

Table 13-1: ebusiness Specifications in the HL7 Scenario

  Business Process:
    Business process documentation and message choreography: Storyboard (interactions between application roles, text and sequence diagram).
  Business Documents:
    Business documents: HL7 XML messages based on the Reference Information Model (HL7 RIM).
    Attribute values: HL7 Vocabulary and HL7 Data Types, external vocabularies with code lists (e.g. LOINC, SNOMED).
  Transport and Communication (Messaging) Protocols:
    ebMS, WS-I, or MLLP.
  Profiles:
    HL7 v3 interoperability profiles (syntactic and semantic restrictions on the formats and payloads).

13.2 Testing Requirements: What and How to Test?

Verification Scope (What to Test?)

Considering the HL7 v3 usage in Turkey, the following issues ("verification scope") should be tested for NHIS clients claiming conformance to the HL7 v3 specifications:

- The messages exchanged between NHIS clients and the NHIS are SOAP Web Services messages having security credentials such as username and password in their header. This information is carried in terms of the WS-Security Username Token Profile. Therefore, the first requirement is to test whether the systems can insert their username and password into the SOAP header correctly. The messaging adapters can be used for this purpose: they receive the SOAP message, check whether it contains the header correctly, and extract the HL7 message and then the CDA document within the message.
- As mentioned in the previous section, there are five interactions: notification, update, delete, query and application response. Furthermore, there may be 25 types of CDA documents,

which are customized to specific cases (e.g. examination, diabetes, pregnancy follow-up, vaccine, etc.), to be inserted into the notification message. In Turkey, these CDA documents do not completely conform to the official HL7 v3 CDA R2 specification; however, they are compatible with it. This means there are 25 different document schemas, one for each specific case. In this respect, when a CDA document is extracted from within a notification message, it should first be validated against the corresponding schema. Document validators can be used for this purpose: they can check the CDA documents against the corresponding XSD schemas.
- The Ministry of Health (MoH) has developed a Health Coding Reference Server (HCRS) that maintains all of the healthcare-related codes used in Turkey, and the CDA documents exchanged between NHIS clients and the NHIS have to contain these codes. Therefore, the CDA documents should be checked to ensure that they contain these codes correctly. For this purpose, document validators that can connect to the HCRS and check the values can be used.
- Apart from the structure, which is defined by the CDA schemas, the Ministry of Health (MoH) has defined business rules that express the relations among the elements of the CDA schemas. For example, a pregnancy follow-up message cannot have a male patient. These business rules should also be checked for the received CDA documents. For this purpose, document validators can check the CDA documents against the corresponding Schematron files describing these business rules; a minimal Schematron sketch is given at the end of this section.

Testing Environment (How to Test?)

In the eHealth domain, testing is usually carried out after a standard is published officially; it is not probable to test a system against a specification/standard which is still under development. The trigger to test systems generally comes from governments. In other words, the governments mandate the eHealth system vendors to conform to a specific standard, and the vendors are requested to show their conformance. At this point, the software vendors apply to the corresponding testing organizations to prove their conformance. In this domain, the tested systems are test systems under development. Through testing, the implementers also identify deficiencies or bugs and improve their systems. After testing, the implementers port their test system to the real production environment.

Considering how to test the NHIS clients, there are three possible mechanisms:

- On-site testing: In this case, the tests are realized in the local setting of the NHIS clients. The SUT Operators (Test Participants) do not interact with the GITB testing system. An independent test manager executes the test cases and the SUT Operators are only informed about the results. For this case, the necessary test scheduling should be performed beforehand.
- Web-based testing: In this case, the SUT Operators can test their systems whenever and wherever they want. In other words, the SUT Operators play the Test Manager role. The SUT Operators connect to the GITB system, execute the test cases and are shown the test results.
- Testing workshops: A testing workshop can be organized where SUT Operators from different software vendors participate and test their systems. Either the SUT Operators or a number of independent experts can play the Test Manager role.

Regarding the testing topology, direct connection between participants is the most usual case.
In other words, test participants connect their SUTs as they would in a real business environment.
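As an illustration of such a Schematron business rule, the sketch below encodes the "a pregnancy follow-up message cannot have a male patient" rule and evaluates it with lxml. The simplified document structure and element names are assumptions of this sketch and do not reproduce the actual Turkish CDA schemas.

    from lxml import etree
    from lxml.isoschematron import Schematron

    # Simplified business rule in ISO Schematron (illustrative element names).
    SCH = b"""<schema xmlns="http://purl.oclc.org/dsdl/schematron">
      <pattern>
        <rule context="document">
          <assert test="not(docType = 'PregnancyFollowUp') or patient/@gender = 'F'">
            A pregnancy follow-up document must not have a male patient.
          </assert>
        </rule>
      </pattern>
    </schema>"""

    # A sample document that violates the rule.
    DOC = b"""<document>
      <docType>PregnancyFollowUp</docType>
      <patient gender="M"/>
    </document>"""

    schematron = Schematron(etree.fromstring(SCH), store_report=True)
    if not schematron.validate(etree.fromstring(DOC)):
        print("Business rule violated:")
        print(schematron.validation_report)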

13.3 Testing Scenarios: How to Use GITB?

This section presents the usage scenarios in the eHealth domain, especially the Health Level Seven (HL7) v3 usage in Turkey. In Turkey, every hospital reports each clinical activity it performs to the Ministry of Health's National Health Information System (NHIS). This communication is based on the HL7 v3 standard, where the messages contain Clinical Document Architecture Release 2 (CDA) based documents, and Web Services are used as the transport protocol. This section describes how the GITB system can be used to test the HL7 v3 usage in Turkey. As mentioned earlier, there are three types of testing envisaged in the GITB Framework: (1) document validation through a Web site, (2) conformance testing with an interacting test suite, and (3) interoperability testing with an interacting test suite. In the following subsections, these are tailored to the HL7 v3 usage scenario.

Scenario 1: Creating, Deploying and Executing a Validating Test Suite for Conformance to the Turkish HL7 v3 Profile

- Identify Test Scope: Application Roles and Interactions involved:
Assume that a hospital in Turkey sends an HL7 Clinical Document Architecture (CDA) document about a physical examination of a patient to the Ministry of Health National Health Information System servers. In return, the MoH servers send an acknowledgement. From the HL7 perspective, the interaction name is "Notification: MCCI_IN000001TR01", the interaction name for the acknowledgement is "Application Response: MCCI_IN000002TR01", the name of the sending role is "National Health Information System Client: MCCI_AR000002TR01" and the name of the receiving role is "National Health Information System: MCCI_AR000001TR01". The message type is "Clinical Document Architecture for Examination: POCD_MT000005TR". In this Scenario #1, the sending hospital uploads its CDA document for testing.

- Search for existing Test Cases for the Message Types involved:
The test designer queries the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository with the above mentioned keywords for the interaction name, role names and/or message type name. In return, the test designer discovers a test case.

- Extension of Test Cases for the new Profile:
However, the test designer realizes that the discovered test case does not reflect the specific requirements for Turkey. For this purpose, the test designer implements a new Schematron file for the specific rules in Turkey. These rules put further constraints on the "Clinical Document Architecture for Examination" message. After that, the test designer updates the discovered test case to also apply the newly developed Schematron file, using the Test Suite/Case Editor.

- Packaging of all Profile Test Cases into a new Validating Test Suite:
The test designer assembles all the Test Cases into a "validating" Test Suite, which will be invoked for each document to validate.

- Deployment of the new Validating Test Suite on a Test Agent:
The test designer deploys the Test Suite on a Test Agent using the Test Deployment Manager.

- Registration (archiving) of the new Validating Test Suite:
After some preliminary QA, the test designer registers this new test suite in the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository, so that it can be archived, discovered later, and re-deployed if necessary.
- Early Users test the new Validating Test Suite:
In these test cases, a Web-based GUI (Test Monitoring and Management GUI) is displayed to the user. Through this GUI, the user uploads his "Clinical Document Architecture for Examination" sample documents one by one and selects the document type against which the validation will be realized. After that, the structural and semantic validation is performed by the corresponding Document Validator and the evaluation result is displayed to the user. The

evaluation result is also stored in the local Test Artifact Persistence Manager (TAPM) by the Test Suite Engine.

- Early Users file some Issues about the new Validating Test Suite:
In return, the SUT Operator receives some error messages concerning his/her document. The operator may also discover that the Test Case itself was not correct in flagging some errors, and that the Test Case needs to be corrected. The operator files an issue about the need to update the Test Case.

Scenario 2: Creating, Deploying and Executing an SUT Interacting Test Suite for Conformance to the Turkish HL7 v3 Profile

- Basic Test Case Design:
Assume the same HL7 scenario described in Scenario #1. In this case, the SUT Operator will send an actual message to the Test Suite Driver. Please note that the Test Suite Driver of the previous Scenario #1 becomes a Test Agent in this Scenario #2.

- Development of the Interacting Test Suite:
The test designer chooses a test case template (assume a test case with an interaction-based trigger) and edits the template to use the Test Agent of Scenario #1 for message validation. Furthermore, the test designer assigns a Web Service message binding to the test case. Then, s/he assembles an interacting test suite and archives it in the Test Artifact Persistence Manager (TAPM).

- Deployment of the new Interacting Test Suite on a Test Suite Driver:
The test designer deploys the Test Suite on a Test Suite Driver using the Test Deployment Manager.

- Scheduling and Preparing for a Test Run:
The Test Manager informs the SUT Operator about the availability of the conformance testing.

- Doing an Interaction Test Run:
The SUT Operator (or Test Manager) logs in to the Testing Framework through the Test Monitoring and Management GUI and selects the test suite/case. It should be noted that this test suite/case is still in template form and should be configured to communicate with the HIS. Therefore, the SUT Operator configures/instantiates the test suite/case by using the Test Suite/Case Instantiator tool and stores the resulting test suite/case instance in the local Test Artifact Persistence Manager (TAPM). After that, either the Test Manager or the SUT Operator executes the test suite/case instance. According to the test case, and as instructed in the Test Monitoring and Management GUI, the SUT Operator sends its "Clinical Document Architecture for Examination" message through the Web Service protocol. The Interacting Test Suite gets the message using the appropriate Messaging Adapter, extracts the CDA document in it and validates the document with the Test Agent of Scenario #1 using its Test Agent Interface/API. The Interacting Test Suite then gets the validation test report from the Test Agent in the Test Report format. After that, the Interacting Test Suite sends an acknowledgement using the "Application Response: MCCI_IN000002TR01" interaction over the Web Service protocol using the related Messaging Adapter. Meanwhile, the Test Report is stored in the local Test Artifact Persistence Manager (TAPM) by the Test Suite Engine.

- Logging, archiving and download of the Test Report:
After that, the validation report from the Test Agent of Scenario #1 and the other validation logs/reports concerning message-level verification are displayed to the SUT Operator.

Scenario 3: Creating, Deploying and Executing an SUT Interacting Test Suite for Interoperability between several SUTs based on the Turkish HL7 Profile

- Basic Test Case Design:
Assume the same scenario.
In this case, there are two SUT Operators: the SUT Operator of the hospital information system (HIS) and the SUT Operator of the National Health Information System of the MoH. The SUT Operator of the HIS sends the "Clinical Document Architecture for Examination" message through Web Services to the Test Suite Driver. The Test Suite Driver performs the validation with the Test Agent of Scenario #1 and forwards

the message to the MoH servers. After that, the SUT Operator of the MoH sends the acknowledgement to the Test Suite Driver, and the Test Suite Driver this time validates the acknowledgement (either by another external Test Agent or by itself) and forwards it to the HIS. Finally, the test reports are displayed to the SUT Operators.

- Development of the Interacting Test Suite:
As in Scenario #2, a Web Service message binding is assigned to the test case, which is then assembled into an Interacting Test Suite and archived in the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository.

- Deployment of the new Interacting Test Suite on a Test Suite Driver:
The test designer deploys the Test Suite on a Test Suite Driver.

- Scheduling and Preparing for a Test Run:
The SUT Operator of the HIS and the SUT Operator of the MoH servers are assigned to the roles specified in the test case as senders and receivers.

- Doing an Interaction Test Run:
The Interacting Test Suite (Test Suite Driver) acts as a proxy between the two systems (HIS and MoH servers) and validates both the CDA for Examination message and the acknowledgment message. The execution is monitored by the SUT Operators and the Test Manager through the Test Monitoring and Management GUI. During the execution (which is under the control of the Test Suite Engine), at each step the Messaging Adapters receive the message from the SUT, perform message-level validation, extract the document and deliver it to the Test Suite Engine. The Test Suite Engine, according to the test suite description in TDL form, determines that the document-level validation should be performed by the Test Agent of Scenario #1. Therefore, the Test Suite Driver communicates with the Test Agent of Scenario #1 through the Test Agent Interface/API of that Test Agent. In this communication, the Test Suite Driver delivers the document to the Test Agent and in return receives the validation result in the Test Report format. The Test Suite Engine processes this Test Report and stores it in the local Test Artifact Persistence Manager (TAPM). Meanwhile, the Test Report is sent to the Test Monitoring and Management GUI to be displayed to the SUT Operators while they monitor the test execution. Finally, the message is forwarded to the intended receiver.

- Logging, archiving and download of the Test Report:
The Test Reports are displayed to the SUT Operators.

13.4 Integrating the Healthcare Enterprise (IHE), Scheduled Workflow (SWF) Profile Scenario

In this section, a usage scenario for the coordination of Test Beds as described in Appendix II is presented (in addition to the three common test scenarios identified by GITB: document instance validation, conformance testing and interoperability testing). For this purpose, an Integrating the Healthcare Enterprise (IHE) scenario is selected. IHE is an international organization that focuses on the development of open and global IHE Integration Profiles and on the regional deployment of interoperable IT systems. IHE created and operates a process through which the interoperability of health care IT systems can be improved. The group gathers case requirements, identifies available standards, and develops technical guidelines that manufacturers can implement. For this purpose, IHE publishes integration profiles. IHE Integration Profiles describe a clinical information need or workflow scenario and document how to use established standards (e.g. HL7, DICOM, LOINC) to accomplish it.
A group of systems that implement the same Integration Profile addresses the need/scenario in a mutually compatible way. One of the profiles in the radiology domain is the Scheduled Workflow (SWF) Profile. The Scheduled Workflow Integration Profile establishes the continuity and integrity of basic departmental imaging data. It specifies a number of transactions that maintain the consistency of patient and ordering information as well as providing the scheduling and imaging acquisition procedure steps. One of the processes in the SWF Profile is the Administrative and Procedure Performance Process, where a procedure is scheduled for a patient by a Radiology Information System (RIS) and, after the

scheduled procedure is completed by a modality (e.g. an MR device), the image is stored in a Picture Archiving and Communication System (PACS). The process is displayed in Figure 13-2. First, the Order Filler actor (RIS) schedules a procedure with the Image Manager actor (PACS) through the RAD4 Procedure Scheduled transaction. After that, the Image Modality actor (e.g. an MR device) starts the execution and informs the Image Manager through the RAD6 Modality Procedure Step In Progress transaction. Once the image is ready, the Image Modality stores the image to the Image Manager through the RAD8 Modality Images Stored transaction. Finally, the Image Modality completes the session with the RAD7 Modality Procedure Step Completed transaction.

[Figure 13-2: A part of the Administrative and Procedure Performance Process of the SWF Profile. The figure shows the Order Filler (RIS), Image Manager (PACS) and Image Modality (e.g. MR) actors exchanging the RAD4 Procedure Scheduled, RAD6 Modality Procedure Step In Progress, RAD8 Modality Images Stored and RAD7 Modality Procedure Step Completed transactions.]

In this process, the RAD4 Procedure Scheduled transaction uses HL7 v2 messages and the rest of the transactions use Digital Imaging and Communications in Medicine (DICOM) messages.

Assume that a vendor (say VendorPACS) has implemented a PACS SUT and wants to test its product against the Administrative and Procedure Performance Process of the SWF Profile. For this purpose, VendorPACS applies to a test company (CompanyTest), which has a Test Suite Driver, to realize this test. CompanyTest is highly specialized in the HL7 domain and its Test Suite Driver supports HL7 v2 messaging, but CompanyTest does not have expertise in DICOM messages. Therefore, CompanyTest queries the GITB Test Registry/Repository, which is a well-known registry of Test Services, and discovers the TestAgentDicom company, which can provide an interacting Test Service to test the RAD6, RAD7 and RAD8 DICOM-based IHE SWF transactions. CompanyTest examines the Coordination Service Metadata of the discovered Test Service and understands all of the technical details needed to execute the service. If there is any contractual issue (e.g. a service fee), CompanyTest contacts TestAgentDicom and handles these issues externally.

Afterwards, CompanyTest starts to develop a test suite for the testing of the PACS system (Image Manager actor). It should be noted that both CompanyTest's Test Suite Driver and TestAgentDicom's Test Service support the Test Report Format and the Coordination Service Metadata. CompanyTest decides that in the test, four messages will be sent to the SUT (PACS). The first message will be sent by the Test Suite Driver of CompanyTest, as the message is HL7-based, and the rest of the messages will be sent by TestAgentDicom's test service, as they are based on the DICOM standard. Finally, CompanyTest deploys the test suite to its Test Suite Driver and realizes the tests with the PACS system.

[Figure 13-3: Interactions among the Parties. The figure shows VendorPACS (SUT) applying to CompanyTest (Test Suite Driver), CompanyTest querying the GITB Test Registry/Repository and contacting TestAgentDicom (Test Agent), which has published its Test Service in the registry.]

At test run time, the interactions are sequentially as follows (as shown in Figure 13-4):

1. The SUT user (e.g. the implementer of the PACS system) connects to the Test Suite Driver of CompanyTest and starts the test.
2. The test between the PACS system and the Test Suite Driver starts.
3. An HL7-based RAD4 message is sent to the PACS system.
4. Now it is the turn of the DICOM messages.
5. For this purpose, the Test Suite Driver calls the configureTestCaseOrTestSuite() operation of TestAgentDicom and obtains a sessionIdentifier.
6. In this operation, the Test Suite Driver also provides the endpoints to which the RAD6, RAD7 and RAD8 DICOM-based IHE SWF messages are to be sent.
7. The Test Suite Driver calls the startTestSuiteExecution() operation of TestAgentDicom.
8. The startTestSuiteExecution() operation triggers the sending of the three messages (RAD6, RAD8 and RAD7) to the PACS by TestAgentDicom.
9. As the execution of the test for the DICOM-based messages is external to the Test Suite Driver, it checks the status of the execution with the getStatusOfTestSuiteExecution() operation of TestAgentDicom.
10. When the test execution between the PACS and TestAgentDicom's Test Service is completed, the Test Suite Driver gets the test validation result (in the Test Report Format) by calling the getStatusOfTestSuiteExecution() operation.
11. Finally, the whole report (the consolidated RAD4 report and the RAD6, RAD7 and RAD8 report) is displayed to the SUT user. A minimal sketch of this coordination flow follows.
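The following Python sketch models this coordination flow. The operation names come from the scenario above; the RemoteTestAgent stub and its signatures are assumptions of this sketch, since in practice these operations would be Web Service calls described by the Coordination Service Metadata.

    import time

    class RemoteTestAgent:
        """Hypothetical client stub; a real one would wrap the Web Service calls."""
        def configure_test_case_or_test_suite(self, sut_endpoints: dict) -> str:
            return "session-1"                        # sessionIdentifier (steps 5-6)
        def start_test_suite_execution(self, session_id: str) -> None:
            pass                                      # triggers RAD6/RAD8/RAD7 (steps 7-8)
        def get_status_of_test_suite_execution(self, session_id: str) -> dict:
            return {"completed": True, "report": {}}  # status and report (steps 9-10)

    def run_swf_test(agent: RemoteTestAgent, pacs_endpoints: dict, rad4_report: dict) -> dict:
        """Drive the external DICOM test execution and consolidate the reports."""
        session_id = agent.configure_test_case_or_test_suite(pacs_endpoints)
        agent.start_test_suite_execution(session_id)
        while True:                       # the DICOM execution is external, so poll
            status = agent.get_status_of_test_suite_execution(session_id)
            if status["completed"]:
                break
            time.sleep(10)
        # Step 11: consolidate the local RAD4 report with the remote RAD6-RAD8 report.
        return {"RAD4": rad4_report, "RAD6-8": status["report"]}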

[Figure 13-4: Run-time Interactions. The figure shows the run-time message flow between CompanyTest (Test Suite Driver), TestAgentDicom's Test Services (Test Agent) and VendorPACS (SUT): (1) start test, (3) RAD4 message, (5) ConfigureTestCaseOrTestSuite(), (7) StartTestSuiteExecution(), (8) RAD6, RAD7 and RAD8 messages, (9, 10) GetStatusOfTestSuiteExecution(), (11) display Test Report.]

14 Use Case 3: Public Procurement CEN/BII Scenarios

In this section, business and testing requirements for the PEPPOL use case will be defined:

1. ebusiness Scenarios: This section takes the business user's point of view.
2. Testing Requirements: What and how to test? This section takes the perspective of the people testing the business scenarios, typically business users responsible for implementation or the integrators or software vendors working with the business users.
3. Testing Solutions: How to use and apply the GITB Testing Framework? This section takes the perspective of the software architect responsible for utilizing the GITB framework to set up the appropriate Test Services and Test Artifacts to support the Testing Scenarios.

14.1 ebusiness Scenarios

PEPPOL (Pan-European Public Procurement Online) aims to implement common standards enabling EU-wide public electronic procurement. Existing national systems of electronic public procurement should be linked so that all participants can enjoy the full benefits of a single European market. PEPPOL is a project operated under the European Commission's Competitiveness and Innovation Framework Programme's ICT Policy Support Programme.

The broader vision of PEPPOL is that any company in the EU can communicate electronically with any EU governmental institution for pre-award and post-award electronic procurement activities. PEPPOL should allow any supplier in the EU to respond to any European public tender and conduct any resulting purchasing utilizing their existing national infrastructure, as shown in Figure 14-1.

[Figure 14-1: PEPPOL Architecture]

As shown in the figure above, the PEPPOL project is structured in four main building blocks, covering part of the public procurement processes in Europe from eTendering to eInvoicing. It defines a common infrastructure for digital signature support and a transport infrastructure to enable document exchange. There are two sets of PEPPOL specifications:

- Business Interoperability Specifications (BIS)
- Transport Profiles

BIS are based on the CEN Workshop Agreement CWA 16073 on Business Interoperability Interfaces (BII). This CWA addresses the standardization of the data exchange within an infrastructure shared by business partners. The focus is the semantics of the public procurement business processes, built from XML-based vocabularies and expressed as a set of profile descriptions. A profile description is a technical specification describing:

- The choreography of the business processes.
- The business rules governing the execution of these business processes.
- The information content of the electronic business transactions exchanged.

The CWA does not define how documents should be exchanged between parties; hence, transport profiles have been defined in the PEPPOL project to cover document exchange.

Table 14-1: ebusiness Specifications in PEPPOL

  Business Process:
    Business process specification documentation and message choreography based on CEN BII Profiles (CWA 16073).
  Business Documents:
    Business documents: UBL XML documents based on CEN BII core transaction data models.
    Attribute values: CEN BII defined Genericode code lists.
  Transport and Communication (Messaging) Protocols:
    PEPPOL Transport Profiles: the START protocol for transport and communication between service providers (4-corner model); the SMP/SML protocols to enable address discovery and routing; the LIME protocol for user to service provider communication.
  Profiles:
    PEPPOL Business Interoperability Specifications (process choreography and syntactic and semantic restrictions on data contents); PEPPOL Transport Profiles.

A minimal sketch of a Genericode code list check against a UBL document follows.
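As an illustration only, the sketch below checks one UBL document value against a CEN BII Genericode code list using lxml. The element checked (DocumentCurrencyCode) and the simplifying assumption that every SimpleValue in the code list file is an allowed code are choices of this sketch.

    from lxml import etree

    GC_NS = "{http://docs.oasis-open.org/codelist/ns/genericode/1.0/}"
    CBC_NS = "{urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2}"

    def genericode_values(codelist_path: str) -> set:
        """Collect the code values from a Genericode (.gc) code list file."""
        tree = etree.parse(codelist_path)
        return {v.text for v in tree.iter(GC_NS + "SimpleValue")}

    def check_currency_code(invoice_path: str, codelist_path: str) -> bool:
        """True if the invoice's DocumentCurrencyCode appears in the code list."""
        allowed = genericode_values(codelist_path)
        code = etree.parse(invoice_path).findtext(".//" + CBC_NS + "DocumentCurrencyCode")
        return code in allowed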

14.2 Testing Requirements What and How to Test?

14.2.1 Verification Scope (What to Test?)

PEPPOL can be seen from different perspectives:
- From a buyer or seller point of view, being PEPPOL conformant means that their system is capable of running a business process defined in a Business Interoperability Specification (BIS), following the defined choreography and exchanging documents within the constraints specified in the normative data models.
- From a service provider point of view, claiming PEPPOL conformance means being able to provide transport services to every other PEPPOL service provider by means of the PEPPOL transport infrastructure.

Business Interoperability Specifications

Business Interoperability Specifications are specifications for common processes in PEPPOL based on the CWA BII Profiles. They describe the choreography of the messages exchanged and the contents of the documents being exchanged. For every BIS implemented in an information system, the GITB must verify:
- The correct sequence or choreography of the messages exchanged among the parties
- The validity of the syntax or structure of the documents being exchanged
- The validity of the business rules specified for the contents of the documents

Pre Award Business Processes

These are the processes executed before the awarding of a contract in a public procurement context. Pre award processes are focused on ensuring equal treatment and non-discrimination, to facilitate fairer and more effective competition in the European market by enabling suppliers to compete in an open and transparent way. PEPPOL has focused on two main areas of standardization within pre award business processes:
- Providing interoperable solutions for economic operators in any European country to utilize company qualification information and to submit this information electronically to a contracting authority in any Member State when these economic operators decide to apply for public contracts. This qualification information is known in the PEPPOL project as the Virtual Company Dossier (VCD).
- Pre-award catalogues to allow a common definition of the catalogue information used when defining and bidding for a call for tenders.

Post Award Business Processes

Post award business processes are primarily intended for public procurement, but they can also be used in private procurement. The business processes covered by PEPPOL in this phase of the procurement are:
1. Electronic catalogue. Exchanging catalogue information from seller to buyer is specified in BIS 1a.
2. Electronic ordering. The ordering process is defined in BIS 3a and covers single order submission.
3. Electronic invoicing. PEPPOL has defined two different BIS for the invoicing business process, 4a and 5a. The former is a specification for submitting an invoice only, and the latter covers invoice submission and dispute resolution.

Apart from these process specifications, PEPPOL has defined a cross-business process specification: BIS 6a. This specification covers two business processes, ordering and invoicing. It is a complex profile and serves as the sample for this PEPPOL testing scenario; it is detailed in the section "Business Interoperability Specification sample: BIS 6a" below.

Transport Infrastructure

The goal of the PEPPOL transport infrastructure is to provide secure and reliable document exchange to trading partners across borders in PEPPOL participant countries. Each country or trading area can provide an access point that uniquely identifies companies and government agencies within its domain. An objective of the PEPPOL transport infrastructure is to protect investment in national infrastructures and document formats, creating a secure way to exchange documents across borders while leveraging the current state of the art in the deployment of national solutions and infrastructures.

Figure 14-2: PEPPOL Transport Infrastructure

As a consequence, how documents are sent within the participating countries or trading areas to and from a PEPPOL Access Point (AP), and in what format, is not in the scope of the PEPPOL project. The verification scope for transport profiles therefore only covers the behaviour of the Access Points when they:
- Perform endpoint lookup for address discovery
- Exchange documents securely and reliably

Infrastructure Components

The PEPPOL transport infrastructure has three main components:
- Access Point (AP): The access point is the entry point into the PEPPOL infrastructure.
- Service Metadata Publisher (SMP): A service metadata publisher offers a service on the network where information about the services of specific participant businesses can be found and retrieved. A client application must retrieve the metadata about the services of a target participant business before the client can use those services to send messages to that participant business.
- Service Metadata Locator (SML): A service which provides a client with the capability of discovering the Service Metadata Publisher endpoint associated with a particular participant identifier. A client uses this service to find where information is held about the services of a particular participant business.

Figure 14-3: Transport and Delivery through PEPPOL

Endpoint Lookup

When exchanging documents across the PEPPOL infrastructure, an Access Point has to look up the receiver's capabilities and technical endpoint information:
1. The sender creates the document, specifying endpoint information for the receiver and the identification of the document and its customization, if any.
2. The sender's Access Point receives the document and extracts the receiver endpoint identifier.
3. It queries the SML with this endpoint data and retrieves the URI of the SMP responsible for storing the receiver's capability information.
4. The sender's Access Point creates a query for the SMP with the document identifier and the receiver's identifier.
5. The SMP replies with the URI of the receiver's Access Point.

Figure 14-4: Endpoint Lookup
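The two-step lookup above can be summarized in code. The following is an illustrative, non-normative sketch: the service URLs, query formats and helper names are invented for the example, and in a real PEPPOL deployment the SML resolution is DNS-based rather than a plain HTTP query.

```python
# Illustrative sketch of the two-step PEPPOL endpoint lookup.
# All URLs and query formats below are hypothetical placeholders.

import urllib.request


def resolve_smp_uri(sml_base_url: str, participant_id: str) -> str:
    """Step 3: ask the SML which SMP holds the receiver's capabilities."""
    # A real SML is resolved via DNS; HTTP is used here only to keep
    # the sketch self-contained.
    url = f"{sml_base_url}/lookup?participant={participant_id}"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode().strip()  # URI of the responsible SMP


def resolve_receiver_ap(smp_uri: str, participant_id: str, document_id: str) -> str:
    """Steps 4-5: ask the SMP for the receiver's Access Point endpoint."""
    url = f"{smp_uri}/{participant_id}/services/{document_id}"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode().strip()  # URI of the receiver's AP


# Usage: the sender AP resolves the receiver AP before sending via START.
# smp = resolve_smp_uri("https://sml.example.org", "0088:receiver-gln")
# ap = resolve_receiver_ap(smp, "0088:receiver-gln", "urn:...::Order")
```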

Document Exchange

PEPPOL does not mandate any specific protocol for document submission or delivery within the same community of users. The requirement is to use an Access Point as the entry point to the PEPPOL infrastructure and the START protocol when exchanging documents between Access Points.

Figure 14-5: Document Exchange

The sequence for exchanging documents is the following:
- The sender (C1) submits the document to its Access Point (AP1) using its own protocol.
- AP1 uses the document metadata to create a query to look up the address of the receiver's Access Point (AP2).
- The SML/SMP replies with the endpoint of the receiver's AP2.
- If the receiver is able to receive this document, AP1 submits the document to AP2 using the START protocol.
- AP2 forwards the document to C2 using their own communications protocol.

Business Interoperability Specification sample: BIS 6a

PEPPOL BIS 6a (Procurement) is based on the CEN BII 06 Profile (CWA 16073) and covers the eordering and einvoicing processes. The ordering process involves the transmission of order and order response documents between trading partners. The billing process involves the transmission of invoice and optional credit note and corrective invoice documents between trading partners. At the technical level, the document models are bound to the UBL 2.0 syntax. Technical validation can therefore be done using the UBL 2.0 XML Schema, with the additional CEN ISSS WS/BII and PEPPOL business rules applied through the use of Schematron.
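As an illustration of this two-layer validation, the following sketch uses the lxml library to apply an XSD schema and Schematron rule files to an invoice. The file names are taken from the artifact lists used later in this use case; the file paths and the function itself are assumptions, not a GITB-defined interface.

```python
# Minimal sketch of two-layer document validation: structural validation
# against the UBL 2.0 XML Schema, then business-rule validation with
# Schematron. Artifact files are assumed to be in the working directory.

from lxml import etree
from lxml.isoschematron import Schematron


def validate_invoice(invoice_path: str) -> list:
    doc = etree.parse(invoice_path)
    errors = []

    # Layer 1: syntax/structure against the UBL 2.0 schema.
    schema = etree.XMLSchema(etree.parse("UBL-Invoice-2.0.xsd"))
    if not schema.validate(doc):
        errors.extend(str(e) for e in schema.error_log)

    # Layer 2: CEN BII / PEPPOL business rules expressed in Schematron.
    for rules in ("BIIRULES-UBL-T10.sch", "EUGEN-UBL-T10.sch"):
        schematron = Schematron(etree.parse(rules), store_report=True)
        if not schematron.validate(doc):
            errors.append(f"{rules}: business rule violation")

    return errors  # an empty list means the document passed both layers
```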

Actors/Roles

BIS 6a defines document exchanges between an Economic Operator (supplier) acting in the role of a seller or creditor and a Contracting Authority (customer) acting in the role of a buyer or debtor.

Figure 14-6: PEPPOL Actors and Roles

In this BIS:
- the Customer is the legal person or organization requesting goods or services;
- the Supplier is the legal person or organization that provides a product or service;
- the Buyer is the issuer of the order, acting on behalf of the customer;
- the Seller is acting on behalf of the supplier;
- the Creditor is the issuer of the invoice, meaning that no self-billing or factoring is involved;
- the Debtor receives the invoice and is responsible for settling the debt.

Profile Choreography

Documents need to be exchanged between the Customer and the Supplier using a specific choreography, agreed and supported by both parties. The choreography used in BIS 6a is defined by the CEN BII profile BII06 and contains two business processes:
1. The ordering process, which contains two business collaborations, the Ordering collaboration and the OrderResponse collaboration.
2. The billing process, which contains two business collaborations, the Invoicing collaboration and the ResolveInvoiceDispute collaboration.

The following diagram shows the relationships between the collaborations of each business process implemented by the profile. The choreography of business collaborations defines the sequence of interactions when the profile is run within its context.

Figure 14-7: PEPPOL BIS 6a Choreography

Each sequence of interactions can be understood as a run-time scenario.

Ordering collaboration: The BiiColl001 collaboration contains the BII transaction BiiTrns001, which includes a data model specifying the information to be sent in any Order from the buyer to the supplier. The buyer activates the Ordering collaboration to order goods or services from the seller.

Figure 14-8: Ordering Collaboration

Order response collaboration: The BiiColl003 collaboration contains the BII transactions BiiTrns002 and BiiTrns003, which include the data models used by the supplier to inform the buyer whether the order has been accepted or rejected.

Figure 14-9: Order Response Collaboration

Invoicing collaboration: The BiiColl004 collaboration contains the BII transaction BiiTrns010, which includes a data model specifying the information to be sent in an Invoice from the supplier to the buyer.

Figure 14-10: Invoicing Collaboration

ResolveInvoiceDispute collaboration: The BiiColl005 collaboration contains the BII transactions BiiTrns014 and BiiTrns015, which include the data models specifying the information to be sent in any CreditNote and CorrectiveInvoice from the supplier to the buyer. Any invoice dispute is resolved outside of this BIS description.

Figure 14-11: Resolve Invoice Dispute Collaboration

Transaction sequence

BIS 6a follows a sequence of transaction exchanges between the customer and the supplier in their different roles. The order of the transactions follows the process choreography specified in the section above.

Figure 14-12: CEN BII 06 Transaction Sequence
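A test bed checking this choreography could model it as a small state machine over the observed document types, as in the following sketch. The transition table is a simplified reading of the diagrams above (collaboration-level branching is reduced to document types) and is not a normative encoding of BII06.

```python
# Hypothetical choreography check for BII06: each received transaction
# must be a legal successor of the previous one. The transition table is
# a simplified, non-normative reading of Figure 14-12.

ALLOWED_NEXT = {
    "start": {"Order"},                                      # BiiTrns001
    "Order": {"OrderResponseSimple"},                        # BiiTrns002/003
    "OrderResponseSimple": {"Invoice", "end"},               # rejection ends here
    "Invoice": {"CreditNote", "CorrectiveInvoice", "end"},   # BiiTrns014/015
    "CreditNote": {"end"},
    "CorrectiveInvoice": {"end"},
}


def check_choreography(observed: list) -> bool:
    """Return True if the observed document sequence follows the profile."""
    state = "start"
    for doc_type in observed + ["end"]:
        if doc_type not in ALLOWED_NEXT.get(state, set()):
            print(f"Choreography violation: {doc_type!r} after {state!r}")
            return False
        if doc_type != "end":
            state = doc_type
    return True


# check_choreography(["Order", "OrderResponseSimple", "Invoice", "CreditNote"])
```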

14.2.2 Testing Environment (How to Test?)

Testing context

There are different testing contexts for the PEPPOL-related parties:
1. During the standardization process in the CEN BII Workshop, test cases and artifacts should be produced to ensure the future compliance of real-life projects such as PEPPOL or eprior with the Profiles defined within the Workshop. When creating the PEPPOL BIS, PEPPOL consortium developers should test against these artifacts to ensure the CEN BII Profiles are properly used, promoting interoperability among different electronic procurement projects.
2. When developing PEPPOL transport infrastructure services such as Access Points or SMP registries, service providers should test their implementations for conformance to the PEPPOL transport profiles. Interoperability tests should be run for these prospective service providers before they join the PEPPOL infrastructure.
3. When implementing a BIS in an ERP or ICT system, an ERP vendor or system integrator should use PEPPOL conformance testing to claim conformance to that BIS.
4. Interoperability tests could be run when onboarding new users to PEPPOL who use different ERP systems and service providers. An end user would benefit from interoperability testing with its customers and providers.

Testing integration in the business environment

Depending on the testing context, different testing systems can be used by the different PEPPOL user communities:
- Testing system is the in-production system: End users could benefit from an interoperability testing system allowing them to ensure their in-production systems can reach those of their customers and providers through the PEPPOL infrastructure.
- Testing system is a non-production system: For service providers and ERP system vendors or system integrators, where the testing process may run in parallel with the development process, non-production systems should be used to test conformance to the PEPPOL specifications.
- No integration in the business environment: Manual testing should be done when defining PEPPOL specifications based on CEN BII Profiles. The standardization process run in PEPPOL is not part of the business environment and is only concerned with CEN BII Profile conformance. Manual interoperability tests can also be done between PEPPOL and other large projects such as eprior.

Testing location

Considering how to test buyers and sellers, there may be different types of testing locations, all supported by the Testing Framework.
- Test subscription procedure: In this case, the tests are realized in the local setting of the participating parties, typically using in-production SUTs. The SUT Operators do not need to be directly involved in the testing process of their SUT: they defer the testing process,

e.g. via a subscription process, to a third-party Test Manager who remotely drives and executes the test cases. The SUT Operators are only informed about the test results.
- Web-based remote self-testing: In this case, the SUT Operators can test their system (either their in-production system or a test version) whenever and wherever they want. The SUT Operators connect to the GITB Test Bed, execute the test suite/cases and get the test results.
- Staged testing workshops: In this setting, SUT Operators (or the vendors providing these SUTs) meet either face-to-face or virtually for interoperability sessions over test versions of their SUTs. These interoperability sessions do not test the transport mechanisms between buyers and sellers, as their relationship goes through a service provider that routes the messages through the PEPPOL infrastructure; the sessions only cover the choreography of the documents exchanged between the parties.

Similar testing procedures can be implemented to test Access Points and SMP repositories for address discovery and document exchange using the PEPPOL infrastructure. In this case, interoperability sessions between different access points are especially relevant to ensure proper delivery of documents across the PEPPOL infrastructure.

14.3 Testing Scenarios How to Use GITB?

This section describes how the GITB system can be used to test the PEPPOL infrastructure. As mentioned in the Testing Scenarios section above, three types of testing scenarios are envisaged in the GITB Framework: (1) document validation through a Web site, (2) conformance testing with an interacting test suite and (3) interoperability testing with an interacting test suite. In the following subsections, these are tailored to the PEPPOL usage scenario.

Scenario 1: Creating, Deploying and Operating a Validating Test Suite for Conformance to a PEPPOL Profile

Identify Test Scope: Application Roles and Interactions involved: The organizational process (profile choreography) for a PEPPOL Business Interoperability Specification is defined by the corresponding CEN BII profile. To illustrate this scenario, BIS 6a is used as the sample. BIS 6a contains two business processes: (1) the ordering process, which contains two business collaborations, the Ordering collaboration and the OrderResponse collaboration, and (2) the billing process, which contains two business collaborations, the Invoicing collaboration and the ResolveInvoiceDispute collaboration. Assume that a software company, called SoftwareCompanyA, develops an ERP system for a buyer organization, called BuyerA. SoftwareCompanyA claims that the ERP system is conformant to the PEPPOL Business Interoperability Specification Procurement 06a. The business collaboration names, business transaction names and document names used in these collaborations are as follows:
- Ordering (BiiColl001)
  - Submit Order (BiiTrns001): SubmitOrder (BiiCoreTrdm001)
- OrderResponse (BiiColl003)
  - Reject Order (BiiTrns002): RejectOrder (BiiCoreTrdm002)
  - Accept Order (BiiTrns003): AcceptOrder (BiiCoreTrdm003)

- Invoicing (BiiColl004)
  - Submit Invoice (BiiTrns010): SubmitInvoice (BiiCoreTrdm010)
- ResolveInvoiceDispute (BiiColl005)
  - Correct With Credit (BiiTrns014): CorrectWithCredit (BiiCoreTrdm014)
  - Correct With Invoice (BiiTrns015): CorrectWithInvoice (BiiCoreTrdm015)

In this Scenario #1, SoftwareCompanyA uploads its documents for testing to the GITB.

Search for existing Test Cases for the Message Types involved: The Test Designer queries the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository with the above keywords for business collaboration, business transaction and business document names. However, the Test Designer does not find any Test Case. Therefore, the Test Designer develops new Test Cases using the Test Suite/Case Editor. In this design, six new test cases (one for each business document) are implemented. The test cases use the following XSD documents for structure validation and Schematron files for semantic verification:
- BiiCoreTrdm001: UBL-Order-2.0.xsd; BIIRULES-UBL-T01.sch and EUGEN-UBL-T01.sch
- BiiCoreTrdm002: UBL-OrderResponseSimple-2.0.xsd
- BiiCoreTrdm003: UBL-OrderResponseSimple-2.0.xsd
- BiiCoreTrdm010: UBL-Invoice-2.0.xsd; BIICORE-UBL-T10.sch, BIIPROFILES-UBL-T10.sch, BIIRULES-UBL-T10.sch and EUGEN-UBL-T10.sch
- BiiCoreTrdm014: UBL-CreditNote-2.0.xsd; BIICORE-UBL-T14.sch, BIIPROFILES-UBL-T14.sch, BIIRULES-UBL-T14.sch and EUGEN-UBL-T14.sch
- BiiCoreTrdm015: UBL-Invoice-2.0.xsd; BIICORE-UBL-T15.sch, BIIPROFILES-UBL-T15.sch, BIIRULES-UBL-T15.sch and EUGEN-UBL-T15.sch

Extension of Test Cases for a new Profile: As the test cases are developed from scratch in this case, this step does not apply.

Packaging of all Profile Test Cases into a new Validating Test Suite: The Test Designer assembles all the Test Cases into a "validating" Test Suite, which will be invoked for each document to validate.

Deployment of the new Validating Test Suite on a Test Agent: The Test Designer then deploys the Test Suite on a Test Agent using the Test Deployment Manager.

Registration (archiving) of the new Validating Test Suite: After some preliminary QA, the Test Designer registers this new Test Suite with the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository, so that it can be archived, discovered later, and re-deployed if necessary.
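The six validating Test Cases can be thought of as a mapping from document type to validation artifacts, as in the sketch below; the dictionary structure is illustrative only and is not a GITB artifact format. Dispatch then reuses the two-layer validation shown earlier.

```python
# Illustrative registry of the six validating Test Cases: each business
# document type maps to its XSD and Schematron artifacts listed above.

VALIDATION_ARTIFACTS = {
    "BiiCoreTrdm001": ("UBL-Order-2.0.xsd",
                       ["BIIRULES-UBL-T01.sch", "EUGEN-UBL-T01.sch"]),
    "BiiCoreTrdm002": ("UBL-OrderResponseSimple-2.0.xsd", []),
    "BiiCoreTrdm003": ("UBL-OrderResponseSimple-2.0.xsd", []),
    "BiiCoreTrdm010": ("UBL-Invoice-2.0.xsd",
                       ["BIICORE-UBL-T10.sch", "BIIPROFILES-UBL-T10.sch",
                        "BIIRULES-UBL-T10.sch", "EUGEN-UBL-T10.sch"]),
    "BiiCoreTrdm014": ("UBL-CreditNote-2.0.xsd",
                       ["BIICORE-UBL-T14.sch", "BIIPROFILES-UBL-T14.sch",
                        "BIIRULES-UBL-T14.sch", "EUGEN-UBL-T14.sch"]),
    "BiiCoreTrdm015": ("UBL-Invoice-2.0.xsd",
                       ["BIICORE-UBL-T15.sch", "BIIPROFILES-UBL-T15.sch",
                        "BIIRULES-UBL-T15.sch", "EUGEN-UBL-T15.sch"]),
}


def artifacts_for(document_type: str):
    """Look up the schema and rule files for one uploaded document type."""
    xsd, schematrons = VALIDATION_ARTIFACTS[document_type]
    return xsd, schematrons
```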

Early Users test the new Validating Test Suite: In these test cases, a Web-based GUI (the Test Monitoring and Management GUI) is displayed to the user; through this GUI, the user uploads his documents and selects, one by one, the document type against which the validation is to be realized. After that, the structural and semantic validation is performed by the corresponding Document Validators and the evaluation result is displayed to the user. The evaluation result is also stored in the local Test Artifact Persistence Manager (TAPM) by the Test Suite Engine.

Early Users file some Issues about the new Validating Test Suite: In return, the SUT Operator gets some error messages concerning his/her document. The operator may also discover that the Test Case itself was not correct in flagging some errors, and that the Test Case needs to be corrected. The operator files an issue about the need to update the Test Case.

Scenario 2: Creating, Deploying and Executing an SUT Interacting Test Suite for Conformance to a PEPPOL Profile

Basic Test Case Design: Assume the same scenario described in Scenario #1. In this case, the SUT Operator will send actual UBL messages to the Test Suite Driver. Note that the Test Suite Driver of the previous Scenario #1 becomes a Test Agent in this Scenario #2. It was always a Test Agent, but in Scenario #1 it was driven by a Web browser and related Web server sending requests for testing. The Test Suite Driver will use the validation capability of the Test Agent of Scenario #1 when executing its interacting test suite. In other words, the Test Suite Driver will receive the messages from the ERP system (SUT) of SoftwareCompanyA, extract the UBL document from each message, validate the document with the help of the Test Agent of Scenario #1 and proceed to the next step in the process. At the end of the process, the validation results will be displayed to the SUT User. In this scenario, the Test Designer designs such a Test Case and packs it into an interacting Test Suite. Using the same sample BIS 6a, SoftwareCompanyA will send an Order message to the Test Suite Driver. The Test Suite Driver will then accept the order with an OrderResponseSimple and will send an Invoice document. There will then be a dispute over the Invoice, because it is realized that there is an overcharge. This overcharge is corrected by the Test Suite Driver with a CreditNote document sent to SoftwareCompanyA.

Development of the Interacting Test Suite: As in the previous scenario, the Test Designer queries the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository for an existing template, but none can be found. Therefore, the Test Designer creates a Test Case covering all the business collaborations in the PEPPOL Business Interoperability Specification Profile 06a. The Test Designer creates the Test Case using the Test Suite/Case Editor and saves the resulting test case to the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository in Test Description Language format. The test case description includes the following business collaborations, transactions and documents:
- Ordering (BiiColl001)
  - Submit Order (BiiTrns001): SubmitOrder (BiiCoreTrdm001)
- OrderResponse (BiiColl003)
  - Reject Order (BiiTrns002): RejectOrder (BiiCoreTrdm002)

  - Accept Order (BiiTrns003): AcceptOrder (BiiCoreTrdm003)
- Invoicing (BiiColl004)
  - Submit Invoice (BiiTrns010): SubmitInvoice (BiiCoreTrdm010)
- ResolveInvoiceDispute (BiiColl005)
  - Correct With Credit (BiiTrns014): CorrectWithCredit (BiiCoreTrdm014)
  - Correct With Invoice (BiiTrns015): CorrectWithInvoice (BiiCoreTrdm015)

Furthermore, the Test Designer binds Web Service Messaging Adapters to each step of the test case.

Deployment of the new Interacting Test Suite on a Test Suite Driver: The Test Designer then deploys the Test Suite on a Test Suite Driver using the Test Deployment Manager.

Scheduling and Preparing for a Test Run: The Test Manager informs the SUT Operator about the availability of conformance testing for the PEPPOL Business Interoperability Specification Procurement 06a profile.

Doing an Interaction Test Run: The SUT Operator (or Test Manager) logs in to the Testing Framework through the Test Monitoring and Management GUI and selects the Test Suite/Case for the PEPPOL Business Interoperability Specification Procurement 06a profile. It should be noted that this Test Suite/Case is still in template form and must be configured to communicate with the ERP system of SoftwareCompanyA. Therefore, the SUT Operator configures/instantiates the Test Suite/Case using the Test Suite/Case Instantiator tool and stores the resulting Test Suite/Case instance in the local Test Artifact Persistence Manager (TAPM). After that, either the Test Manager or the SUT Operator executes the Test Suite/Case instance. The execution is monitored by the SUT Operator and the Test Manager through the Test Monitoring and Management GUI. During the execution (which is under the control of the Test Suite Engine), at each step the Messaging Adapters receive the message from the SUT, perform message-level validation, extract the document and deliver it to the Test Suite Engine. The Test Suite Engine, according to the test suite description in TDL form, determines that the document-level validation should be performed by the Test Agent of Scenario #1. Therefore, the Test Suite Driver communicates with the Test Agent of Scenario #1 through the Test Agent's interfaces/API. In this communication, the Test Suite Driver delivers the document to the Test Agent and in return receives the validation result in Test Report Format. The Test Suite Engine component of the Test Suite Driver then processes this Test Report and stores it in the local Test Artifact Persistence Manager (TAPM). Meanwhile, this Test Report is sent to the Test Monitoring and Management GUI to be displayed to the SUT Operator as part of the monitoring of the test execution.

Logging, archiving and download of the Test Report: After that, the validation report from the Test Agent of Scenario #1 and the other validation logs/reports concerning message-level verification are displayed to the SUT Operator.
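The per-step flow described above can be summarized in the following sketch. All class and method names are hypothetical, since GITB does not prescribe a concrete API for the Test Suite Driver, Messaging Adapters or Test Agents; the sketch only mirrors the division of labour described in this scenario.

```python
# Hedged sketch of one step of the interacting Test Suite: the Messaging
# Adapter hands over a received message, the envelope is checked, and the
# UBL payload is extracted and delegated to the document-validating Test
# Agent of Scenario #1. All names here are hypothetical.

class InteractingTestStep:
    def __init__(self, messaging_adapter, document_validator, report_store):
        self.adapter = messaging_adapter        # receives SUT messages
        self.validator = document_validator     # Test Agent of Scenario #1
        self.reports = report_store             # local TAPM facade

    def execute(self, expected_doc_type: str):
        message = self.adapter.receive()                  # message-level I/O
        self.adapter.validate_envelope(message)           # message-level checks
        document = self.adapter.extract_payload(message)  # UBL document
        report = self.validator.validate(document, expected_doc_type)
        self.reports.store(report)                        # archive in TAPM
        return report                                     # shown in the GUI
```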

Scenario 3: Creating, Deploying and Executing an SUT Interacting Test Suite for Interoperability between Several SUTs based on a PEPPOL Profile

Basic Test Case Design: In this case, interoperability testing of the PEPPOL Business Interoperability Specification Procurement 06a profile is realized. Assume that SoftwareCompanyA passes the first two Scenarios. Assume further that there are the following additional parties:
- A software company called SoftwareCompanyB, which develops an ERP system for a seller organization.
- A service provider called ServiceProviderC, providing an Access Point service to the PEPPOL infrastructure.
- A service provider called ServiceProviderD, also providing an Access Point service.
- A service provider called ServiceProviderF, providing Service Metadata Publishing (SMP) services.

In this interoperability testing, the following layout is used:
- ServiceProviderC is the access point of SoftwareCompanyA.
- ServiceProviderD is the access point of SoftwareCompanyB.
- ServiceProviderF is the SMP service provider for both access points.

SoftwareCompanyA (buyer) will send an Order message to SoftwareCompanyB (seller). For this purpose, SoftwareCompanyA will first submit the Order document to its access point ServiceProviderC using a Web Services protocol. Then, ServiceProviderC will use the Order document metadata to create a query to the PEPPOL SML service. This service will resolve the query and return the address of the recipient's SMP, ServiceProviderF. Afterwards, ServiceProviderC queries ServiceProviderF, sending document and profile metadata, and ServiceProviderF replies with the endpoint of the receiver's access point (ServiceProviderD). ServiceProviderC submits the document to ServiceProviderD using the START protocol, and ServiceProviderD forwards the document to SoftwareCompanyB using a Web Services communications protocol. Then, SoftwareCompanyB will accept the order with an OrderResponseSimple and will send an Invoice document through the same channel. In this case, SoftwareCompanyB will use its access point ServiceProviderD to look up the address of ServiceProviderC. The SMP service is shared in this scenario and provided by ServiceProviderF. After that, there will be a dispute over the Invoice, as there is an overcharge. This overcharge is corrected by SoftwareCompanyB with a CreditNote document sent to SoftwareCompanyA. Furthermore, the Test Designer wants to test cross-document validation; therefore, the following constraints must be added to the test case description:
1. Invoice references Order: an Invoice must reference the Order.
2. CreditNote references Invoice: a CreditNote must reference the Invoice that is being corrected.

In these exchanges, the Test Suite Driver acts as a proxy and forwards the messages without any modification to the intended receivers.
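These two cross-document constraints could be implemented as XPath checks over the exchanged UBL payloads, as sketched below. The element paths follow common UBL 2.0 usage (OrderReference on the Invoice, BillingReference on the CreditNote) and should be confirmed against the BIS 6a data models before being relied on.

```python
# Sketch of the two cross-document constraints using XPath over UBL 2.0
# payloads. Element paths are assumptions based on common UBL usage.

from lxml import etree

NS = {
    "cac": "urn:oasis:names:specification:ubl:schema:xsd:CommonAggregateComponents-2",
    "cbc": "urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2",
}


def invoice_references_order(invoice_doc, order_id: str) -> bool:
    """Constraint 1: the Invoice must reference the Order it bills."""
    refs = invoice_doc.xpath("//cac:OrderReference/cbc:ID/text()", namespaces=NS)
    return order_id in refs


def creditnote_references_invoice(creditnote_doc, invoice_id: str) -> bool:
    """Constraint 2: the CreditNote must reference the corrected Invoice."""
    refs = creditnote_doc.xpath(
        "//cac:BillingReference/cac:InvoiceDocumentReference/cbc:ID/text()",
        namespaces=NS)
    return invoice_id in refs


# invoice = etree.parse("invoice.xml")
# assert invoice_references_order(invoice, "Order-2011-001")
```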

The Test Designer, using the Test Suite/Case Editor, creates a Test Suite/Case from scratch and stores it in TDL format in the local Test Artifact Persistence Manager (TAPM) and the GITB Test Registry/Repository.
- Development of the Interacting Test Suite: As in Scenario #2, a Web Service Message Binding is assigned to the Test Case, which is then assembled into an Interacting Test Suite and archived in the Test Repository.
- Deployment of the new Interacting Test Suite on a Test Suite Driver: The Test Designer then deploys the Test Suite on a Test Suite Driver.
- Scheduling and Preparing for a Test Run: The Test Manager informs the SUT Operators about the availability of interoperability testing for the PEPPOL Business Interoperability Specification Procurement 06a profile.
- Doing an Interaction Test Run: The SUT Operators (or the Test Manager) log in to the Testing Framework through the Test Monitoring and Management GUI and select the Test Suite/Case for the PEPPOL Business Interoperability Specification Procurement 06a profile. This Test Suite/Case is still in template form and must be configured to communicate with the ERP systems of SoftwareCompanyA and SoftwareCompanyB. Therefore, the SUT Operators configure/instantiate the Test Suite/Case using the Web-based graphical Test Suite/Case Instantiator tool and store the resulting Test Suite/Case instance in the local Test Artifact Persistence Manager (TAPM). After that, either the Test Manager or the SUT Operators execute the Test Suite/Case instance. The SUT Operators and the Test Manager monitor the execution through the Test Monitoring and Management GUI. During the execution (which is under the control of the Test Suite Engine), at each step the Messaging Adapters receive the message from the SUT, perform message-level validation, extract the document and deliver it to the Test Suite Engine. The Test Suite Engine, according to the test suite description in TDL form, determines that the document-level validation should be performed by the Test Agent of Scenario #1. Therefore, the Test Suite Driver communicates with the Test Agent of Scenario #1 through the Test Agent's interfaces/API. In this communication, the Test Suite Driver delivers the document to the Test Agent and in return receives the validation result in Test Report Format. The Test Suite Engine component of the Test Suite Driver then processes this Test Report and stores it in the local Test Artifact Persistence Manager (TAPM). Meanwhile, this Test Report is sent to the Test Monitoring and Management GUI to be displayed to the SUT Operator as part of the monitoring of the test execution. Finally, the message is forwarded to the intended receiver.
- Logging, archiving and download of the Test Report: The Test Reports are displayed to the SUT Operators.

Part IV: Test Bed Governance and Development Process

15 Governance

The implementation of a distributed GITB infrastructure and a network of test beds requires appropriate governance, covering different aspects. The GITB infrastructure comprises several entities, such as the GITB specifications and test beds as well as the registries and test resources that support them. Each of these entities must be developed and maintained in an appropriate manner. Several models to govern these entities were considered. The following is a description of these entities, the models considered, and a governance recommendation.

15.1 Governance Models

In order to explore how to best manage a distributed GITB infrastructure and network of test beds, governance models are needed. Three governance models are described and considered:
- Single Governance Model: A single organization is responsible for governance. This organization would contain a collection of test bed experts recognized as owning all aspects of GITB. This GITB Organization is a virtual one and will be made up of members of other organizations involved with the creation and testing of document and messaging standards.
- Shared Governance Model: Governance is shared between a GITB Organization and other organizations. These other organizations are test bed providers, SDOs, vertical industry associations, trade associations, etc.; they are referred to as industry consortia in the following descriptions. Clear lines of responsibility must be established and administered.
- Open Governance Model: This is similar to the shared model, but instead of organizations owning and administering test bed entities, the entities are published and freely available to the public. This is somewhat analogous to an Open Source model.

The items or entities that must be managed by all of these models are described in Figure 5-1, the GITB Framework:
- GITB Specification: The document describing the GITB Framework, the specification of what a GITB-compliant test bed should contain.
- GITB Registry / Repository: The facility used to store and manage test resources that are accessible to users.
- Test Bed Instances: The physical implementations of GITB-compliant test beds.
- Test Bed Artifacts: The test cases and test logic related to the ebusiness specifications/profiles to be tested for. They include the document schemas and rules (for business document standards), messaging protocols, business transactions,

choreography, etc. These artifacts could be configured into test suites in order to execute in a particular test environment or business context.
- Test Bed Resources: The services and components in the GITB Framework. The services are the major functional areas that lend themselves to service-based access as general test services. They include the Test Design, Test Deployment, Test Execution, and Test Repository Services. The components are the infrastructure of a GITB-compliant test bed and include the Core Platform, User-facing, and Testing Capability components.

All of these governance entities must be addressed according to their life cycle aspects:
- Author: An organization(s) or person(s) must originate and control the entity.
- Endorsement/Certification: Throughout the life of an entity, modifications will be proposed. These proposals and their eventual implementations may require endorsement or certification by an organization(s) or person(s).
- Operation: An organization(s) or person(s) must be responsible for the implementation and operation of an entity.

15.1.1 Single Governance Model

Under the Single Governance Model, as its name implies, all entities are under the control of a single organization, the GITB Organization. The exception to this complete control would be that other organizations such as SDOs, vertical industry organizations and trade associations (i.e. Industry Consortia) could jointly develop Test Bed Artifacts with the GITB Organization. Endorsing and operating these entities would be done through the GITB Organization. The GITB Organization would host and maintain the test beds and work collaboratively with other organizations to develop test artifacts to support business processes. Table 15-1 describes the model:

Table 15-1: Single Governance Model

15.1.2 Shared Governance Model

Under the Shared Governance Model, governance is shared between the GITB Organization and the Industry Consortia. The idea is that the GITB Organization would still author, endorse, and operate the specification, but other organizations would host and maintain their test bed instances, develop and execute their test suites, and provide the needed services and components for their business processes. The GITB Organization would still endorse test bed instances to ensure compliance with the GITB Specification. Table 15-2 describes the Shared Governance Model.

Table 15-2: Shared Governance Model

The main difference between the Single and the Shared Governance Model, as shown by the two tables, is that the operation of the test beds moves from the GITB Organization to the Industry Consortia organizations, which is reasonable as they are the ones most familiar with their business processes.

15.1.3 Open Governance Model

The Open Governance Model takes the Shared Governance Model several steps further. Under this model, the GITB Organization only controls (authors, endorses, operates) the specification. Anyone can develop a test bed instance with the associated artifacts, services and components under a royalty-free license. Test bed endorsement can be provided by the single organization as well as by 3rd-party providers. Table 15-3 describes the Open Governance Model.

Table 15-3: Open Governance Model

The Open Governance Model allows other organizations to exert more control over the authoring and endorsement aspects of the test bed. Services and components can be developed through this open-source-like approach.

15.2 Governance Recommendation

Of the three models evaluated, the Single Governance Model is deemed not appropriate for managing GITB. One organization controlling all aspects of GITB would be unwieldy and would not be equipped to support the different business processes requiring testing. It would also involve considerable infrastructure and support. The Open Governance Model seems to offer the most flexibility and adaptability to industries and would seem to foster competition among the stakeholders in the testing industry. Test bed providers can set up their own testing business model. In order to be successful, however, an initial reference implementation should be provided as a basis for those desiring a GITB-compliant test bed. Much time could be saved by having a starting point and by proving out the GITB specifications ahead of time. To that end, it is recommended to start with the Shared Governance Model. With this model, a single organization familiar with the GITB specification could work with a standards body, industry consortium, or trade association to develop the initial version of the artifacts, services, components, and registry/repository. Once this reference implementation has been proven, the organizations could move toward making the information available using an open-source-like approach. Therefore, the recommendation is to start with a Shared Governance Model and move toward an Open Governance Model once a reference implementation has been proven.

16 Global Cooperation Model

(to be defined based on discussions with stakeholders, including SDOs, industry consortia, test bed providers, and others)

Part V: Appendices

17 Appendix I: Test Service Interfaces

The services are defined at two levels:
(a) Abstract Interface level: Operations are described independently from protocol considerations or messaging techniques. This is the purpose of this appendix.
(b) Concrete level: Bindings to the most widely practiced protocols are defined (e.g. SOAP-based Web services compliant with WS-I, or REST-based with XML). These bindings are provided in a following appendix.

17.1 Test Design Services

Purpose and Scope: The Design Services defined here are intended to support the persistence of Test Artifacts during the design phase. These "design" services do not pretend to cover every fine-grained operation involved in the design of test artifacts. Detailed editing or design operations are to be supported by specialized tools (e.g. editors).

Service provider: These services are supported by a Test Artifact Persistence Manager (TAPM), a persistent store typically associated with a Test Bed, and in general local to this Test Bed. A Test Bed itself can be said to provide this service, by extension.

Example: A context in which such Test Design services are used is that of an Integrated Development Environment (IDE) that provides detailed editing functions for a Test Suite, then communicates (via its Design service interface) with a TAPM for persistence and association with a Test Bed.

NOTE: The design of this Service assumes the presence of an artifact "header" element that is standard across Test Artifacts. This header contains (see the Artifacts section) the following metadata: author, date, keywords, artifactid, artifactname, status, description.
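For illustration, this standard header could be represented as follows; the field types and defaults are assumptions, since the normative representation is given in the Artifacts section.

```python
# Sketch of the standard artifact header assumed by these Design Services,
# mirroring the metadata fields listed in the NOTE above.

from dataclasses import dataclass, field
from typing import List


@dataclass
class ArtifactHeader:
    artifactid: str          # unique ID, may be generated by the TAPM
    artifactname: str
    author: str
    date: str                # e.g. ISO 8601 date of creation
    status: str              # e.g. draft / approved / deprecated
    description: str = ""
    keywords: List[str] = field(default_factory=list)  # used for discovery
```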

17.1.1 Document Assertion Design Sub-service

This sub-service supports design functions for Document Assertion Set artifacts.

Create a Document Assertion Set
Role: create and register a Document Assertion Set (DAS) in the target TAPM.
Variants:
- Create an empty DAS (only provide its header / metadata).
- Create a new DAS by duplicating an existing one (provide its artifactid and new header / metadata).
- Create a DAS by uploading a DAS definition (provide its definition document).

Read a Document Assertion Set
Role: download a registered DAS from the TAPM.
Variants:
- Provide the DAS ID and get the corresponding DAS in response.
- Provide header keywords and/or an ownerid and get a list of matching DAS headers in response. (A DAS header also contains a manifest of the DAS content: a list of the Document Assertions by their ID / URI.)

Update a Document Assertion Set
Role: modify a registered DAS.
Variants:
- Modify just the header of a registered DAS (provide the replacement header or values).
- Replace the entire content of a registered DAS (provide its definition document).
- Add a "Document Assertion" (e.g. an XML schema, a semantic rule) to a DAS (provide the related definition document).
- Substitute a "Document Assertion" in a DAS (provide the Assertion ID and its new definition document).
- Delete a "Document Assertion" in a DAS (provide the Assertion ID).

Delete a Document Assertion Set
Role: delete a registered DAS.
Variants:
- Delete the DAS (provide the DAS artifact ID).

17.1.2 Test Case Design Sub-service

This sub-service supports design functions for Test Case artifacts.

Create a Test Case
Role: create and register a Test Case in the target TAPM.
Variants:
- Create an empty Test Case (just provide the "header" data).

- Create a Test Case by duplicating an existing one (provide its artifactid, or testsuiteid + (caseid or casename)).
- Create a Test Case by uploading a Test Case definition (provide its definition document).

Read a Test Case
Role: download a registered Test Case from the TAPM.
Variants:
- Provide the caseid, or testsuiteid + (caseid or casename), and get the corresponding Test Case in response.
- Provide keywords and/or ownerid and/or testsuiteid, and get a list of matching Test Cases in response.

Update a Test Case
Role: modify a registered Test Case in the target TAPM.
Variants:
- Modify just the header of a registered Test Case (provide the replacement header / metadata or values, and the caseid or testsuiteid + case id/name).
- Replace the entire content of a registered Test Case (provide its definition document, and the caseid or testsuiteid + case id/name).

Delete a Test Case
Role: remove a Test Case from the target TAPM, but only if it does not belong to a Test Suite (otherwise use Test Suite Update).
Variants:
- Delete the Test Case (provide the caseid).

Configure a Test Case
Role: support various configuration operations over a Test Case stored in a TAPM.
Variants:
- Set the parameters of a Test Case.
- Associate Message Bindings with a Test Case (provide the MessageBindings; this operation overrides previous Bindings, if any).
- Associate some Payload files with a Test Case.

17.1.3 Test Suite Design Sub-service

This sub-service supports design functions for Test Suite artifacts.

Create a Test Suite
Role: create and register a Test Suite in the target TAPM.
Variants:
- Create an empty Test Suite (just provide the "header" data).
- Create a Test Suite by duplicating an existing one (provide its artifactid).
- Create a Test Suite by uploading a Test Suite definition (provide its definition document).

Read a Test Suite
Role: download a registered Test Suite from the TAPM.
Variants:
- Provide the testsuiteid and get the corresponding Test Suite in response.
- Provide keywords and/or an ownerid and get a list of matching Test Suite headers in response.

Update a Test Suite
Role: modify a registered Test Suite in the target TAPM.
Variants:
- Modify just the header of a registered Test Suite (provide the replacement header or values).
- Replace the entire content of a registered Test Suite (provide its definition document).
- Add a Test Case to a Test Suite (provide the Test Case definition document; if the Test Suite has a defined workflow, provide the testcaseid after which this additional Test Case must be inserted).
- Substitute a Test Case in a Test Suite (provide the Test Case ID and its new definition document).
- Delete a Test Case in a Test Suite (provide the Test Case ID).

Delete a Test Suite
Role: delete a registered Test Suite from the target TAPM.

Configure a Test Suite
Role: support various configuration operations over a Test Suite in the target TAPM.
Variants:
- Set the global parameters of a Test Suite.
- Associate Message Bindings with Test Cases in the Test Suite (provide the list of MessageBindings and the IDs of the related Test Cases; this operation overrides previous Bindings, if any).
- Associate some Payload files with some Test Cases in a Test Suite.

17.1.4 Test Configuration Design Sub-service

This sub-service supports design functions for Test Configuration artifacts. Standard CRUD operation designs are provided across the different types of configuration artifact.

Standard CRUD Operations Design for any Configuration Artifact
All configuration artifacts are subject to CRUD operations that share a similar design:
- Create: an empty artifact is created. The header should be provided with at least the artifact ID value (if no ID value is provided, the Service provider will return a generated ID that can be changed later).
- Read: all or part of the header data must be provided (i.e. a "header pattern"). In case several artifact instances match this header pattern, a list of matching artifacts is returned, or a list of headers only in case the full response would be too voluminous.
- Update: a substitution artifact is provided. The updated artifact is identified by its ID.
- Delete: all or part of the header data must be provided (i.e. a "header pattern"). In case several artifact instances match this header pattern, all matching artifacts will be deleted from the TAPM.

CRUD Operations on a Messaging Adapter Configuration Artifact
Role: edit a Messaging Adapter Configuration item to be associated with a particular Messaging Adapter (depends on the messaging protocol).
Variants:
- Access the Adapter Configuration either by its artifactid or by the Connection that refers to it.
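The shared CRUD design can be captured as one generic interface that each configuration artifact type implements, as in the following sketch; the method signatures are illustrative and do not represent a defined GITB binding.

```python
# Sketch of the shared CRUD contract for configuration artifacts, reusable
# for Messaging Adapter Configurations, Connections, Message Bindings,
# Payloads, and Parameter Sets. "Header pattern" matching follows the
# behavior described above; signatures are assumptions.

from abc import ABC, abstractmethod
from typing import List


class ConfigurationArtifactStore(ABC):
    """Generic CRUD contract shared by all configuration artifact types."""

    @abstractmethod
    def create(self, header: dict) -> str:
        """Create an empty artifact; return a (possibly generated) ID."""

    @abstractmethod
    def read(self, header_pattern: dict) -> List[dict]:
        """Return all artifacts whose header matches the pattern
        (headers only if the full response would be too voluminous)."""

    @abstractmethod
    def update(self, artifact_id: str, replacement: dict) -> None:
        """Substitute the artifact identified by its ID."""

    @abstractmethod
    def delete(self, header_pattern: dict) -> int:
        """Delete all matching artifacts; return how many were removed."""
```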

CRUD Operations on a Connection Artifact
Role: edit a Connection item.
Variants:
- Set/replace the Adapter Configuration, set/replace the connection URL, set/replace the Adapter reference.
- Access the Connection Configuration either by its artifactid or by the Message Binding that refers to it.

CRUD Operations on a Message Binding Artifact
Role: edit a Message Binding item.
Variants:
- Set/replace the Payload, set/replace the connection URL, set/replace the message parameters.
- Access the Message Binding either by its artifactid or by the Test Suite/Case that refers to it.

CRUD Operations on a Message Payload Artifact
Role: edit a Message Payload item.
Variants:
- Set/replace the Payload.
- Access the Payload either by its artifactid or by the Test Suite/Case that refers to it.

CRUD Operations on a Test Suite Parameters Set Artifact
Role: edit a Test Suite Parameter Set item.
Variants:
- Set global parameters.
- Set/replace Message Bindings maps.
- Set/replace participant role maps (in case roles are defined).

17.1.5 Test Assertion Design Sub-service

This sub-service supports design functions for Test Assertion artifacts. Standard CRUD operation designs are provided across the different types of Test Assertion artifact.

17.1.6 Issues Tracking Sub-service

This sub-service supports issues tracking and reporting about test artifacts subject to design operations.

17.2 Test Deployment Services

Purpose and Scope: The Deployment Services defined here are intended for deploying Test Suites, Test Cases and Document Assertion Sets, and for configuring components such as Messaging Adapters, so that a Test Bed is ready for test execution.

Service provider: These services target a particular Test Bed and are supported by the Test Deployment Manager of this Test Bed (yet they will involve several other components of the Test Bed).

Example: A context in which such Test Deployment services are used is that of deploying a test suite that involves SUT interaction and invokes a remote Document Validator. The Test Deployment Manager loads the test suite from the TAPM into the Test Suite Engine, ensures that the Messaging Adapter of interest is configured with the Messaging Adapter Configuration for this test suite, and initiates a connection with the remote Document Validator.

17.2.1 General Deployment Status Functions

This sub-service supports the overall deployment status of a Test Bed.

Get Deployment Status of a Test Bed
Role: get all test suites/cases and Document Assertion Sets currently deployed on a Test Bed.
Variants:
- Provide the Test Bed reference and get all test suites and DAS currently deployed, with their resource map.

Undeploy all Test Suites and DAS on a Test Bed
Role: undeploy all test suites/cases and Document Assertion Sets currently deployed on a Test Bed.
Variants:
- Provide the Test Bed reference.

17.2.2 Test Suite Deployment Sub-service

This sub-service supports the overall deployment of Test Suite artifacts.

Deploy a Test Suite Artifact
Role: deploy on a Test Bed a registered test suite from the TAPM.
Variants:
- Provide the testsuiteid and get a test suite instance description to review before deployment.
- Provide a (partial) test suite header, browse the matching test suites and select one to deploy.
- Some deployment alternatives may be proposed (e.g. if a remote Test Agent such as a Document Validator is not available: download of the related Document Assertion Set and local deployment of the Validator).

Get Test Suite Deployment Status
Role: get the deployment status of a test suite deployed on a Test Bed, i.e. the resources and other components used by this test suite and the status of these components.
Variants:
- Provide the testsuiteid, or browse the currently deployed test suites on this Test Bed.

Undeploy a Test Suite
Role: undeploy a test suite deployed on a Test Bed.
Variants:
- Provide the testsuiteid, or browse the currently deployed test suites on this Test Bed.

17.2.3 Document Assertion Set Deployment Sub-service

This sub-service supports the overall deployment of Document Assertion Sets.

Deploy a Document Assertion Set Artifact
Role: deploy on a Test Bed (or on a Document Validator) a Document Assertion Set (DAS) registered in the TAPM.
Variants:
- Provide the assertion set artifact ID and get a DAS description to review before deployment.
- Browse a set of DAS matching a (partial) artifact header, and select those to deploy.

Get a Document Assertion Set Deployment Status
Role: get the deployment status of a DAS deployed on a Test Bed, i.e. the resources and other components used by this DAS and the status of these components.
Variants:
- Provide the DAS ID, or browse the currently deployed DAS on this Test Bed.

Undeploy a Document Assertion Set
Role: undeploy a DAS deployed on a Test Bed.
Variants:
- Provide the DAS ID, or browse the currently deployed DAS on this Test Bed.

17.3 Test Execution Services

Purpose and Scope: The Test Execution Services defined here are intended to control all operations of a Test Bed related to the execution of a Test Suite.

Service provider: These services are supported by the Test Operation Manager (TOM) of a Test Bed.

Example: A user will use this Service remotely, from a Web-based Test Bed console.

17.3.1 Test Suite Execution Control Sub-service

This sub-service allows for controlling the execution of a deployed Test Suite.

Start a Test Suite Execution
Role: start the execution of a deployed test suite on the Test Bed. This returns a test session ID.
Variants:
- Provide the testsuiteid, or browse the currently deployed test suites on this Test Bed.
- Manual mode: execute only the first Test Case in the test suite, then stop.

Stop a Test Suite Execution
Role: stop the execution of a deployed test suite on the Test Bed.
Variants:
- Provide the testsuiteid, or browse the currently running test suites on this Test Bed.
- In case the Test Bed can run several instances of the same test suite at the same time, provide the test session ID of the session to be stopped.

Resume a Test Suite Execution
Role: resume the execution of a deployed test suite on the Test Bed.
Variants:
- Provide the testsuiteid, or browse the currently deployed and stopped test suites on this Test Bed.
- Manual mode: execute only the next Test Case in the test suite, then stop.
- In case the Test Bed can run several instances of the same test suite at the same time, provide the test session ID of the session to be resumed.

Get Status of a Test Suite Execution
Role: get the execution status of a deployed test suite on the Test Bed. The status indicates stopped/running, which step is currently executing, the current resource map, the partial test results (or complete results if the execution is complete), and the partial execution trace (or the complete trace if the execution is complete).
Variants:
- Provide the testsuiteid, or browse the currently deployed test suites on this Test Bed.
- In case the Test Bed can run several instances of the same test suite at the same time, provide the test session ID.

Alter a Test Suite Execution
Role: modify the course of the execution of a deployed test suite on the Test Bed. Some Test Cases may be skipped, or the execution stopped and resumed at a particular step. Some overriding of test artifacts is supported: dynamic replacement of a message Payload (received or sent) and some dynamic modification of test suite parameters are possible.
Variants:
- Provide the testsuiteid, or browse the currently deployed test suites on this Test Bed.
- In case the Test Bed can run several instances of the same test suite at the same time, provide the test session ID.

17.3.2 Document Assertion Set Execution Control Sub-service

This sub-service allows for controlling the execution of a deployed DAS. The provider is a Document Validator (itself part of a Test Bed). The objective is to validate one or more documents (either business documents, or other test items such as a message envelope, etc.).

Validate a Document
Role: submit a document to a Document Validator for verification. The Validator returns a session ID.
Variants:
- Submit a set of documents to be validated.

Get Status of a Document Validation
Role: get the current status of a document validation from a Document Validator. If the validation is complete, the test report is returned.
Variants:
- Provide the session ID, or browse the most recent document validations for their status.
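Seen from a client, the Test Suite execution control operations above could look like the following interface sketch. The method names and the session-ID-based design paraphrase the abstract operations and do not imply any concrete GITB binding.

```python
# Hypothetical client-side view of the Test Suite execution control
# operations, keyed by the test session ID that Start returns.

class TestExecutionClient:
    def __init__(self, tom_endpoint: str):
        self.endpoint = tom_endpoint  # Test Operation Manager of the Test Bed

    def start(self, testsuite_id: str, manual: bool = False) -> str:
        """Start a deployed test suite; returns a test session ID.
        In manual mode only the first Test Case runs, then execution stops."""
        ...

    def stop(self, session_id: str) -> None:
        """Stop the identified session (needed when several instances of
        the same test suite run at the same time)."""
        ...

    def resume(self, session_id: str, manual: bool = False) -> None:
        """Resume a stopped session; manual mode runs one more Test Case."""
        ...

    def status(self, session_id: str) -> dict:
        """Return stopped/running, the current step, the resource map,
        and partial (or complete) results and execution trace."""
        ...
```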

17.3.3 Test Execution Collaboration and Logistics Sub-service

This sub-service allows for managing the overall user collaboration and logistics around the execution of a Test Suite (registrations, notifications, role assignment, issues tracking). Ideally, this is controlled by a workflow system. This sub-service describes a minimal set of services involved. Some of these services may be directly invoked by a Test Suite execution (as test steps). Some may be invoked by a human test operator via a [Web-based] Test Console.

Register Parties Involved
Role: register a number of test participants (e.g. the participants of an interoperability test session) associated with a Test Suite execution. Provide the medium and address information needed to reach/notify these participants. This registration may be part of a Test Suite's initial steps, and a positive outcome may be necessary for the Test Suite execution to proceed.
Variants:
- The notification medium may vary (e-mail, text message, IRC, alarm, Web console).
- Participants may be asked to confirm before the registration is effective.

Notify Parties Involved
Role: notify (e.g. by e-mail) a number of test operators (e.g. some or all participants of an interoperability test session) about some action to be performed.
Variants:
- The notification medium may vary (e-mail, text message, IRC, alarm, Web console).
- The notification may require a response from participants in order to be considered complete.

Report Issue
Role: report an issue related to the execution of a Test Suite or of a Document Validator. This service operation may defer to a specialized issue tracking system. It provides a common operation interface to participants regardless of the tracking system used, which may vary from one Test Bed to another. Issues could be automatically filed by a Test Suite during its execution, filed by a test operator, or filed by participants.
Variants:
- The issue report may be free format (for human consumption only) or have some structure (test session ID, originator, date/time, type, details). The response is an issue ID for later tracking.

Get Issue Report
Role: get the issues related to the execution of a Test Suite or of a Document Validator. This service operation may be used either as a means of cooperation between the parties involved during test execution, or as part of the execution status and final log, for later remediation.
Variants:
- The issue report may be free format (for human consumption only) or have some structure (test session ID, originator, date/time, type, details).

17.3.4 Test Execution Agent Control Sub-service

This sub-service allows additional specialized operations to be invoked on a Test Agent.

Execute Operation
Role: execute an operation that may be specific to a Test Agent. A data structure is provided as input (e.g. a document) that specifies the operation and its arguments.
Variants:
- An execution status may be returned synchronously.
- An operation response may be returned synchronously.
- In case of an asynchronous call, an operation handle is returned (e.g. a session ID).

Get Operation Status
Role: get the status of a previously started operation. A handle on the operation instance is provided (e.g. a session ID). An execution status is returned synchronously.
Variants:
- An operation response may be returned synchronously along with the status.

17.4 Test Repository Services

Purpose and Scope: The Test Repository Services defined here are intended for two purposes:
- As a publishing service: a central bulletin board for sharing test material across Test Beds and participants, covering Test Beds and the test services they provide, test cases/suites, document assertion sets, configuration artifacts, and messaging adapter configurations. It supports discovery and search of test artifacts, and of related user evaluations and comments.
- As a long-term archival site for all test artifacts (test cases/suites, document assertion sets, configuration artifacts), per category and per ebusiness Specification covered.

Test Repository Services

Purpose and Scope: The Test Repository Services defined here are intended for two purposes:
- As a publishing service: a central bulletin board for sharing test material across Test Beds and participants, covering Test Beds and the test services they provide, test cases/suites, document assertion sets, configuration artifacts and messaging adapter configurations. It supports discovery and search of test artifacts, and of related user evaluations and comments.
- As a long-term archival site for all test artifacts (test cases/suites, document assertion sets, configuration artifacts), per category and per ebusiness Specification covered.

Service provider: These services are supported by a server independent from Test Beds: a Test Repository that may be global or local to a user community.

Example: A typical context in which such Test Repository services are used is searching for a test suite for a particular B2B specification, e.g. HL7, in order to deploy it on a Test Bed. The end-user wants to participate in an interoperability test round. S/he was advised by the test operator to download an HL7 conformance test suite prior to the interoperability testing, in order to verify the conformance of her/his B2B endpoint. The end-user is then advised to also access and browse the HL7 interoperability test suite that will drive the interoperability testing, for information only, as this test suite will be running on the remote Test Bed driving the testing.

General Search Functions

This sub-service supports the general search functions for Test artifacts.

Get Test Artifacts Matching a Pattern

Role: Get all test artifacts the header of which matches some filter information.

Variants:
- Select an artifact type (Test Suite, Test Report, message binding, etc.) to scope the search, in addition to the header filter.
- Browse and select an archive to scope the search, in addition to the header filter.
- The header filter may consist of expressions instead of just header values, e.g. origdate later than <a date>.
- A full query expression may be supported, not only on header metadata but also on artifact content.
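The following minimal sketch shows how such a header filter might be applied, assuming a simple name-value header model; the class and field names are illustrative, not defined by GITB.

    import java.util.ArrayList;
    import java.util.Hashtable;
    import java.util.List;

    // Illustrative sketch: a test artifact reduced to its header metadata.
    class TestArtifactHeader {
        String artifactId;
        String artifactType;              // e.g. "TestSuite", "TestReport"
        Hashtable<String, String> fields; // other header metadata, e.g. "origdate"
    }

    class HeaderSearch {
        // Return the artifacts whose type matches the requested scope and
        // whose header contains every filter field with an equal value.
        static List<TestArtifactHeader> match(List<TestArtifactHeader> artifacts,
                                              String artifactType,
                                              Hashtable<String, String> filter) {
            List<TestArtifactHeader> result = new ArrayList<TestArtifactHeader>();
            for (TestArtifactHeader a : artifacts) {
                if (artifactType != null && !artifactType.equals(a.artifactType)) continue;
                boolean matches = true;
                for (String key : filter.keySet()) {
                    if (!filter.get(key).equals(a.fields.get(key))) { matches = false; break; }
                }
                if (matches) result.add(a);
            }
            return result;
        }
    }

Expression-based filters (e.g. origdate later than a given date) would extend this exact-match scheme with comparison operators, and a full query expression would also reach into the artifact content.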

Administration Functions

This sub-service supports the management of the Archives in which Test artifacts are stored.

Create an Archive

Role: Create an archive, i.e. a top-level folder in which to organize and store test artifacts.

Variants:
- Set the access rights for this archive at the same time.
- Define sub-folders in this archive.

Duplicate an Archive

Role: Duplicate an archive, i.e. a top-level folder in which to organize and store test artifacts.

Variants:
- Set the name of the new archive, or generate it automatically.
- Duplicate only selected folders inside the archive.

Delete an Archive

Role: Delete an archive.

Variants:
- Delete only the artifacts inside, not the archive and folder structure.
- Delete only selected folders inside the archive.

Set Access Rights for an Archive

Role: Define a general access filter for the archive.

Variants:
- Use conventional levels per role: read-only, read-write (ability to add/remove material), administrator (ability to create/remove subfolders and change access rights).
- Use the service as a proxy to commonly used access control systems.
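The conventional access levels and administration operations above could be modeled as follows; this is a non-normative sketch and all names, IDs and return types are assumptions.

    // Illustrative sketch: conventional per-role access levels for an archive.
    enum AccessLevel {
        READ_ONLY,    // browse and download artifacts
        READ_WRITE,   // additionally add/remove material
        ADMINISTRATOR // additionally create/remove subfolders, change access rights
    }

    // One method per administration operation defined above.
    interface ArchiveAdministration {
        String createArchive(String name);                         // returns an archive ID
        String duplicateArchive(String archiveId, String newName); // returns the new ID
        void deleteArchive(String archiveId);
        void setAccessRights(String archiveId, String role, AccessLevel level);
    }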

Archival Functions

This sub-service supports the management of Test artifacts inside Archives.

Store a Test Artifact

Role: Upload a test artifact into an archive.

Variants:
- Provide the archive and folder structure where the artifact is to be stored.
- In case an artifact with the same ID is already there, either override it or notify the user.
- Automatically generate some header data (artifact ID).

Download a Test Artifact

Role: Download a test artifact from an archive.

Variants:
- Provide the ID of the artifact to download.
- Provide a set of IDs of the artifacts to download.
- Provide a selection to download.

Select a Test Artifact or a Set of Artifacts

Role: Select a set of test artifacts from an archive or folder, based on a filter. The selection is scoped by a particular folder or archive.

Variants:
- Provide the header filter (same as in the general Search operation).

Transfer a Test Artifact or a Set of Artifacts

Role: Transfer a selected set of test artifacts from an archive or folder to somewhere inside the archive or across archives.

Variants:
- Provide one or more destination paths.

Protocol and Message Bindings to Services

For Web services solutions: WSDL-SOAP recommendation and best practice (for maximum portability across WS stacks, WS-I compliance). For XML over HTTP solutions: recommendation and best practice with respect to REST style and use of HTTP.
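As a hedged illustration of the XML-over-HTTP binding in REST style, a Test Repository could expose artifact download as an HTTP GET on an artifact resource. The URL scheme and media type below are assumptions, not prescriptions of this CWA; the sketch uses the standard Java 11 HttpClient.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RepositoryRestClient {
        public static void main(String[] args) throws Exception {
            // Hypothetical resource URL: one test artifact, addressed by its ID.
            String url = "http://repository.example.org/archives/hl7/artifacts/ts-001";

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Accept", "application/xml") // artifacts served as XML
                    .GET()
                    .build();

            // The response body carries the test artifact, e.g. a Test Suite.
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("HTTP " + response.statusCode());
            System.out.println(response.body());
        }
    }

A WSDL-SOAP binding would expose the same operations as SOAP calls instead; WS-I compliance then maximizes portability across WS stacks.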

18 Appendix II: Reusing and Coordinating Test Beds - How to become a GITB-compliant Test Agent?

It is critical to reuse the Testing Capabilities of existing testing applications and/or Test Beds in order to facilitate the testing process. Furthermore, in some cases, the existing testing applications and Test Beds themselves want to share (or sell commercially) their Testing Capabilities. There is, however, no standard mechanism for a testing application or Test Bed to expose its Testing Capability: each application or Test Bed provides its own mechanism for others to reach it. Therefore, GITB provides standard mechanisms through which these applications and Test Beds can coordinate with other entities (Test Suite Drivers). In GITB, this sharing/exposing mechanism is also provided through the Test Services defined in Appendix I.

In this section the services supporting test bed coordination are described in more technical detail. It should be noted that not all of the Test Services are described in this section: only the Test Services that enable test bed coordination are detailed. To avoid confusion with the other Test Services, these services are called Coordination Test Services; in other words, the Coordination Test Services are the Test Services used for test bed coordination.

In GITB, the entities providing/exposing a Testing Capability in the execution of a test case are called Test Agents. This section can therefore also be regarded as guidelines for testing applications or Test Beds to become a Test Agent in the GITB framework. It should be noted that the Test Agents should support these Test Services on the server side; in other words, they should bind their existing code to the operations defined. Needless to say, the entities (i.e. Test Suite Drivers) that want to benefit from these Test Agents should support these Test Services as clients.

In this section, first the Coordination Test Services are described in detail. After that, the Test Report Format and the Coordination Test Service Metadata are presented. This information is first described at the abstract level; a suitable implementation syntax is then recommended.

Coordination Test Services

The Coordination Test Services describe how a Testing Capability of a Test Agent can be used by Test Suite Drivers. The Coordination Test Services are the following Test Services:
- Validate a Document (Section )
- Start a Test Suite Execution (Section )
- Configure a Test Case (Section )
- Configure a Test Suite (Section )
- Get Status of a Test Suite Execution (Section )

Depending on the type of its testing ability, the number of Coordination Test Services that a Test Agent should support differs. In GITB, three types of Test Agents are envisaged:

1- Validating-only Test Agent: This Test Agent only performs document validation. It supports only Validate a Document.

2- Interacting-only Test Agent: This Test Agent interacts with an SUT in the test executions and exchanges real-life messages. In other words, this type of Test Agent simulates parties in business processes. For document validation it uses Validating-only Test Agents. It supports:
a. Start a Test Suite Execution
b. Configure a Test Case
c. Configure a Test Suite
d. Get Status of a Test Suite Execution

3- Interacting-Validating Test Agent: This Test Agent is a hybrid of the Validating-only and Interacting-only Test Agents. It supports:
a. Validate a Document
b. Start a Test Suite Execution
c. Configure a Test Case
d. Configure a Test Suite
e. Get Status of a Test Suite Execution

At this point it is worth describing the layers in a data exchange. In a data exchange between two parties, first a business document is put into the payload of a message. The message can then be further put into an application layer message, which in turn is put into a transport layer message. Afterwards the exchange occurs. Table 18-1 provides an example from the HL7 v3 domain.

Table 18-1: HL7 v3 Example

Exchange item               HL7 v3 Example
Document                    HL7 CDA
Message                     HL7 v3 Transmission Wrapper and Control Act Wrapper
Application Layer Message   SOAP
Transport Layer Message     HTTP

It should be noted that this hierarchy differs from standard to standard. Some of the layers can overlap. For example, in HL7 v2.x, the document and message levels are not separated. Figure 18-1 displays how the tests are realized through this interface in both the Validating-only and Interacting-only Test Agent cases.

[Figure 18-1: Interactions between Test Suite Drivers, Test Agents and SUT. The figure shows the Validating-Only Test Services (validateDocument()) and the Interacting-Only Test Services (configureTestCaseOrTestSuite(), startTestSuiteExecution(), getStatusOfTestSuiteExecution()) exchanged between the Test Suite Driver, the Test Agents and the System Under Test.]

The Test Services are modeled as operations. Basically, the operations that should be supported by the Test Agents are as follows:

Operations for the Validating-only Test Agents:

o Validate Document: This operation accepts the Test Items (e.g. the document or message) to be validated and some configuration parameters (e.g. the profile against which the item should be validated) as input, and returns the test validation report. The Test Suite Driver may provide authentication information (username, password) if it is required by the Test Agent.

Operations for the Interacting-only Test Agents:

Interacting-only Test Agents simulate an actor in a real-life setting and test a process in which all the layers of the interoperability stack are considered. It should be noted that, as a simulation of a real-life case, a single message exchange is also considered a process. For the Interacting-only Test Agents there is more than one operation, because in real-life message exchanges the response that the sender obtains from the exchange may not contain the full validation report. Furthermore, the message exchange may not be synchronous, in which case the sender obtains the application level response over a separate connection. It should be noted that, unlike the Validating-only Test Agent case, where the actual test is realized by the operation itself, in the Interacting-only Test Agent case the actual test is realized between the SUT and the Test Agent. However, the SUT is not necessarily aware of this. In this case, the Test Suite Driver obtains the validation report of the test execution and displays it to the SUT. The operations are as follows:

o Configure Test Case or Test Suite: This operation establishes a test session between a Test Suite Driver and a Test Agent. As the output, the Test Suite Driver obtains a session identifier from the Test Agent. This identifier is used in the other operations during the test execution. The Test Suite Driver may provide authentication information (username, password) if it is required by the Test Agent. Additionally, through this operation, the Test Suite Driver sends its configuration parameters (such as endpoints) for the message exchanges to the Test Agent and, as a response, obtains the configuration parameters of the Test Agent.

o Start Test Suite Execution: This operation accepts the session identifier from the Test Suite Driver and executes its test with the SUT. As a response, it returns whether the initiation of the test succeeds or fails. This operation can be called again with the same session identifier, which means restarting the session. The precondition for restarting the session is that it has previously been stopped.

o Get Status Of Test Suite Execution: This operation provides the status information of the test execution between the SUT and the Test Agent. Possible values are: not configured, not started, in progress, completed, failed, aborted. If the status is one of completed, failed or aborted, this operation returns the validation report in the Test Report Format.

Interacting-Validating Test Agents support all of the above mentioned operations. In the following subsections, more technical detail is presented on the operations.
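Taken together, the operations above can be sketched as a single abstract interface. This is a non-normative sketch: a Validating-only Test Agent would implement only validateDocument, an Interacting-only Test Agent the remaining three operations, and an Interacting-Validating Test Agent all four. The ExecutionStatus enumeration and the simplified placeholder classes are assumptions; the Configuration structure is defined in detail below.

    import java.util.Hashtable;

    // Non-normative sketch of the Coordination Test Services as one interface.
    interface TestAgent {

        // Validating-only and Interacting-Validating Test Agents.
        TestReport validateDocument(String testItem,
                                    Hashtable<String, String> configurations);

        // Interacting-only and Interacting-Validating Test Agents.
        Configuration configure(Configuration testServiceConsumerConfigurations);

        TestReport startTestSuiteExecution(String sessionIdentifier);

        // Returns the execution status; when the execution has ended, the
        // validation report is available in the Test Report Format.
        ExecutionStatus getStatusOfTestSuiteExecution(String sessionIdentifier);
    }

    // Possible status values of a test execution, as enumerated above.
    enum ExecutionStatus {
        NOT_CONFIGURED, NOT_STARTED, IN_PROGRESS, COMPLETED, FAILED, ABORTED
    }

    // Simplified placeholders for the abstract structures used in this appendix.
    class TestReport { String content; }
    class Configuration { String sessionIdentifier; }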

Validate Document Operation

The validate document operation is the only operation for the Validating-only Test Agents. It validates a Test Item and returns a validation report in the Test Report Format. As the abstract syntax, the Java Programming Language is chosen. The signature of the operation is as follows:

TestReport validateDocument(String testItem, Hashtable<String, String> configurations);

The TestReport item corresponds to the validation report of the executed test. The testItem parameter contains the whole test item to be validated; it can be a Document, Message, Application Layer Message or Transport Layer Message. The configurations parameter contains the configuration parameters as name-value pairs. These configuration parameters are specified by the Test Agent in the configuration and description part of the Test Service Metadata.
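A minimal invocation sketch using the TestAgent interface sketched above, assuming a hypothetical HL7 Validating-only Test Agent and an illustrative "profile" parameter name (in a real deployment, the parameter names would be published in the Test Service Metadata):

    import java.util.Hashtable;

    public class ValidateDocumentExample {
        public static void main(String[] args) {
            TestAgent agent = obtainTestAgent(); // e.g. a generated WS client stub

            // The test item: here, a (truncated) HL7 CDA document as a string.
            String testItem = "<ClinicalDocument xmlns=\"urn:hl7-org:v3\">...</ClinicalDocument>";

            // Configuration parameters as name-value pairs; the key "profile"
            // is an illustrative assumption, not a normative parameter name.
            Hashtable<String, String> configurations = new Hashtable<String, String>();
            configurations.put("profile", "HL7-CDA-R2");

            TestReport report = agent.validateDocument(testItem, configurations);
            System.out.println(report.content); // the validation report
        }

        // Placeholder: binding to a concrete Test Agent is deployment-specific.
        static TestAgent obtainTestAgent() {
            throw new UnsupportedOperationException("bind to a concrete Test Agent");
        }
    }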

Configure Test Case or Test Suite Operation

This operation is the first operation to be called in communications with Interacting-only Test Agents, in order to obtain the session identifier and configure the test session. All of the subsequent operations require this value; calling these operations without providing the session identifier should result in an exception condition. The signature of this operation is as follows:

Configuration configure(Configuration testServiceConsumerConfigurations);

Configuration {
    String sessionIdentifier;
    Hashtable<String, String> BCID2EndpointAssociations;
    Hashtable<String, Hashtable<String, String>> BCConfigurations;
}

Through this operation the Test Suite Driver sends its configuration parameters to the Test Agent and obtains the configuration parameters of the Test Agent. As mentioned previously, the test that a Test Agent provides may contain more than one message exchange (i.e. a process). The process may be described through ebXML Business Process (ebBP) in the process description part of the Test Service Metadata. A business process definition in ebBP contains business collaborations, which point to Business Collaborations (BC) specifying the technical details of a message exchange. In the interface, this BC concept is used to specify individual message exchanges.

The Configuration item contains two hash tables. The first one (BCID2EndpointAssociations) contains the endpoints (in IP:port format) corresponding to given BCs, and the second hash table (BCConfigurations) holds the configuration parameters for each BC. As in the Validate Document Operation, the configurations for each BC are in the form of name-value pairs. These configuration parameters are specified by the Test Agent in the configuration parameters and description part of the Test Service Metadata.

As an example, assume that a Test Agent provides a test consisting of three message exchanges. The first two of these message exchanges are initiated by the Test Suite Driver and the last one is initiated by the Test Agent. In other words, there are three BCs: BC1, BC2 and BC3. As the Test Suite Driver should initiate BC1 and BC2, it needs to know the endpoints (and the necessary configuration parameters) to which it needs to send its messages. Likewise, for BC3, the Test Agent needs to know the endpoints and configuration parameters. Therefore, when the configure operation is called, the Test Suite Driver provides the Configuration item for BC3 and obtains the Configuration items for BC1 and BC2, as shown in Figure 18-2.

[Figure 18-2: Configure Operation Example - the Test Suite Driver calls configure(Configuration testServiceConsumerConfigurations) on the Test Agent.]

Start Test Suite Execution Operation

The Start Test Suite Execution Operation executes the test session, given the session identifier. As a response, the Test Agent returns an acknowledgement of whether the initiation of the session succeeds or fails. It should be noted that this acknowledgement is not the status of the test execution; it reports technical problems that may occur on the Test Agent side (e.g. a connection failure to a database). The signature of the operation is as follows:

TestReport startTestSuiteExecution(String sessionIdentifier);

Although the response does not carry a test validation result, the TestReport item can be used for this purpose too.
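Putting the operations together, a session between a Test Suite Driver and an Interacting-only Test Agent could proceed as in the following non-normative sketch; the polling interval and the status handling are illustrative assumptions, and the simplified Configuration placeholder from the earlier sketch is reused.

    public class DriverFlowExample {
        public static void main(String[] args) throws InterruptedException {
            TestAgent agent = obtainTestAgent(); // e.g. a generated WS client stub

            // 1. Configure: exchange endpoints/parameters, obtain the session Id.
            Configuration driverSide = new Configuration(); // e.g. endpoint for BC3
            Configuration agreed = agent.configure(driverSide);
            String sessionId = agreed.sessionIdentifier;

            // 2. Start: the returned TestReport is only an acknowledgement of
            //    the initiation, not the status of the test execution itself.
            TestReport ack = agent.startTestSuiteExecution(sessionId);
            System.out.println("Initiation acknowledged: " + ack.content);

            // 3. Poll the execution status until the session has ended; the
            //    validation report is then available in the Test Report Format.
            ExecutionStatus status;
            do {
                Thread.sleep(5000); // illustrative polling interval
                status = agent.getStatusOfTestSuiteExecution(sessionId);
            } while (status == ExecutionStatus.NOT_STARTED
                  || status == ExecutionStatus.IN_PROGRESS);

            System.out.println("Execution ended with status: " + status);
        }

        // Placeholder: binding to a concrete Test Agent is deployment-specific.
        static TestAgent obtainTestAgent() {
            throw new UnsupportedOperationException("bind to a concrete Test Agent");
        }
    }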
