Common Graphics Engine-1 Interface Software Development Plan

Common Graphics Engine-1 Interface

August 16, 1998

Avionics Graphics Corporation
Cedar Rapids, Iowa

Copyright © 1998 Roger Shultz

Notices and Signatures

Notices

Notice: This document represents plans for a fictitious product and, in that vein, has never been sold or used for commercial gain. It represents an industrial-quality plan, but similarity to any actual products is coincidental.

Approval Signatures

NAME | SIGNATURE/DATE
Prepared by: Roger Shultz |
Approved by: Department Manager |
Checked by (SCL): SCL Quality Assurance |

Document Production Software

The following software was used to produce this document.

Word Processor: Microsoft Word for Windows, Version 6.0A
Graphics Software: Microsoft Word for Windows, Version 6.0A

Revision History

Ver/Rev | Doc Chg # | Release Date | Originator | Reason(s) for Change
1.0 | NA | 10/11/95 | R. K. Shultz | Original version.
2.0 | NA | 08/15/98 | R. K. Shultz | Sanitized for class use.

Table of Contents

1. Scope
   1.1. Purpose
   1.2. Applicability
2. Reference Documents
3. Software Development
   3.1. Software Standards
   3.2. Life Cycle
      3.2.1. Initiate Work Packages
      3.2.2. Software Product Development Cycle
         3.2.2.1. Develop Software Requirements (Transition Criteria; Tasks)
         3.2.2.2. Develop Software Design (Transition Criteria; Tasks)
         3.2.2.3. Develop Software Requirements Test Cases (Transition Criteria; Tasks)
         3.2.2.4. Develop Code (Transition Criteria; Tasks)
         3.2.2.5. Develop Software Design Test Cases (Transition Criteria; Tasks)
         3.2.2.6. Develop Test Procedures (Transition Criteria; Tasks)
         3.2.2.7. Integrate and Test Software (Transition Criteria; Tasks)
         3.2.2.8. Structural Coverage Analysis and Testing (Transition Criteria; Tasks)
   3.3. Development Environment
4. Software Verification
   4.1. Organization
   4.2. Independence
   4.3. Verification Methods
      4.3.1. Review Methods
      4.3.2. Analysis Methods
      4.3.3. Test Procedure Development
         4.3.3.1. Test Procedure Verification
         4.3.3.2. Test Analysis
         4.3.3.3. Retest
   4.4. Verification Environment
   4.5. Transition Criteria
5. Software Quality Assurance
   5.1. Environment
   5.2. Authority
   5.3. Activities
      5.3.1. Life Cycle Audits
      5.3.2. Problem Reporting and Tracking
      5.3.3. Software Conformity Review
   5.4. Transition Criteria
   5.5. Timing
   5.6. SQA Records

List of Figures

Figure 3-1. Development Cycle Process
Figure 3-2. Software Product Development Process

List of Tables

Table 3-1. CGE1 Artifact Standards
Table 3-2. Software Environment
Table 4-1. Test Procedure Development Environment
Table 4-2. Integrate/Test Software Environment

1. Scope

This is the software development plan, verification plan, and quality assurance plan for creating a product-ready version of the Common Graphics Engine Interface specialized for the Graphics Engine 1.

1.1. Purpose

The purpose of this plan is to provide Avionics Graphics Corporation display applications with a common software interface to the Graphics Engine 1 (GE1) hardware [1]. The application interface is called the Common GE1 Interface (CGE1) and is a specialization of the display systems object-oriented development project, the Common Graphics Engine Interface, done in Advanced Technology and Engineering [2]. The CGE1 will meet product area requirements for memory space, speed, functionality, and regulatory verification [3]. In addition, the CGE1 will provide an easy-to-use graphics interface that is similar to industry standards and that hides future changes in graphics engine hardware.

1.2. Applicability

The software addressed in this plan includes all layers, classes, and objects comprising the Common Graphics Engine Interface as targeted for the GE1. This GE1 version of the CGEI is a CPCI called CGE1, to be integrated by product developers into future Avionics Graphics Corporation display products. The following deliverables will be produced for use by Avionics Graphics Corporation product developers.

- CGE1 User's Guide - A CGE1 User's Guide will describe the functional interface and its performance.

- Software Design Document - This document details the interface supplied to the application programmer. It provides the formal contract of functionality and performance. The states of the interface, algorithms, and data structures are given in an object-oriented design language. The design of the CGE1 is a specialization of the Software Design Document (SDD).

- GE1 Hardware Interface Specification - This document provides a detailed view of the Graphics Engine-1 programming interface. It is used for correct specialization of the CGE1 interface.

- Ada Source Code - This code is an implementation of the CGE1 interface, targeted to run on an Intel 486 board that communicates with the GE1. Specifications for the GE1 are taken from the GE1 Hardware Interface Specification.

- CGE1 Test Cases - These are timing test specifications, whose results are listed in the CGE1 User's Guide, and design test specifications for verification of the design specified in the SDD. Steps for executing the tests are summarized. Each test case will specify expected testing results and the implementation of test drivers.

- Software Verification Test Procedures and Results Summary - These procedures are Ada code executable within a test environment architecture designed for regression testing [4]. Many tests will execute in a host environment, while all tests will execute on the GE1 target. This document also summarizes the traceability of each test case implementation to the SDD, Modified Condition/Decision Coverage (MC/DC) of the CGE1 source code, and peer reviews.

- Software Accomplishment Summary - This summarizes the project and details compliance for CGE1 interface use in FAA-regulated applications.
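To make the shape of this interface deliverable concrete, the sketch below shows the kind of Ada 83 package specification the CGE1 application interface could export. It is a hypothetical illustration only: the type, subprogram, and exception names are assumptions for this sketch and are not taken from the SDD or the CGE1 User's Guide.

   -- Illustrative sketch only; all names are hypothetical, not from the SDD.
   package CGE1_Interface is

      type Display_List_Id is private;

      subtype Screen_Coordinate is Integer range 0 .. 1023;

      -- Open a new display list for the application to draw into.
      procedure Begin_List (List : out Display_List_Id);

      -- Queue a line segment; GE1 hardware details stay hidden below this
      -- interface, so future engine changes do not reach the caller.
      procedure Draw_Line (List           : in Display_List_Id;
                           X1, Y1, X2, Y2 : in Screen_Coordinate);

      -- Close the list and hand it to the graphics engine for display.
      procedure End_List (List : in Display_List_Id);

      -- Raised when the GE1 rejects a command or a list overflows.
      Engine_Error : exception;

   private
      type Display_List_Id is new Integer;
   end CGE1_Interface;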

2. Reference Documents

[1] Graphics Engine 1 Microarchitecture Manual. Special Process Technology Section, Rockwell Corporation, August 1995.
[2] Common Graphics Engine Project Plan. Software Process Technology Section, Collins Commercial Avionics, Rockwell Corporation, April 1995.
[3] Software Considerations in Airborne Systems and Equipment Certification. Document No. RTCA/DO-178B, RTCA, Washington, D.C., December 1992.
[4] Test Environment Software Architecture. Software Process Technologies, Rockwell International, June 1995.
[5] Software Development Process Manual. CPN 523-0777-331, Collins Commercial Avionics, Rockwell Corporation, August 15, 1995.
[6] Daugherty, Gary. Object-Oriented Development Handbook. Rockwell International, March 15, 1993.
[7] Embley, David and Scott Woodfield. "Assessing the Quality of Abstract Data Types Written in Ada," in Proceedings of the Tenth International Conference on Software Engineering, IEEE Computer Society Press, April 1988, pp. 144-153.
[8] Daugherty, Gary. OCL: A Ninety Minute Introduction. Rockwell International, internal paper, September 4, 1992, available from the author.
[9] Daugherty, Gary. OCL: A Text Notation for Class Hierarchies. Rockwell International, internal paper, July 9, 1992, available from the author.
[10] Meyer, Bertrand. Eiffel: The Language. Prentice-Hall, Englewood Cliffs, NJ, 1992.
[11] The RAISE Specification Language. Prentice-Hall, Englewood Cliffs, NJ, 1992.
[12] Harel, David. "Statecharts: A Visual Formalism for Complex Systems," Science of Computer Programming, vol. 8, pp. 231-274, 1987.
[13] Harel, David. "On the Formal Semantics of Statecharts," Proceedings of the 2nd IEEE Symposium on Logic in Computer Science, 1987, pp. 54-64.
[14] The Semantics of Statecharts. i-Logix Inc., 22 Third Avenue, Burlington, Mass. 01803, January 1991.
[15] Daugherty, Gary. An Ada Coding Standard for OO Programming & Reuse. Rockwell International, March 9, 1992.
[16] Ada Coding Standard (SES-101). Rockwell International, CPN 826-8765-001, November 22, 1991.
[17] Daugherty, Gary. The Implementation of CGE Classes in Ada. Rockwell International, internal paper, available from the author.
[18] Daugherty, Gary. A Proposed Common Graphics Engine Interface. Rockwell International, draft of a proposed internal standard, available from the author.
[19] Software Configuration Management Plan for the CGEI. Software Process Technology, Advanced Technology and Engineering, 1995.
[20] Method for Software Peer Reviews. Collins Commercial Avionics, CPN 829-5744-010, March 1993.
[21] Automated Test Coverage Analyzer User's Manual and Operational Requirements. Software Process Technologies, AT&E, Rev 3.1, March 12, 1993.
[22] Myers, Glenford J. The Art of Software Testing. John Wiley and Sons, New York, 1979.
[23] Software Quality Assurance Plan. Collins General Aviation Division, CPN 827-3478-001, February 1994.

3. Software Development

The overall process for development is defined by the CCA Software Development Process Manual [5]. Our object-oriented development methods and examples of their use can be found in the Object-Oriented Development Handbook [6]. This handbook serves as a further tailoring of the CCA Software Development Process.

3.1. Software Standards

Standards for the development process artifacts are listed in Table 3-1.

Table 3-1. CGE1 Artifact Standards

Artifact | Standards
CGE1 User's Guide | TBD
Software Design Document (SDD) | [7]
  Context diagrams | TBD
  Entity relationship diagrams | TBD
  Interface specifications | [8][9][10][11]
  State diagrams | [12][13][14]
  Design scenarios | TBD
Test Cases | TBD
Ada 83 source code | [15][16][17]

3.2. Life Cycle

The software development process is a tailored version of the CCA Software Development Process. It is described using a technique called Structured Process Flows. The process flows are hierarchical, and each process box is numbered. The numbers can be used to find the next level of detail for a process, if it is expanded, and to find the text that describes the process step. Lines with arrowheads flowing into a process box indicate data that is used by the process step. Lines with arrowheads flowing out of the process step indicate data that is modified or produced by the process step. Figure 3-1, Development Cycle Process, illustrates the software development process at the highest level.

Figure 3-1. Development Cycle Process
(Flow diagram: Initiate Work Packages (3.2.1) turns new features/requirements, change requests, schedules, and the SW build plan into work packages; the Software Product Development Cycle (3.2.2) consumes work packages and repeats until no work packages remain.)

3.2.1. Initiate Work Packages

The initiation of a work package causes the development cycle to begin. Work packages are created at the beginning of the project and when a major change or enhancement is planned for the software. Work packages will be assembled at the CPCI level, containing the applicable data based on functionality. The lead or assigned member of the team will maintain the work package until the applicable product is released, at which time it will be filed into a project file and maintained by the project librarian. These work packages consist of paper copies. Work packages include the following information:

- the name of the specific individual or team responsible for the work package
- the definition of the product to be developed
- the development schedule
- an estimate of the expected development effort
- applicable change requests
- peer review forms and action item records (as they become available)
- a listing of the artifacts to be produced by the development effort

Change requests, software build plans, and schedules are used to create work packages. Change requests logged into the file go through a change control board (CCB) for assignment into a software build plan. A complete description of the change management process is provided in the Software Configuration Management Plan [19].

3.2.2. Software Product Development Cycle

Figure 3-2 shows the software development and verification process flow.

3.2.2.1. Develop Software Requirements

3.2.2.1.1. Transition Criteria

System requirements documentation and hardware requirements and design documentation, or portions thereof, have been reviewed and approved.

3.2.2.1.2. Tasks

Develop Software Requirements - The CGE1 software requirements are based on a paper developed from previous display systems reuse studies [18]. A User's Guide for the CGE1 will be developed that summarizes the functional and performance requirements. In addition, derived requirements will be developed based on a general interface design and the GE1 hardware architecture. These derived requirements provide the application interface details, are found in the Software Design Document (see Section 3.2.2.2), and are the primary definition of a correct CGE1 interface. The CGE1 User's Guide will be under engineering configuration management [19].

Review the Software Requirements - The CGE1 User's Guide will be reviewed per the department Peer Review Method [20] using the checklists distributed with the review invitations. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder maintained by the assigned lead during development and by the project librarian following release.

3.2.2.2. Develop Software Design

3.2.2.2.1. Transition Criteria

Development of the design may begin once the software requirements specification, or portions thereof, have been reviewed and approved and contain sufficient content and detail to allow the design activities to begin.

3.2.2.2.2. Tasks

Develop Software Design - The CGE1 will use Object-Oriented Design (OOD) techniques for developing the software. The Software Design Document (SDD) will be maintained under engineering configuration control [19]. The SDD is the primary definition of what constitutes a correct CGE1 interface. Thus, verification is focused on the testing of requirements specified in this SDD.

Review the Software Design Documentation - The SDD will be reviewed per the CCA Peer Review Method [20] using the checklists distributed with the review invitations. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder.

Figure 3-2. Software Product Development Process
(Flow diagram: Develop SW Requirements (3.2.2.1) produces the CGE1 User's Guide from advanced software methods and display system requirements. Develop SW Design (3.2.2.2) produces the software design. Develop SW Requirements Test Cases (3.2.2.3) produces timing test cases and User's Guide coverage analysis results. Develop Code (3.2.2.4) produces source and object code. Develop SW Design Test Cases (3.2.2.5) produces SW design test cases and SW design coverage analysis results. Develop Test Procedures (3.2.2.6) produces test procedure driver code. Integrate and Test Software (3.2.2.7) combines code, the software design, the CGE1 User's Guide, test procedures, and GE1 hardware into a software build and test results. Structural Coverage Analysis and Testing (3.2.2.8) produces structural test cases and procedures, structural-based test results, and structural coverage analysis results.)

3.2.2.3. Develop Software Requirements Test Cases

3.2.2.3.1. Transition Criteria

The software requirements specification, or portions thereof, have been reviewed and approved.

3.2.2.3.2. Tasks

Develop Software Requirements Test Cases - Software requirements test cases are developed and reviewed in parallel with the development of the software design. Test cases are developed at this point in the process to ensure that the software requirements are testable and that tests are available when needed during the integration and test process step. Test cases define inputs, expected results, and the anticipated testing environment; they are not executable. The requirements test cases will be developed as described in Section 4.3.3, Testing Methods, subsection Requirements Based Functional Tests. Requirements test cases will be documented in the CGE1 Test Cases document. The Software Verification Test Procedures and Results (SVPR) summary will include a traceability matrix mapping test cases to the requirements in the CGE1 User's Guide. The SVPR will be developed in Interleaf and maintained under engineering configuration control [19].

Review Software Requirements Test Cases - The CGE1 Test Cases will be reviewed per the CCA Peer Review Method [20] using the checklists distributed with the review invitations. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder maintained by the assigned lead during development and by the project librarian following release.

3.2.2.4. Develop Code

3.2.2.4.1. Transition Criteria

This activity may begin once the software design, or portions thereof, has been reviewed and approved.

3.2.2.4.2. Tasks

Develop Code - Code generation for the CGE1 will be done with the Alsys Ada programming environment, using the Ada 83 language. The Alsys environment will also be used for automating and controlling the build procedures. All newly developed and modified code shall comply with the format and style guides documented in the applicable language coding standards. All developmental source code will be maintained under engineering configuration control [19]. Each source code module shall be placed under configuration control prior to being reviewed in a peer review.

Review Code - All newly developed and modified source code will be reviewed per the review method specified in Section 4.3.1, Review Methods. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder. If a code review checklist is not available within the applicable coding standard, the sample checklists distributed with the review invitations will be used.

3.2.2.5. Develop Software Design Test Cases

3.2.2.5.1. Transition Criteria

Development of the software design test cases may begin once the software design documentation for the corresponding portion has been reviewed and approved.

3.2.2.5.2. Tasks

Develop Software Design Test Cases - Software design (low-level requirements) test cases are developed and reviewed in parallel with the development of the code. Test cases are developed at this point in the process to ensure that the software design is testable and that tests are available when needed during the integration and test process step. Test cases define inputs, expected results, and the anticipated testing environment; they are not executable. The design test cases will be developed as described in Section 4.3.3. Design test cases will be documented in the CGE1 Test Cases document. The Software Verification Test Procedures and Results (SVPR) summary will include a table mapping test cases to the derived requirements in the SDD. The SVPR will be developed in Interleaf and maintained under engineering configuration control [19].

Review Software Design Test Cases - The SVPR and Test Cases will be reviewed per the department Peer Review Method [20] using the checklists distributed with the review invitations. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder maintained by the assigned lead during development and by the project librarian following release.

3.2.2.6. Develop Test Procedures

3.2.2.6.1. Transition Criteria

Development of the test procedures may begin once the software requirements test cases and the design requirements test cases, or portions thereof, have been reviewed and approved.

3.2.2.6.2. Tasks

Develop Software Test Procedures - Test procedures are developed which implement the tests defined by the test cases. Test procedures include detailed instructions for the step-by-step execution of the test cases, instructions for evaluating the test results, and Ada code for driving the CGE1 interface. The test procedures will be developed as described in Section 4.3.3, Testing Methods, subsection Requirements Based Functional Tests. Test procedures for each CPCI will be documented in the CGE1 Test Cases document. A traceability matrix will be included in the SVPR relating each test case and test procedure to its associated requirement in the CGE1 User's Guide and in the SDD.

Review Software Test Procedures - The CGE1 Test Cases document, Ada driver code, and SVPR will be reviewed per the CCA Peer Review Method [20]. An analysis will be performed on the software test procedures as described in Section 4.3.2, Analysis Methods, subsection Coverage Analysis. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder.

3.2.2.7. Integrate and Test Software

3.2.2.7.1. Transition Criteria

Software/software and software/hardware integration may begin once the source code, or portions thereof, have been reviewed and approved. Performing requirements-based tests may begin once the source code and test procedures, or portions thereof, have been reviewed and approved.

3.2.2.7.2. Tasks

Integrate Software with Software - As software components become available from the Develop Code activity, they will be linked together as specified in the build plan to create executable object code.

Integrate Software with Hardware - Executable object code from the integrate-software-with-software tasks will be loaded into the target computer as specified in the build plan.

Execute Requirements-Based Test Procedures - The requirements test procedures will be executed per the build plan to verify that the product functions properly, per Section 4.3.3, Testing Methods, subsection Requirements Based Functional Tests. Requirements-based testing will be performed on the target computer twice, once with the code instrumented and once without. A coverage analysis tool will be used to record the Modified Condition/Decision Coverage of the requirements-based tests when they are run the first time. The second run of the tests will be without instrumentation. The results from both runs will be reviewed to ensure that the same results are achieved and that the results are consistent with the expected results. Test results and structural coverage results will be documented in the Software Verification Test Procedures and Results (SVPR). The SVPR will be maintained under engineering configuration control [19].

3.2.2.8. Structural Coverage Analysis and Testing

3.2.2.8.1. Transition Criteria

Structural coverage analysis and testing can begin once the requirements test procedures have been successfully run on a software build, or a portion thereof.

3.2.2.8.2. Tasks

Analyze Structural Coverage of Requirements-Based Tests - The Automated Test Coverage Analyzer (ATCA) produces structural coverage reports [21]. These structural coverage reports will be analyzed to determine whether there are missing requirements, missing requirements tests, dead code, or deactivated code. Missing structural coverage found during the analysis will have change requests written and will be considered by the change control board process for future work packages. Missing coverage may lead to the discovery of:

- missing requirements
- additional test cases
- structural coverage test cases and procedures

Develop and Review Structural Coverage Test Cases and Procedures - Additional structural coverage test cases and procedures will be written to achieve MC/DC coverage of the source code missed by the requirements-based testing. A traceability matrix will be included in the SVPR relating each test case and test procedure to its associated software component. The SVPR will be reviewed per the CCA Peer Review Method [20]. The peer review form, along with the action item records and checklists, will be filed in a Software Development Folder (SDF).

Build and Test Software - Software builds are created as needed to run the structural test procedures. These software builds are produced solely for the purpose of executing the structural-based tests. They may include special drivers or stubs as needed to perform a given test. They may also require special external hardware configurations in order for the test to be executed. Hardware and software configurations used to perform the tests will be documented in the SVPR along with the test results.

Analyze Structural Test Results - The structural test results are analyzed to ensure that the expected results match the actual results.
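As a worked illustration of the coverage criterion named above (not taken from the plan's own test cases): MC/DC requires that every condition in a decision be shown to independently affect the decision's outcome, which for N conditions can typically be achieved with N + 1 test vectors. The Ada fragment below, with hypothetical condition names, shows a minimal MC/DC vector set for a three-condition decision.

   -- Hypothetical example: minimal MC/DC test vectors for one decision.
   procedure MCDC_Example (A, B, C : in Boolean) is
   begin
      -- Decision: (A and then B) or else C.  Four vectors (N + 1) suffice:
      --   A=T B=T C=F -> True   \ differ only in A and the outcome flips,
      --   A=F B=T C=F -> False  /  showing A's independent effect
      --   A=T B=F C=F -> False  .. pairs with (T,T,F) to show B's effect
      --   A=T B=F C=T -> True   .. pairs with (T,F,F) to show C's effect
      if (A and then B) or else C then
         null;  -- action taken when the decision is true
      end if;
   end MCDC_Example;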

3.3. Development Environment

This section describes the development environment to be used to develop the graphics engine layer software. Tools are grouped by major life cycle activity and shown in Table 3-2. The CGE1 User's Guide, the CGE1 Test Cases document, and the Software Design Document will be created with Interleaf.

Table 3-2. Software Environment

Tool | Description | Platform
Interleaf 5, ©1990 | Publishing software | Sun SPARC
LSE | Source code editor | VAX
DEC CMS | Configuration management | VAX
DEC Ada v3.0a-9 | Host Ada 83 compiler | VAX
SIMTEL Ada Pretty Printer (16-February-1994) | Ada pretty printer | VAX
DEC Link (Link command) | Host linker | VAX
DEC ACS (Ada compilation system) | Compilation ordering tool | VAX
Alsys Ada compiler (ActivAda for Windows) ver. 5.1.3 | Ada 83 cross compiler (with linker) for IBM PC/GE1 demonstration system | IBM PC compatible running Windows

4. Software Verification

Software verification methods will be consistent with the highest level of criticality to which equipment using the CGE1 interface will be certified, and these methods will comply with all requirements of DO-178B [3]. The verification process is part of the Avionics Graphics Corporation Software Development Process [5]. Verification of the Common Graphics Engine begins with the start of the Develop/Verify Software Requirements activity [5]. Once a reasonable set of requirements has been documented, peer reviews, analysis, and test development proceed throughout the Develop/Verify Software Design, Develop/Verify Source Code, Develop Software Test Procedures, and Integrate Software activities.

A combination of activities will be employed to provide the necessary level of coverage for verification activities commensurate with the requirements of DO-178B for Level A certification. Test results, analysis, and reviews will combine to satisfy the Level A verification requirements listed in Tables A-3 through A-7 of DO-178B [3].

Because the CGE1 is a reusable component, its software requirements will be analyzed and reviewed by a team of display system experts, the CGEI Working Group, drawn from the CCA and CACD product areas. The software design will be analyzed and reviewed in conjunction with both the software requirements and the product area working group's system requirements to ensure coverage of those requirements. The software source code will be analyzed and reviewed against the software design to ensure the code correctly and completely implements the design.

CGE1 system, hardware/software integration, and module test cases will be defined. Test cases will validate requirements for correct execution of the CGE system, interface functions, design specifications, and modules. Test cases will be analyzed and reviewed by the project team. From each of the test case definitions, test procedures will be implemented and executed. Each implementation will be analyzed and reviewed for correct and complete implementation of its test case specification. Results of test procedure execution will be reviewed by the development team and traced to the CGE requirements, showing complete coverage of requirements over both normal and abnormal ranges of values.

The verification test procedures will be generated using the techniques described below. Testing will be performed in accordance with the CCA Software Development Process [5]. Testing results will be analyzed to determine the status of each test, and re-testing will be performed, as necessary, based on the test results analysis.

The Software Verification Test Procedures and Results (SVPR) document will include summaries, matrices, and results for each verification activity: it contains a table which records the correspondence of requirement (User's Guide and/or SDD), test case, test procedure driver code, peer review dates, and the MC/DC source code coverage report generated by ATCA [21]. As additional supporting material, peer review records for the CGE1 User's Guide, SDD, test cases, test procedures, test results, and MC/DC coverage reports will be placed in Software Development Folders.

4.1. Organization

The Software Development Team has primary responsibility for meeting all the objectives of the software verification process as well as the planning and development processes. The interfaces between the development and verification processes are defined throughout Section 3.2, Life Cycle.

4.2. Independence

Since members of the Software Development Team perform both development and verification tasks, independence is obtained by having a team member other than the developer of an item complete the associated verification task(s). Development, review, analysis, and test records are maintained which demonstrate verification independence by citing the names of the team member(s) who completed the development and the verification tasks.

4.3. Verification Methods

Since the development and verification tasks are performed incrementally throughout the software life cycle, both the development and verification methods for this project are described in Section 3.2, Life Cycle. The following subsections provide additional insight into the types of reviews, analyses, and testing methods performed during the project's life cycle.

4.3.1. Review Methods

The objectives of the code review/inspection are as follows:

- Ensure that the software design meets the applicable software requirements.
- Ensure that the software design is properly implemented in code.
- Confirm there are no mechanizations in the code that are not intended by the design.

Software requirements, software design, software source code, test cases, test procedures, and test results will be reviewed in accordance with the CCA Method for Peer Reviews [20]. Peer review teams will include at least two people other than the author. Prior to review, checklists will be generated from the evaluation criteria described in the verification plans below. As described in the peer review method, checklist categories are used to classify each discrepancy found during preparation for the peer review and to prompt the reviewer into searching for particular kinds of errors. Peer reviews will be used to verify work products from the development of requirements, design, test procedures, and implementation. Peer reviews of test result analyses will also be conducted. All review results will be stored as specified in the CGE Configuration Management Plan [19].

Any problems or action items identified during a review will be promptly corrected, designated not applicable to the current build plan, or assigned to a problem report. A review closure memo is signed by the review moderator to ensure all action items are promptly categorized and tracked. The CCA Method for Peer Reviews contains copies of the Invitation, Exit, and Closure memos issued for each review [20].

4.3.2. Analysis Methods

Coverage Analysis

Coverage analysis is done for two purposes. The first objective is to show traceability from each functional software application interface requirement specified by the SDD to the applicable functional test procedure(s), assuring that the requirement performs its intended function. The second objective is to determine the source code Modified Condition/Decision Coverage provided by the SDD-based functional test procedures and the additional coverage tests.

Requirement Coverage

Each test procedure that is written will be related to a specific documented test case. Each test case is created to verify requirements found in the CGE1 User's Guide and/or the SDD. As stated earlier, the SVPR contains a table which records the correspondence of requirement (User's Guide and/or SDD), test case, test procedure driver code, peer review dates, and source code coverage report.

MC/DC Coverage

Analysis of reports from running the ATCA tool [21] against the test cases will determine what code was or was not exercised in a way that meets MC/DC coverage. The requirements-based test procedures may not completely exercise the code structure, so additional verification is produced to provide structural coverage where deficiencies are found.

Structural Coverage Analysis Resolution

Unexecuted code structures may be the result of the following items.

- Shortcomings in the requirements-based test procedures: The test cases should be supplemented or the test procedures changed to provide the missing coverage.
- Inadequacies in software requirements: The software requirements should be modified, additional test cases developed, and test procedures executed.
- Dead code: The code should be removed, and an analysis performed to assess the effect and the need for reverification.
- Deactivated code: For deactivated code which is not intended to be executed in normal aircraft operation, a combination of analysis and testing should show that the means by which such code could be inadvertently executed are prevented, isolated, or eliminated. For deactivated code which is only executed in certain configurations of the target environment, the operational configuration needed for normal execution of this code should be established, and additional test cases and test procedures developed to satisfy the required coverage objectives.
- Code unreachable by functional tests: See the Software Module Testing Method.

4.3.3. Test Procedure Development

CGE1 requirements-based testing, design-based testing, and added coverage tests will be performed in accordance with the tasks described in Sections 3.2.2.3 and 3.2.2.5. The test environment and activities will be documented to allow verification that the test procedures are repeatable. Test results will be compared with the test case expected results generated by analysis of the CGE Requirements Document. A test procedure passes only when the actual test results agree with the expected results. Accuracies for test results are stated in the corresponding test cases.

When a module test is executed, an automated test coverage analysis tool will be used [21]. Such tools provide coverage criteria completeness reports as immediate feedback to the test procedure writer.

4.3.3.1. Test Procedure Verification

The test procedures will be analyzed and reviewed in conjunction with all applicable documents listed in the CGE to ensure test coverage consistent with DO-178B requirements for criticality category A. During the review process, the test procedures will be analyzed to verify the following.

1. That test cases exist for each software requirement.
2. That test case to User's Guide traceability has been correctly identified.

3. That test case to design specification traceability has been correctly identified.
4. That test procedures correctly implement the test case specifications.
5. That test procedures provide traceable MC/DC coverage of all source code, object code coverage, data coupling tests, and control coupling tests.
6. That all tools are identified, setup specifications are correct, instructions are clear and concise, and all input data definitions and expected output definitions are correct.

Test case reviews will occur during the Develop/Verify Requirements and Develop/Verify Design activities. Reviews of test procedures and test results occur during the Integrate Software activity. The SVPR will contain a table summarizing the correspondence between paragraph numbers of the CGE1 User's Guide and/or the SDD and the verification data. Peer review dates, test cases, test procedure code, and ATCA reports will be referenced in this table.

Test procedures for the CGE1 will be executed on the GE1 demonstrator board. These tests will be written to the specification of the application interface found in the CGE1 User's Guide and/or the SDD. Output from the modules will be recorded as display lists and test logs produced by the Test Environment Architecture [4]. Trace information will be obtained through the use of a source code instrumentation tool called ATCA [21]. This tool generates condition and decision value data that is automatically analyzed for MC/DC coverage. These coverage reports will be inspected for complete coverage of all decisions in the modules. If coverage is not complete, either the module requirements will be corrected and tests re-written, or tests explicit to the module code structure will be written. In either case, complete coverage is mandatory.

Module test results will be contained in the Software Verification Test Results document. A summary of the test procedure verification activities and results will be included in the Accomplishment Summary document, and supporting documentation will be archived. These outputs are compliant with Table A-7 of DO-178B [3].

4.3.3.2. Test Analysis

Results of each test will be compared with the expected results to determine whether a particular test passes or fails. Criteria for acceptance of a test as passed are detailed in the test case. A failed test will be assigned to a problem report to ensure that action items, any resulting changes, and retest activities are tracked. Records from the peer review of these test results will be archived.

4.3.3.3. Retest

When test analysis indicates that a test procedure does not achieve the output specified in the test case, the cause of the failure will be identified, a problem report opened, and the corrective action defined. After the corrective action has been implemented and the necessary reviews and tests repeated, the associated problem report will be closed. The failed test procedure will be evaluated and augmented to enhance the test coverage in areas related to the section of corrected code. The testing that is performed will ensure coverage of areas related to the affected area in addition to the affected area itself.

When test analysis indicates that the failure was due to an incorrect test procedure, the corrective action will be defined and implemented. After the corrective action has been implemented and the necessary reviews and tests repeated, the associated problem report will be closed.

When additions or changes to software requirements occur, a problem report will be generated to track the resulting changes, and the corrective action will be defined. After the corrective action has been implemented and the necessary reviews are performed, the applicable test procedure(s) will be evaluated and modified to implement the necessary coverage in areas related to the added or modified code. After all the necessary reviews and tests have been repeated, the associated problem report will be closed.
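To illustrate the kind of Ada test procedure this section describes, the sketch below drives the interface, compares the actual outcome against the test case's expected result, and logs a verdict for the results summary. It is hypothetical: it reuses the illustrative CGE1_Interface specification sketched in Section 1, and the procedure and test case names are not taken from the CGE1 Test Cases document.

   with CGE1_Interface; use CGE1_Interface;
   with Text_IO;        use Text_IO;

   -- Hypothetical requirements-based test driver: exercise the interface
   -- over the normal range of values and record a pass/fail verdict.
   procedure Test_Draw_Line_Normal_Range is
      List : Display_List_Id;
   begin
      Begin_List (List);
      Draw_Line (List, X1 => 0, Y1 => 0, X2 => 1023, Y2 => 1023);
      End_List (List);
      -- Expected result from the test case: in-range coordinates complete
      -- without raising Engine_Error.
      Put_Line ("TC-DL-001 PASS");
   exception
      when Engine_Error =>
         Put_Line ("TC-DL-001 FAIL: Engine_Error raised on in-range input");
   end Test_Draw_Line_Normal_Range;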

Software Verification and the necessary reviews are performed, the applicable test procedure(s) will be evaluated and modified to implement the necessary coverage in areas related to the added or modified code. After all the necessary reviews and tests have been repeated, the associated problem report will be closed. 4.4. Verification Environment Test procedures will be developed on a DEC Vax host environment using tools listed in Table 4-1. The CGE1 communication with the GE1 is emulated by file output. The test procedures will be executed on a circuit board with the GE1 which will execute within an Intel based PC. Tools used for integration test execution are shown in Table 4-2. Table 4-1. Test Procedure Development Environment Tool Description Platform LSE Source code editor VAX DEC Ada v3.0a-9 Host Ada`83 compiler VAX SIMTEL Ada Pretty Printer (16-February- 1994) DEC Link (Link command) DEC ACS (Ada compilation system) COMP_ORDER CPN: 829-7357-001 March 04, 1994 Alsys Ada compiler (ActivAda for Windows) ver. 5.1.3 Ada pretty printer Host linker Compilation ordering tool Compilation ordering tool Ada`83 cross compiler (with linker) for IBM PC/GE1 demonstration system Table 4-2. Integrate/Test Software Environment VAX VAX VAX VAX Tool Description Platform PCDisk, v4.2 Pathworks Alsys Ada compiler (ActivAda for Windows) ver. 5.1.3 Software to download source code from VAX to PC, VAX side Software to download source code from VAX to PC, PC side Ada`83 cross compiler (with linker) for IBM PC/GE1 demonstration system IBM PC compatible running Windows VAX IBM PC or compatible IBM PC compatible running Windows ATCA Ada 83 test coverage analyzer VAX, Target page: 22

Interleaf will be used to create the Software Verification Test Procedures and Results document. Test procedure development will take place in a VAX workstation environment using the VMS operating system and the Interleaf document processor. Verification procedures, documents, and test results will be placed under configuration management [19]. The verification activity includes creating the verification documentation: the Test Cases for the CGE1 Interface and the SVPR. The Test Cases and SVPR will be released once, just before product delivery. Early, unreleased copies of the documents will be used for review and comments.

Support Tools

The verification effort will be augmented with a number of tools, selected to improve productivity, reduce cost, and improve the verification process. Additionally, tools will be used to improve communication and status reporting. Tools used in the verification process will be under configuration control. The following tools will be used for the verification effort.

- VAX VMS text search tools
- VAX VMS text editors (EVE)

4.5. Transition Criteria

The transition criteria for entering the verification activities are described in Section 3.2, Life Cycle.

5. Software Quality Assurance

The General Aviation Software Quality Assurance Group will assure the quality of the CGE1 software. This assurance will be conducted according to the Software Quality Assurance Plan [23].

5.1. Environment

The Software Quality Assurance (SQA) process assesses the project software life cycle processes and their outputs to obtain assurance that the objectives are satisfied, that deficiencies are detected, evaluated, tracked, and resolved, and that the software product and software life cycle data conform to established requirements. The SQA standards, procedures, tools, and methods are defined in the General Aviation Division Software Quality Assurance Plan [23].

5.2. Authority

SQA is independent of engineering management. An organization chart illustrating the separation of the quality organization from engineering is contained in the General Aviation Division Software Quality Assurance Plan [23]. In addition, the CGE1 is being developed by the Advanced Technology and Engineering Division, which has a management reporting structure separate from that of General Aviation.

SQA is responsible for auditing and reviewing the software's supporting documentation and the processes that produced the software and documentation. This assures that appropriate standards and processes exist and are being adhered to for the entire program life cycle. SQA has full authority to make quality judgments on deliverable software and documentation. This includes the authority to stop delivery when deficiencies warrant.

5.3. Activities

The following subsections describe the SQA activities that are to be performed for each software life cycle process and throughout the software life cycle.

5.3.1. Life Cycle Audits

SQA is responsible for conducting reviews and audits throughout the life cycle of the project. The methods used by SQA for monitoring the software life cycle processes are documented in the General Aviation Division Software Quality Assurance Plan [23]. These methods cover (but are not limited to):

- Reviews and audits to show that a project is adhering to its published software development, configuration management, and verification plans.
- Reviews and audits of software verification activities such as: requirements, design, and code reviews; test case (scenario) and test procedure reviews; and test procedure execution.
- Reviews and audits of the software release process and all software release artifacts.

5.3.2. Problem Reporting and Tracking

SQA will audit the project change management process to assure that the process is clearly defined, that there is consistent adherence to the process, and that problem reports have been properly closed.

5.3.3. Software Conformity Review

SQA shall conduct a software conformity review prior to release of the software through the Software Control Library. This review will assure that the software life cycle processes are complete. As a minimum, this review shall include the following items.

- The planned software life cycle activities have been completed, and records of their completion are retained.
- Software life cycle data developed from specific system requirements, safety-related requirements, or software requirements are traceable to those requirements.
- Software life cycle data complies with software plans and standards and is controlled in accordance with the SCM Plan [19].
- Change requests comply with the requirements of the SDP.
- Change requests have been evaluated and their status is defined.
- Software requirements deviations are recorded and approved.
- The released software can be loaded successfully using the released instructions.

If there are post-certification software modifications, a subset of the software conformity review activities, as justified by the significance of the change, will be performed.

5.4. Transition Criteria

The transition criterion for entering the SQA process is the completion of the initial draft of the project software development plan. The SQA interface with the project is initiated when project management provides SQA a copy of the plan for review.

5.5. Timing

The SQA process activities occur throughout product development and verification. SQA shall schedule reviews and audits in parallel with the software life cycle processes.

5.6. SQA Records

SQA process activities for the project are recorded on a Process/Procedure Audit Form in accordance with the General Aviation Division Software Quality Assurance Plan [23].