Data Management & Test Scenarios Exercise

[Figure: DoD acquisition life-cycle chart showing MDD, CDD Validation, Dev. RFP Release, Milestones A/B/C, FRP, IOC, and FOC across the Materiel Solution Analysis, Technology Maturation & Risk Reduction, Engineering and Manufacturing Development, Production & Deployment, and Operations & Support phases, with a "YOU ARE HERE" marker.]
Learning Objectives
- Recognize DoD policy on T&E data management, including data security and the archiving and release of test data.
- Describe the data authentication process: verifying and validating the test data set, protecting the integrity of test data, and ensuring the collected data are valid for meeting test objectives.
- Recognize the need for measurable, high-quality, timely, and cost-effective data to enable unbiased T&E results.
- Describe the processes for failure definition and scoring, including reliability, availability, and maintainability scoring conferences.
- Develop information for a data management plan in support of test and evaluation.
- Given key requirements of a notional weapon system, develop a test scenario (high-level test plan), including identification of test conditions and of controlled and uncontrolled variables.
- Given key requirements of a notional weapon system, develop a test scenario (high-level test plan) that supports the overall program plan, including opportunities for combined DT/OT.
Lesson Topics
This lesson will cover the following topics:
1. Data Management
2. Student Exercise
Data Management
Lesson Topics:
1) Data Management
2) Student Exercise
T&E Data / Data Management Policy
Data must be collected that will contribute toward assessing:
- key performance parameters
- critical technical parameters
- key system attributes
- interoperability requirements
- cybersecurity requirements
- reliability growth
- maintainability attributes
- developmental test objectives
- others as needed
(Paraphrased from DoDI 5000.02, Encl. 4, par. 5a(11))
Note: the Service T&E regulations give additional information concerning data management. Additional information can be found in the Service folders on the student CD-ROM.
Data Requirements
Base data requirements on:
- MOEs / MOSs / MOPs
- Test variables to be measured
- Sample sizes (see the sample-size sketch below)
- Evaluation Plan contents
Identify the agency responsible for collecting the data.
Determine data source, type, and format before starting the test.
Exercise sound judgment in determining the type and amount of data to be collected.
Data should be high quality, measurable, timely, and cost-effective, and should enable unbiased T&E results.
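To make the sample-size judgment concrete, here is a minimal sketch, assuming a simple pass/fail measure with zero allowed failures (the classic success-run formula); the reliability and confidence values are illustrative, not taken from any requirement.

```python
import math

def zero_failure_sample_size(reliability: float, confidence: float) -> int:
    """Trials needed to demonstrate `reliability` at `confidence`
    with zero observed failures (success-run formula):
    n >= ln(1 - confidence) / ln(reliability)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Illustrative values only: demonstrating 0.90 reliability at 80% confidence
print(zero_failure_sample_size(0.90, 0.80))  # -> 16 trials
```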
Data Analysis, Collection, and Management Plans
Purpose of the plans:
- Provide detailed procedures for the collection, reduction, quality assurance, collation, analysis, storage, and disposition of data gathered to support evaluations.
Objectives of the plans:
- Eliminate duplication of effort
- Provide guidance to the collection/analysis effort
- Provide adequate and timely analytical information
- Manage resources:
  - Instrumentation
  - Data transmission, reduction, and storage
  - Data analysis teams (analysis, evaluation, reporting)
Elements of a Test Database
The following are essential elements for designing a test database (a schema sketch follows below):
- Accessible to all stakeholders
- Used for all T&E data for the organization and/or the system under test
- Ease of use and ease of data mining
- Fields for all necessary data
- Appropriate choice of software
- Traceability to the originator/generator of the data
- Current status of the data (for approval, for information, etc.), version/control number, and date
- Security of the database
- Permissions (read/write vs. read-only) and other controls
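As one way to picture how those elements could map to actual fields, here is a minimal schema sketch using SQLite; the table and field names are hypothetical illustrations, not taken from any DoD or Service standard.

```python
import sqlite3

conn = sqlite3.connect("te_data.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS test_data (
    record_id      INTEGER PRIMARY KEY,
    event_id       TEXT NOT NULL,   -- test event the record belongs to
    originator     TEXT NOT NULL,   -- traceability to the data generator
    status         TEXT NOT NULL,   -- e.g., 'for approval', 'for information'
    version        TEXT NOT NULL,   -- version / control number
    record_date    TEXT NOT NULL,   -- ISO-8601 date of the record
    classification TEXT NOT NULL,   -- security marking of the record
    payload        TEXT             -- the collected data itself
);
-- Read/write vs. read-only access is enforced by the DBMS or file
-- permissions, not by the schema; noted here only as a reminder.
""")
conn.close()
```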
Archiving Data
- Data from all T&E phases must be stored and archived to support both current and future uses (such as future T&E efforts).
- When practical, use electronic media to store data.
- Set up databases for ease of use and ease of data mining.
- Provide for periodic reviews of the database.
- Follow your organization's guidance concerning data retention, disposition, and disposal.
"The Program Manager and test agencies for all programs will provide the Defense Technical Information Center (DTIC) with all reports and the supporting data and metadata for the test events in those reports." (DoDI 5000.02, Encl. 5, par. 10c(5))
Archiving Data: Example from ATEC Reg. 73-1 (March 2006)

Data Category                                             | Retention
Raw data, i.e., data in its original form (Level 1)       | Retained for 1 year after end of event
Audio/video tape and film (Level 1)                       | Retained for 1 year after end of event
Written Level 2 data                                      | Retained for 1 year after end of event
Processed & smoothed automated instrumentation data (L2)  | Archived for 1 year after end of event
Test database of record (Level 3)                         | Archived permanently
Plans and reports (Levels 4-7)                            | Archived permanently
Supplemental analyses (Levels 4-7)                        | Archived for 3 years (non-oversight), 10 years (oversight)
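If a schedule like the one above were encoded for an automated disposition check, it might look like the sketch below; this encoding is illustrative only, and actual retention decisions must follow your organization's guidance.

```python
# Illustrative encoding of the ATEC Reg. 73-1 schedule shown above.
RETENTION = {
    "raw data (Level 1)":                       "retain 1 year after end of event",
    "audio/video tape and film (Level 1)":      "retain 1 year after end of event",
    "written Level 2 data":                     "retain 1 year after end of event",
    "processed instrumentation data (Level 2)": "archive 1 year after end of event",
    "test database of record (Level 3)":        "archive permanently",
    "plans and reports (Levels 4-7)":           "archive permanently",
    "supplemental analyses (Levels 4-7)":       "archive 3 years (non-oversight), 10 years (oversight)",
}

print(RETENTION["test database of record (Level 3)"])
```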
DoD Policy for Accessing Test Data
"The acquisition chain of command, including the Program Manager, and the DASD(T&E) and their designated representatives will have full and prompt access to all ongoing developmental testing, and all developmental test records and reports... Data may be preliminary and will be identified as such." (DoDI 5000.02, Encl. 4, par. 6c(1))
"DOT&E, the Program Manager and their designated representatives who have been properly authorized access, will all have full and prompt access to all records, all reports, and all data... Data may be preliminary and will be identified as such." (DoDI 5000.02, Encl. 5, par. 10c(1))
Releasing Test Data
Within DoD:
- Test organization commanders determine the processes and release authority for reports and information under their control.
- Classified information must be handled per DoDD 5200.01 and associated documents.
Outside DoD:
- Freedom of Information Act requests (from individuals or private industry) should be processed according to DoD Regulation 5400.7 and Service policy.
- Report news media or civic organization requests to the Public Affairs Officer of the appropriate agency.
- Follow Service guidance concerning information released to Congress, the GAO, the DoD Inspector General, and similar agencies.
- Follow Service guidance concerning release of information to foreign governments, foreign liaison officers, or foreign nationals.
Data Authentication & Scoring
- Prior to testing, the procedures and rules for data/test authentication must be developed.
  - The Data Authentication Group (DAG) determines the validity of test events and test data.
- Prior to testing, it must also be determined what constitutes a failure (DT&E) or a mission failure (OT&E).
  - This information typically comes from the requirements documents and/or the failure definition and scoring process.
- Scoring conference(s) assign the reason(s) for test failures.
Data Authentication Process
The Services/organizations have processes for data authentication. A typical process includes:
- The Data Authentication Group (DAG) charter and standard operating procedures are developed prior to the start of testing.
- After the test data have been collected, the DAG determines whether the data are valid and/or acceptable:
  - Was the test a valid test?
  - Does the data represent what really happened (no instrumentation error, for example)?
- Once the DAG process is complete, the DAG releases an authenticated event database (a toy sketch of this screen follows below).
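Here is a toy illustration of the screening step described above, assuming hypothetical record fields (`valid_test`, `instrumentation_error`); real authentication is a deliberative group process, not a script.

```python
def authenticate_events(events):
    """Keep only events that were valid tests and whose data is free of
    known instrumentation errors; the survivors form the
    'authenticated event database'. Field names are hypothetical."""
    return [e for e in events
            if e["valid_test"] and not e["instrumentation_error"]]

events = [
    {"id": 1, "valid_test": True,  "instrumentation_error": False},
    {"id": 2, "valid_test": True,  "instrumentation_error": True},   # bad sensor
    {"id": 3, "valid_test": False, "instrumentation_error": False},  # aborted run
]
print(authenticate_events(events))  # only event 1 survives
```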
Failure Definition & Scoring
The Services/organizations have processes for failure definition and scoring. A typical process includes:
- Failure Definition and Scoring Criteria (FD/SC) are developed prior to the start of testing.
  - The FD/SC typically list detailed descriptions of what constitutes a failure for each essential function.
- Classification: for example, in which essential function (or nonessential function) did the failure occur?
- Chargeability of test incidents: the cause(s) of the failures (for example: accident, crew, HW CFE, HW GFE, SW CFE, SW GFE, maintenance, support equipment, tech docs/manuals, training, secondary failure, or unknown).
- Scoring conferences occur after test data have been authenticated.
- FD/SC are used to determine the classification and chargeability of test incidents that occur during R&M testing: which failures count against R&M, and which don't? (A data-structure sketch follows below.)
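One way to see how classification and chargeability might be captured per incident is the data-structure sketch below; the category names follow this slide, but the structure itself is a hypothetical illustration, not an FD/SC standard.

```python
from dataclasses import dataclass
from enum import Enum

class Chargeability(Enum):
    # Chargeability categories named on this slide
    ACCIDENT = "accident"
    CREW = "crew"
    HW_CFE = "hardware (contractor-furnished)"
    HW_GFE = "hardware (government-furnished)"
    SW_CFE = "software (contractor-furnished)"
    SW_GFE = "software (government-furnished)"
    MAINTENANCE = "maintenance"
    SUPPORT_EQUIPMENT = "support equipment"
    TECH_DOCS = "tech docs/manuals"
    TRAINING = "training"
    SECONDARY = "secondary failure"
    UNKNOWN = "unknown"

@dataclass
class TestIncident:
    incident_id: str
    essential_function: str       # classification: which function failed?
    chargeability: Chargeability  # cause assigned at the scoring conference
    counts_against_rm: bool       # does it score against R&M requirements?

incident = TestIncident("INC-042", "deliver accurate fire",
                        Chargeability.HW_GFE, counts_against_rm=True)
```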
Test Scenarios & Data Management Exercise
Lesson Topics:
1) Data Management
2) Student Exercise
Test Scenarios and Data Management Exercise
Given: key operational, technical, and programmatic requirements.
Objective: develop a developmental or operational test scenario, along with Data Collection and Data Management Plans.
Overview:
- Task 1. Identify the mission objective. What's the focus? (The instructor will assign you a CTP or COI.)
- Task 2. Identify test variables (controlled/uncontrolled).
- Task 3. Develop an operational or developmental test scenario. (Develop test conditions to satisfy the variables, and develop a DT or OT scenario.)
- Task 4. Determine information for at least two of your data elements. (This information is needed for the Data Collection, Data Analysis, and Data Management Plans.)
- Task 5. Identify opportunities to combine DT/OT. Are there any opportunities for combined DT/OT in your test scenario?
Operational Mission Scenarios: Impact on Test Planning
Operational mission scenarios allow the following to be identified, which facilitates test planning:
- Needed test resources (platforms, users, support personnel, instrumentation, range time, etc.)
- Cost and schedule
- Necessary terrain and weather conditions
- Environmental or safety restrictions
Classifying Test Variables
[Figure reconstructed as an outline:]
- Independent variables
  - Controlled
    - Primary factors
    - Background factors held constant
  - Uncontrolled
    - Background factors: natural, group, random
- Dependent variables
  - Observations: measured, not measured
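To connect this taxonomy to test planning, the sketch below generates a full-factorial set of test conditions from a few controlled (primary) factors; the factor names and levels are invented for illustration, not drawn from the exercise materials.

```python
from itertools import product

# Hypothetical controlled (primary) factors for a notional test.
controlled = {
    "terrain":  ["open", "urban"],
    "light":    ["day", "night"],
    "range_km": [1, 3],
}

# Full-factorial set of test conditions over the controlled factors;
# uncontrolled background factors are recorded during test, not set.
conditions = [dict(zip(controlled, combo))
              for combo in product(*controlled.values())]
print(len(conditions))  # -> 8 conditions
```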
Difference Between Test Mission Plans & Detailed Test Plans

Test Mission Plans        | Detailed Test Plans
High level                | Detail level
Issue focus (COI, CTP)    | Data focus (detailed info on data collection, data analysis, data mgmt., etc.)
Data Collection, Analysis, & Data Management Planning
Some Data Collection, Data Analysis, and Data Management information is typically developed along with the test scenario (a sketch of one data-element record follows below):
- Important data (specific data elements) to be collected
- Purpose of the data (it will be analyzed to determine what?)
- Data accuracy and estimated sample sizes needed for the data elements
  - For the purposes of this exercise, you may state high, medium, or low data accuracy and sample sizes
- Data collection methods/instrumentation needed for the data elements
Note: more detailed Data Collection/Analysis/Management planning is typically done later, along with the detailed test plans.
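A minimal sketch of how one data element's planning information might be recorded, using the fields this slide calls for; the example element and its entries are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    """One data element's planning information; field names are illustrative."""
    name: str               # the specific data element to be collected
    purpose: str            # what the analysis will determine from it
    accuracy: str           # 'high' / 'medium' / 'low' for this exercise
    sample_size: str        # likewise high / medium / low
    collection_method: str  # instrumentation or method needed

miss_distance = DataElement(
    name="miss distance",
    purpose="assess delivery accuracy against the assigned CTP",
    accuracy="high",
    sample_size="medium",
    collection_method="range tracking instrumentation",
)
```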
Exercise Tasks and Timeline
- Task 1: Identify controlled/uncontrolled variables for your assigned CTP or COI.
- Task 2: Develop test conditions for one test scenario/mission plan.
- Task 3: Outline your test scenario/mission plan. (Note: you DON'T need to assess the entire CTP or COI.)
- Task 4: Determine the following information for at least two data elements that you plan to collect: purpose of the data, data accuracy and sample sizes needed, and data collection method/instrumentation needed.
- Task 5: Identify opportunities for combined DT/OT in your scenario/mission plan.
(40 minutes to complete all five tasks)
COIs and CTPs
COIs:
- Can the SPAW be rapidly inserted into the combat environment?
- Can the SPAW deliver sufficient and accurate fire on the battlefield?
- Is the SPAW survivable on the battlefield?
CTPs:
- Must protect the crew (90% probability of crew survival) against AT mine blast beside or under the platform.
- MTBF of 128 hours.
Thursday Night Homework
- Read the course material for the DT&E Test Execution Exercise:
  - Read the slides (starting with the DT&E Test Execution Exercise slide).
  - Read the four checklists and supplemental information for the exercise.
- As you read the material, think about how you might write a DRAFT test plan, using the template provided in your book.
  - Your team will write a DRAFT test plan as part of this exercise.