Title: Verification and Validation of the Tool for Risk-Informed Operations, Version 1.0


Calculation Note Number: CN-RAM | Shop Order Number: N/A | Network/Activity: /
Project: Tool for Risk-Informed Operations
Releasable (Y/N): Y | Open Items (Y/N): N | Files Attached (Y/N): Y | Total No. Pages: 142

Title: Verification and Validation of the Tool for Risk-Informed Operations, Version 1.0

Author Name(s) (Signature / Date: Electronically Approved*):
    Richard W. Rolland III (Scope: All)
    John S. White (Scope: , 7.36)
    Stacy A. Davis (Scope: , 7.36)
    Ryan D. Griffin

Verifier Name(s) (Signature / Date: Electronically Approved*):
    Kyle D. Hope (Scope: All)

Reviewer Name(s) (Signature / Date: Electronically Approved*):
    David S. Teolis (Scope: Methodology)

Responsible Engineer Name: Richard W. Rolland III (Signature / Date: Electronically Approved*)
Manager Name: Kyle Christiansen for Daniel L. Sadlon (Signature / Date: Electronically Approved*)

*Electronically approved records are authenticated in the electronic document management system.

Westinghouse Electric Company LLC. All Rights Reserved.

Record of Revisions

Rev   Date       Revision Description
1.0   See EDMS   Original Issue.

Table of Contents

1.0 Introduction
    Background / Purpose
    Input
    Identification of Program
    Limits of Applicability
Summary of Results and Conclusions
References
Assumptions and Open Items
    Discussion of Significant Assumptions
    Open Items
Reference to Uniquely Identified Source Listing in a Controlled Location
Reference to Baseline Documents
Test Cases (each test case contains a Test Case Description and a Comparison of Test Results with Expected Results)
    Verifying the Program Installs Correctly
    Verifying the Software License Agreement Functions
    Verifying the Functionality of the Creation, Loading, Saving, and Closing of a Database
    Verifying the Functionality of the About Window
    Verifying the Insertion of Plant Definitions
    Verifying the Creation of System Definitions
    Verifying the Copying of Customer Components Window
    Verifying the Add Customer Component List Window
    Verifying the Picklist-Safety Window Functions
    Verifying the Assign Component Functionality Window Functions
    Verifying the Passive Stress Model Definition Window
    Verifying the Picklists-Drawing Window Functions
    Verifying the Passive Segment Definition Window
    Verifying the Treatment-Picklist Window Functions
    Verifying the Passive Consequence Definitions Window
    Verifying the Passive Map Components Window
    Verifying the Copy PRA Data Window
    Verifying the Passive Quantitative Assessment Window
    Verifying the Passive Qualitative Assessment Window
    Verifying the Passive Shutdown and External Events Impact Assessment Window
    Verifying the Passive Additional Risk Considerations Window
    Verifying the Passive Component Margin Assessment Window
    Verifying the Picklist-IDP Members Functions
    Verifying the Passive Integrated Decision Making Panel Window
    Verifying the Passive Stress Model Risk Assessment Window
    Verifying the Copy Maintenance Rule Data Window
    Verifying the Active Function Definition Window
    Verifying the Active Map Components Window
    Verifying the Active Qualitative Assessment Window
    Verifying the Creation and Modification of the Excel PRA Templates
    Verifying the Active Load Component PRA Files Window
    Verifying the Active Quantitative Assessment Window
    Verifying the Active Integrated Assessment Window
    Verifying the Active Defense-in-Depth Assessment Window
    Verifying the Active Integrated Decision Making Panel Window
    Verifying the Creation and Contents of Reports
    Verifying the Copy RI-ISI Data Window
    Verifying the Exit Functionality
Listing of Runs Made in Validation
Known Problems...75
Adequacy and/or Limits of the Validation
    Adequacy
    Limits of Validation
Glossary...77
Appendix A: Supporting Documentation...79
    A.1 Software License Agreement...79
    A.2 Configuration Control Information...83
    A.3 Treatment Picklist Window...84
    A.4 Stress Model Definition...86
    A.5 Segment Definitions...87
    A.6 Treatment Consequence Mapping Table...90
    A.7 Passive Segment Consequence Mapping Table...91
    A.8 Passive Segment Component Mapping Table...93
    A.9 Passive PRA Runs...95
    A.10 Passive Segment PRA Run Mapping Table...96
    A.11 Consequence Categories for Qualitative and Quantitative Analysis...99
    A.12 Consequence Categories for Shutdown and External Events Impact Assessment Window
    A.13 Passive Segment Additional Risk Considerations Answer Table
    A.14 Passive Segment Sufficient Margin Analysis Answer Table
    A.15 Stress Model Information
    A.16 Segment ID HSS/LSS Determination
    A.17 Component ID HSS/LSS Determination in Stress Model Assessment
    A.18 RISC Categorization of Components
    A.19 Mapping Components to Functions
    A.20 Expected Maximum FV, RAW, and CCFRAW for Component IDs within the Integrated Assessment
    A.21 Active Integrated Assessment Expected Results
    A.22 Functions Appearing in the Defense-in-Depth Assessment
    A.23 Temporary Defense-in-Depth Assessment
Checklist A: Proprietary Class Statement Checklist*
Checklist B: Calculation Note Methodology Checklist
Checklist C: Software Validation Checklist...137
Checklist D: Verification Methodology Checklist
Checklist E: 3-Pass Verification Methodology Checklist
Checklist F: Spreadsheet Verification Checklist
Checklist G: Spreadsheet Validation Checklist
Additional Verifier's Comments...142

1.0 Introduction

1.1 Background / Purpose

The purpose of this calculation note is to document the Software Verification and Validation (V&V) for the Tool for Risk-Informed Operations, Version 1.0 (Reference 1). This calculation note was prepared according to the following procedure: W , Revision 0.0, Validation of Computer Software, effective January 8, 2016 (Reference 2).

1.2 Input

The Software Requirements and Design Specification, CN-RAM (Reference 3), and the other references (Section 3.0) constitute the inputs to the software validation phase.

During the V&V effort, several minor deviations from expected behavior were discovered. These deviations do not alter the final results, and a workaround has been defined for each of them in the User's Manual, LTR-RAM (Reference 4). The deviations are described in the associated test cases in this document and are considered acceptable for the purposes of the V&V. A reference to the User's Manual is provided wherever a workaround is necessary.

1.3 Identification of Program

Program Name: Tool for Risk-Informed Operations
Version Number: Version 1.0
System State: Windows 7, 64-bit, SP1

1.4 Limits of Applicability

This V&V is valid for the Tool for Risk-Informed Operations, Version 1.0. The executable covered by this document is referenced in Section 5.0. The verification and validation was performed on a computer with a 32-bit version of Crystal Reports.

2.0 Summary of Results and Conclusions

A Safety Evaluation Report (SER) does not exist for this program; therefore, a summary of exemptions to SER limitations for the functional requirements is not applicable.

The V&V of the Tool for Risk-Informed Operations, Version 1.0 demonstrates that the tool, in conjunction with user interactions, can perform RISC categorization of Component IDs. Limitations are described in Section 9.0 and in Reference 4.

Based on the results obtained in demonstrating the functional requirements of the Tool for Risk-Informed Operations, Version 1.0, it is concluded that the tool complies with the V&V requirements of Reference 5. The Tool for Risk-Informed Operations, Version 1.0 is concluded to be acceptable for release.

3.0 References

1. Tool for Risk-Informed Operations, Version 1.0 [Software], March 6.
2. W , Revision 0.0, Validation of Computer Software, effective January 8, 2016.
3. CN-RAM , Revision 1.0, Tool for Risk-Informed Operations, Version 1.0 Software Requirements and Design Specification, March 22.
4. LTR-RAM-16-92, Revision 1.0, User's Manual for the Tool for Risk-Informed Operations, Version 1.0, March 22.
5. W , Revision 0.0, Validation of Computer Software, effective January 8, 2016.
6. NEI 00-04, Revision 0, 10 CFR SSC Categorization Guideline, Nuclear Energy Institute, July.
7. EPRI TR , Revision B-A, Revised Risk-Informed Inservice Inspection Evaluation Procedure, Electric Power Research Institute, December.
8. 10 CFR 50.69, Final Rule, Risk-Informed Categorization and Treatment of Structures, Systems and Components for Nuclear Power Reactors, Nuclear Regulatory Commission, November 22, 2004.

4.0 Assumptions and Open Items

4.1 Discussion of Significant Assumptions

There are no significant assumptions associated with this calculation note.

4.2 Open Items

There are no open items associated with this calculation note.

5.0 Reference to Uniquely Identified Source Listing in a Controlled Location

The modified version resides under Configuration Control as a T-Configured computer program. This version will become F-Configured upon completion of all software development and validation documentation.

The source code for the T-Configured version resides in the following directory:
\\swec0005\ccontrol\configs\risk-informedapplications\1515_trio\1.0\src_t4\5. Review\1515_TRIO_Source

The full pathname for the T-Configured executable is:
\\swec0005\ccontrol\config\risk-informedapplications\1515_trio\1.0\bin_t4\1515_trio.exe

6.0 Reference to Baseline Documents

The software requirements documentation is contained in Reference 3. The user documentation is contained in Reference 4.

7.0 Test Cases

For the authors, all of the test cases were executed on the same machine: HP-2UA445275T.w-intra.net (Stacy Davis, Ryan Griffin, Richard Rolland, Jack White). For the verifier, all of the test cases were executed on the same machine: TSH w-intra.net (Kyle Hope).

7.1 Verifying the Program Installs Correctly

7.1.1 Test Case Description

Installation instructions are provided in the User's Manual, LTR-RAM (Reference 4). The installation folder should have a file size of 93.0 MB (97,524,560 bytes). The files within the installation folder are shown in Table 7-1. [Main] denotes the top-level folder containing the installation files and subfolders.

Table 7-1: Installation Files and Folders

Folder               File                             Size                         Size on Disk
[Main]               -                                93.0 MB (97,524,560 bytes)   93.0 MB (97,546,240 bytes)
[Main]               1515_TRIO_Setup.msi              13.1 MB (13,789,696 bytes)   13.1 MB (13,791,232 bytes)
[Main]               setup.exe                        614 KB (629,248 bytes)       616 KB (630,784 bytes)
CrystalReports10_5   -                                40.5 MB (42,559,488 bytes)   40.5 MB (42,561,536 bytes)
CrystalReports10_5   CRRedist2008_x64.msi             23.4 MB (24,594,944 bytes)   23.4 MB (24,596,480 bytes)
CrystalReports10_5   CRRedist2008_x86.msi             17.1 MB (17,964,544 bytes)   17.1 MB (17,965,056 bytes)
DotNetFX             -                                26.5 MB (27,805,752 bytes)   26.5 MB (27,811,840 bytes)
DotNetFX             dotnetfx.exe                     22.4 MB (23,510,720 bytes)   22.4 MB (23,511,040 bytes)
DotNetFX             instmsia.exe                     1.62 MB (1,709,160 bytes)    1.63 MB (1,712,128 bytes)
DotNetFX             WindowsInstaller-KB v2-x86.exe   2.46 MB (2,585,872 bytes)    2.46 MB (2,588,672 bytes)
Office2007PIARedist  -                                6.82 MB (7,160,320 bytes)    6.83 MB (7,163,904 bytes)
Office2007PIARedist  o2007pia.msi                     6.82 MB (7,160,320 bytes)    6.83 MB (7,163,904 bytes)
ReportViewer         -                                2.85 MB (2,994,184 bytes)    2.85 MB (2,998,272 bytes)
ReportViewer         ReportViewer.exe                 2.85 MB (2,994,184 bytes)    2.85 MB (2,998,272 bytes)
WindowsInstaller3_1  -                                2.46 MB (2,585,872 bytes)    2.46 MB (2,588,672 bytes)
WindowsInstaller3_1  WindowsInstaller-KB v2-x86.exe   2.46 MB (2,585,872 bytes)    2.46 MB (2,588,672 bytes)

To install the Tool for Risk-Informed Operations, Version 1.0, click the file 1515_TRIO_Setup.msi. This file will install all of the relevant programs. Once installed, the files shown in Table 7-2 should appear in the location selected during the installation process.

Table 7-2: Files Installed to Selected Directory

File Name                                    Size
1515_TRIO.exe                                4.2 MB
1515_TRIO.exe.config                         1.7 kb
Book.xlsx                                    7.2 kb
config.key                                   105 B
dbs.ico                                      2.8 kb
dirlist.txt                                  1.7 kb
Edit.jpg                                     1021 B
EditIcon.bmp                                 1.3 kb
EXCEL.EXE                                    19.4 MB
Icon.ico                                     1.1 kb
Interop.ADODB.dll                            100 kb
Interop.JRO.dll                              9.0 kb
Interop.Microsoft.Office.Interop.Excel.dll   1.5 MB
LicenseAgreement.mht                         107 kb
Microsoft.Vbe.Interop.dll                    61.9 kb
office.dll                                   438 kb
PCarguments.dat                              26 B
PRAMacros                                    24.0 kb
print_config.exe                             56.0 kb
RISC_Original                                2.5 MB
rptactiveassessmentsummary.rpt               32.0 kb
rptactivecomponents.rpt                      32.0 kb
rptactivefunctions.rpt                       16.0 kb
rptactiveidp.rpt                             16.0 kb
rptaprisk Considerations.rpt                 16.0 kb
rptapriskconsiderations.rpt                  16.0 kb
rptassociatedactivecomponents.rpt            16.0 kb
rptcomponentclassification.rpt               48.0 kb
rptcomponents.rpt                            16.0 kb
rptdefenseindepth.rpt                        16.0 kb
rptexternalhazardsqrassessment.rpt           16.0 kb
rptfireqrassessment.rpt                      16.0 kb
rptfunctioncomponentmappings.rpt             32.0 kb
rptfunctions.rpt                             16.0 kb
rptidp.rpt                                   32.0 kb
rptidpresults.rpt                            32.0 kb
rptintegratedassessment.rpt                  16.0 kb
rptnsrhss.rpt                                16.0 kb
rptoperatoractions.rpt                       16.0 kb
rptpassiveassessmentsummary.rpt              16.0 kb
rptpassivecomp.rpt                           32.0 kb
rptpassivecompmap.rpt                        16.0 kb
rptpassiveidp.rpt                            16.0 kb
rptqrassessment.rpt                          48.0 kb
rptqualitativeresults.rpt                    16.0 kb
rptquantitativeresults.rpt                   16.0 kb
rptriscrankingsummary.rpt                    16.0 kb
rptsegmentcomponentmappings.rpt              32.0 kb
rptsegmentconsequence.rpt                    16.0 kb
rptsegmentconsequencecat.rpt                 16.0 kb
rptsegmentdefinition.rpt                     16.0 kb
rptsegmenteeia.rpt                           48.0 kb
rptsegmenteeiashut.rpt                       16.0 kb
rptsegmenteeiasub.rpt                        16.0 kb
rptseismicqrassessment.rpt                   16.0 kb
rptsensitivecomponents.rpt                   16.0 kb
rptshutdownqrassessment.rpt                  16.0 kb
rptsmaccountforuncertainty.rpt               16.0 kb
rptstressmodel.rpt                           32.0 kb
rptstressmodeldef.rpt                        16.0 kb
rptstressmodelseg.rpt                        16.0 kb
rptsubclassificationrisc.rpt                 16.0 kb
rptsubclassificationrisc1.rpt                16.0 kb
rptsubclassificationrisc2.rpt                16.0 kb
rptsubclassificationrisc3.rpt                16.0 kb
rptsubclassificationrisc4.rpt                16.0 kb
rptsubcomponentsmappings.rpt                 16.0 kb
rptsubcomponenttosegments.rpt                16.0 kb
rptsubfunctionsmappings.rpt                  16.0 kb
rptsubnsrlss.rpt                             16.0 kb
rptsubsegmentscmappings.rpt                  16.0 kb
rptsubsrhss.rpt                              16.0 kb
rptsubsrlss.rpt                              16.0 kb
rptsystems.rpt                               16.0 kb
rpttemplate.rpt                              16.0 kb

Currently, there may be a problem with installing Crystal Reports (a program required to display the reports provided in the program). If the Reports-Consolidated Reports option raises an exception when used, manually install Crystal Reports using the CRRedist2008_x86.msi file (do not install using the CRRedist2008_x64.msi file).

7.1.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.1.1. The actual results match the expected results, including the deviation described for Crystal Reports.

7.2 Verifying the Software License Agreement Functions

7.2.1 Test Case Description

Following installation, open the Tool for Risk-Informed Operations, Version 1.0 (also referred to as the Tool) by clicking the 1515_TRIO shortcut on either the desktop or the start menu. A Software License Agreement will appear. Verify the window displays the proper Software License Agreement information as shown in Appendix A.1. Verify it is impossible to enter the

19 CN-RAM Tool without checking I have read and agree to all terms and conditions and clicking the Accept button. Verify closing the window and/or hitting the Decline button (with or without checking the I have read and agree to all terms and conditions statement) will not allow other features of The Tool For Risk-Informed Operations to become enabled. Verify that all features of the program are enabled once the I have read and agree to all terms and conditions is checked and the Accept button is clicked Comparison of Test Results with Expected Results The expected results of this test case are described in Section The actual results match the expected results. 7.3 Verifying the Functionality of the Creation, Loading, Saving, and Closing of a Database Test Case Description To create a new database verify that a database isn t open in the Tool (i.e., the Project-Close Database option will be grayed out in the Project tab of the Tool and the top of the Tool window will not have a path to the file shown). Click Project-New Database and the Create New RISC Database As window should appear. The database may be saved with any named.mdb file. Save the database as Save_Test_1.mdb. In the Process Steps-Scope Definition-Plant Definition window, click the New Plant button. Create a plant/unit combination with a Plant Name of 1, a Unit named Unit Test, and a Description of Plant 1, Unit Test. Click the Add button. Verify the new plant/unit combination is inserted in the Plant Definition window and close the window. To save the database as a different name, click the Project-Save As window. Save the database as Save_Test_2.mdb. Verify that the plant/unit combination is still present in the Process Steps-Scope Definition-Plant Definition window. To close the database, click the Close Database option. 
Verify that the tab now has the Close Database option grayed out along with every other window selection option other than selection options in the Project tab (other than Close Database) and the Help tab. To load an existing database, verify that no database is currently open (the Close Database option will be grayed out in the Project tab of the Tool). Click the Project-Open Database and the Select Database window should appear. Select Save_Test_2.mdb to open the Save_Test_2.mdb database. Verify that the plant/unit combination is still present in the Process Steps-Scope Definition-Plant Definition window Comparison of Test Results with Expected Results The expected results of this test case are described in Section The actual results match the expected results and are shown in the database files Save_Test_1.mdb and Save_Test_2.mdb, which are electronically attached to this document (see Table 7-6 for a list of electronically attached files).
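The enable/disable behavior exercised in this test case (every process window grayed out until a database is open, with the Project and Help tabs always reachable) can be modeled as a small state machine. The following Python sketch is purely illustrative; the class, method, and option names are assumptions for illustration and are not the Tool's actual implementation.

```python
# Hypothetical model of the menu enable/disable behavior described in
# Section 7.3. All names here are illustrative assumptions; the Tool's
# real implementation is not part of this calculation note.

ALWAYS_ENABLED = {"New Database", "Open Database", "Help"}

class MenuModel:
    def __init__(self):
        self.open_path = None  # no database loaded at startup

    def open_database(self, path):
        self.open_path = path

    def close_database(self):
        self.open_path = None

    def is_enabled(self, option):
        # Options outside the always-available Project/Help set are
        # grayed out until a database is open.
        if option in ALWAYS_ENABLED:
            return True
        return self.open_path is not None

m = MenuModel()
assert not m.is_enabled("Close Database")    # grayed out at startup
m.open_database("Save_Test_1.mdb")
assert m.is_enabled("Close Database")
assert m.is_enabled("Plant Definition")
m.close_database()
assert not m.is_enabled("Plant Definition")  # grayed out again
```

The single `open_path` field is the entire state: the checks in the test case (option grayed out before open, enabled after open, grayed out again after close) map directly onto the three assertion groups above.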

7.4 Verifying the Functionality of the About Window

7.4.1 Test Case Description

To view the About window, click Help-About. This displays the About window; verify that the configuration control information is displayed. Click the OK button to exit the window. Because this test case was performed and independently verified on separate computers, the execution information displayed in Appendix A.2 will only match for the test case performed on hp-2ua445275t. The configuration control information should match on all computers on which this test case is performed.

7.4.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.4.1. The actual results match the expected results.

7.5 Verifying the Insertion of Plant Definitions

7.5.1 Test Case Description

To create a new database, verify that a database isn't open in the Tool (i.e., the Project-Close Database option will be grayed out in the Project tab and the top of the Tool window will not show a file path). If a database is open, use the Close Database option to close it. Click Project-New Database and the Create New RISC Database As window should appear. Save the database as Plant_Definition.mdb. Select Process Steps-Scope Definition-Plant Definition to open the Plant Definition window.

In this window, click the New Plant button and fill in the Plant Name as Plant 1, the Unit Name as Unit 1, and the Description as Description of Plant 1, Unit 1. Additionally, check the Make it Default box and click the Add button. The options selected are shown in Figure 7-1.

Figure 7-1: Add New Plant Window

A pop-up should appear stating that the plant/unit combination has been added. After accepting the notification, view the Plant Definition window to verify that Plant 1, Unit 1 is present. Create four (4) additional plant/unit combinations called Plant 1, Unit 2, Plant 2, Unit 1, Plant 2, Unit 3, and Plant 10, Unit 1. Figure 7-2 shows the expected appearance of the Plant Definition window.

Figure 7-2: Plant Definition Window After Creating Initial Plant/Unit Combinations

Click the checkbox next to Plant 2, Unit 3 and Plant 10, Unit 1 and hit the Delete button. A confirmation window titled Delete? will appear asking: Are you sure you want to delete the selected Plant(s); select Yes. Only Plant 1, Unit 1, Plant 1, Unit 2, and Plant 2, Unit 1 should remain in the window.

Click the Restore button. This displays the Restore Plants window, which lists the plant/unit combinations that have been deleted. Verify that both Plant 2, Unit 3 and Plant 10, Unit 1 are present in this window. Select Plant 2, Unit 3 and click the Go button. A warning window will appear that states: Are you sure you want to restore the selected Plant(s); click Yes. Verify that Plant 2, Unit 3 reappears in the Plant Definition window.

Click the New Plant button and create a plant/unit combination named Plant 10, Unit 1 (with a different description than the original Plant 10, Unit 1). Verify that an Error window appears stating: The changes you requested could not be completed because the Plant/Unit combination is already defined. Click the OK button. Now attempt to create a plant/unit combination named Plant 1, Unit 1 (with a different description than the original Plant 1, Unit 1). Verify that the same Error window appears. Click the OK button and then the Cancel button in the Add New Plant window.

Double-click on the section to the left of the checkmark for Plant 2, Unit 3; this opens the Edit Plant window. In this window, change the plant/unit combination to Plant 2, Unit 2. A

window called Update will be displayed that states: You are about to change the definition of Plant 2, Unit 3 to Plant 2, Unit 2. Proceed?; click Yes. In the Plant Definition window, the plant/unit combination should have changed from Plant 2, Unit 3 to Plant 2, Unit 2.

Double-click on the section to the left of the checkmark for Plant 1, Unit 1; this opens the Edit Plant window. In this window, change the plant/unit combination to Plant 2, Unit 1. A window called Update will be displayed that states: You are about to change the definition of Plant 1, Unit 1 to Plant 2, Unit 1. Proceed?; click Yes. An Error window should then be displayed that states: The changes you requested could not be completed because the Plant/Unit combination is already defined. Click the OK button and now attempt changing the plant/unit combination to Plant 10, Unit 1. The same messages as before should be displayed. The Plant Definition window should therefore resemble Figure 7-3.

Figure 7-3: Final View After Verification of the Plant Definition Window

The Plant Definition window has a search feature. Type Plant 1 into the text box next to the Search button and hit the Search button. Both Plant 1 entries should appear. Verify that four (4) results are shown when searching the keyword description. Finally, verify that two (2) results are shown when searching the keyword Unit 1. Verify that hitting the X button next to the Search key after the Unit 1 search clears the search results and all of the plant/unit combinations are displayed.
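The duplicate checks exercised above reduce to a uniqueness rule on the plant/unit pair, with the description excluded from the comparison. A minimal Python sketch of that rule follows; the function name and data layout are assumptions for illustration, not the Tool's code.

```python
# Hypothetical sketch of the uniqueness rule exercised in Section 7.5:
# a plant/unit combination may only be defined once, while the
# description is free-form and plays no part in the comparison.

def try_add_plant(plants, plant, unit, description):
    """Return the error message on a duplicate, else record the entry."""
    if any(p == plant and u == unit for p, u, _ in plants):
        return ("The changes you requested could not be completed "
                "because the Plant/Unit combination is already defined.")
    plants.append((plant, unit, description))
    return None

plants = []
assert try_add_plant(plants, "Plant 1", "Unit 1", "original") is None
# The same combination with a different description is still rejected.
err = try_add_plant(plants, "Plant 1", "Unit 1", "different description")
assert err is not None
assert len(plants) == 1
```

Note that editing an existing entry (the Edit Plant window) is subject to the same rule: the updated pair is checked against every other defined combination before the change is committed.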

7.5.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.5.1. The actual results match the expected results and are shown in the database file Plant_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

7.6 Verifying the Creation of System Definitions

7.6.1 Test Case Description

Select Project-Save As and save the database as System_Definition.mdb. Select Process Steps-Scope Definition-System Definition to open the System Definition window. Verify the Select Plant and Unit dropdown is set to the default plant: Plant 1, Unit 1.

Add a new system by clicking the New System button. This opens a window called Add New System with the Select Plant and Unit already defined as Plant 1, Unit 1. Create a System ID named TS with a System Name of Test System. Click the Add button; a confirmation dialog stating System TS is successfully added should appear; click OK to confirm the addition of the system. Similarly, add three (3) more systems with System IDs TT2, TT3, and TT4 and System Names System Test 2, System Test 3, and System Test 4, respectively. Similar confirmation dialogs will appear; click OK to return to the main screen and view the System IDs.

Under the Select Plant and Unit dropdown menu, select Plant 1, Unit 2. A window named Default? will appear asking You want to make it as default plant; select No. Click the New System button. This opens a window called Add New System with the Select Plant and Unit already defined as Plant 1, Unit 2. Create a System ID named TS with a System Name of Test System. A confirmation dialog stating System TS is successfully added should appear; click OK to confirm the addition of the system.

Click the checkmark next to this system and click the Delete button. A window titled Delete? will appear that states Are you sure you want to delete the selected System(s); click the Yes button. No System IDs should be present for Plant 1, Unit 2.

In the Select Plant and Unit dropdown, select Plant 1, Unit 1. Verify that no window named Default? appears, since the selection is already the default plant/unit combination. Verify that all of the System IDs created for this plant/unit combination still appear. Click the checkmark next to System IDs TT3 and TT4 and hit the Delete button. A window titled Delete? will appear that states Are you sure you want to delete the selected System(s); click the Yes button. System IDs TS and TT2 should remain in the main window, while System IDs TT3 and TT4 should no longer be present.

Click the Restore button and the Restore Systems window should appear. Click the checkbox next to System ID TT3 and press the Go button next to the Select the records to restore text. A window should appear named Undo? which states Are you sure you want to restore the

selected System(s); click the Yes button. The main System Definition window should have System ID TT3 reappear.

Click the New System button and add a System ID named TT4 with a System Name of Test, then click the Add button. A window titled Duplicate Data should appear that states Duplicate System ID; click the OK button. Attempt adding a System ID named TS with a System Name of Test and click the Add button. A window titled Duplicate Data should appear that states Duplicate System ID; click the OK button and click Cancel in the Add New System window.

Double-click to the left of the checkmark box of System ID TT3 and a window named Edit System should appear. Change the System ID to TT4 and the System Name to Test. Click the Update button and verify that the Duplicate Data pop-up is presented. Next, change System ID TT3 to System ID TS with the System Name Test. Click the Update button and verify that the Duplicate Data pop-up is presented. Finally, change System ID TT3 to TT5 and the System Name to System Test 5. Click the Update button and a window called Update should appear that states: You are about to change the definition of TT3, System Test 3 to TT5, System Test 5. Proceed?; click the Yes button. A new window will appear that states System TT3 Successfully Modified; click the OK button. In the main System Definition window, TT3 will be renamed to TT5. Click the Restore button to verify that only TT4 is present in the Restore Systems window, then click the Cancel button.

7.6.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.6.1. The actual results mostly match the expected results and are shown in the database file System_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). Currently, the Tool for Risk-Informed Operations allows a System ID to be named the same as another System ID. This occurs when the update feature is used for a System ID: when updating a System ID, there is no Duplicate Data pop-up if that System ID name is already located within the main System Definition window or the Restore window. An example of this occurrence is shown in System_Definition_Error_TS.mdb, where TS appears twice. The V&V may still be continued by creating a new System_Definition.mdb file that only modifies System ID TT3 directly to TT5; this is the database with which the V&V continues.

7.7 Verifying the Copying of Customer Components Window

7.7.1 Test Case Description

Select Project-Save As and save the database as Copy_Customer_Components.mdb. Select Tools-Copy Customer Component List to open the Copy Customer Component List window.

26 CN-RAM Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Select Excel Spreadsheet button and use Windows Explorer to navigate to the Customer_Component_List.xls spreadsheet; in the Worksheet dropdown select Text. In the All Excel Headings section, click the Component ID item, and press the > button, next select ComponentID and press the Match button; repeat this for Component Description, Component Type, and Safety Class. Press the Transfer button, followed by Yes in the Transfer popup window. Verify that this is successful when the Success window appears. Click the Reset button. Verify that nothing in the window can be selected without selecting the Select Plant button. Close the Copy Customer Component List window and open the Assign Component Functionality window in the Process Steps-Scope Definition-Assign Component Functionality. Verify that the Select Plant and Unit is automatically selected as Plant 1, Unit 1 since it is the default plant. In the Select System pulldown menu, select Test System (TS). Verify that the components that were transferred from Customer_Component_List.xls are present. Close the Assign Component Functionality window. Click Tools-Copy Customer Component List. A message will appear stating Warning: Not all components have been assigned a functionality; click OK. Click Tools-Copy Customer Component List to open the Copy Customer Component List window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Delete button. 
A window called Delete should appear stating Are you sure you want to delete all components from Plant 1, Unit 1, Test System (TS); select the Yes option. A window called Success should then appear stating All components from Plant 1, Unit 1, Test System (TS) successfully deleted; click the OK button to close the window. Close the Copy Customer Component List window and open the Assign Component Functionality window in Process Steps-Scope Definition-Assign Component Functionality. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that no components are present. Close the Assign Component Functionality window. Click Tools-Copy Customer Component List to open the Copy Customer Component List window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Follow the same process as described previously for uploading Customer_Component_List.xls and verifying that the components are present in the Assign Component Functionality window.
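The heading-matching step exercised above can be sketched in Python as a hedged illustration (names assumed, not the tool's code): the Transfer action is only valid once every required database heading has been matched to an Excel heading.

```python
# The four database headings named in the test steps above.
REQUIRED_DB_HEADINGS = ["ComponentID", "Component Description",
                        "Component Type", "Safety Class"]

def transfer_enabled(matches):
    """Transfer should be allowed only when every database heading has
    been matched to an Excel spreadsheet heading (illustrative rule)."""
    return all(matches.get(db) for db in REQUIRED_DB_HEADINGS)

matches = {"ComponentID": "Component ID"}
assert not transfer_enabled(matches)          # partial mapping: Transfer blocked
matches.update({"Component Description": "Component Description",
                "Component Type": "Component Type",
                "Safety Class": "Safety Class"})
assert transfer_enabled(matches)              # all four matched: Transfer allowed
```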

7.7.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.7.1. The actual results mostly match the expected results and are shown in the database file Copy_Customer_Components.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). Clicking the Reset button after a dataset has been placed in the Matches section does not disable the Transfer and Delete buttons. This will not affect the V&V, since clicking the Transfer button or the Delete button before the necessary information is provided is avoidable. The Transfer button requires all of the Database Headings to be in the Matches section. The Delete button requires the Select System to be defined.

7.8 Verifying the Add Customer Component List Window

7.8.1 Test Case Description

Click the Project-Save As window. Save the database as Add_Customer_Components.mdb. Select Tools-Add Customer Component List to open the Add Customer Component List window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the checkmark next to TAC-0001. Click the Delete button and click Yes in the Delete? confirmation window. Verify that TAC-0001 is no longer present in the main window. Click the New Component button. Type in the Component ID as TAC-0001 (- is a minus sign and not a dash; the other inputs in the window do not matter) and click the Add button. Verify that a window appears stating Duplicate Component Id; press the OK button. Next, type in the Component ID as TAC; verify that a window appears stating Duplicate Component Id and press the OK button. Finally, type in the Component ID as TST-0001 and click the Add button. Verify TST-0001 appears in the main window. Utilize the Delete button to delete TST-0001. Click the Restore button. Verify that TAC-0001 and TST-0001 are present in the window. 
Select TAC-0001 and click the Go button. Verify a confirmation window appears and click the Yes button. Verify that TAC-0001 is present in the main window. Enter 000 in the Search field and press the Search button. Verify that all ten (10) fields containing 000 appear. Click the X button to clear the Search. Enter SR in the Search field and press the Search button. Verify that all of the fields containing SR appear (all odd Component IDs). Click the X button to clear the Search.

7.8.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.8.1. The actual results mostly match the expected results and are shown in the database file

Add_Customer_Components.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). It was discovered that the Search feature does not function in the Safety Class column. This will not affect further analysis of the V&V (further analysis does not require the Search feature in the Safety Class column). The Search feature functions correctly for all other columns.

7.9 Verifying the Picklist-Safety Window Functions

7.9.1 Test Case Description

Click the Project-Save As window. Save the database as Picklists_Safety.mdb. Select Picklists-Safety to open the Plant Safety String Picklist window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1. Verify that the Key has options SR and NSR. Match the Key option SR with the Value option SR and the Key option NSR with the Value option NSR (the Value refers to the safety classes defined when importing Component IDs; currently only SR and NSR values should be utilized). Close the Safety window. Reopen the Safety window and verify the two matches are present.

7.9.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.9.1. The actual results match the expected results and are shown in the database file Picklists_Safety.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

7.10 Verifying the Assign Component Functionality Window Functions

7.10.1 Test Case Description

Click the Project-Save As window. Save the database as Component_Functionality.mdb. Select Process Steps-Scope Definition-Assign Component Functionality to open the Assign Component Functionality window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. 
Click an entire row and select every combination of options in Assign Functionality and Is component a Support/Hanger/Snubber?. Verify that Both and Passive functionality auto-assigns Yes to Is component a Support/Hanger/Snubber? (this may be modified to No by the user). Verify that the Commit Changes button properly updates for every combination by closing and reopening the window after each update (when selected, the Commit Changes button will display a window titled Success with a text description of Saved Successfully that can be closed with the OK button). Note that the Support/Hanger/Snubber column will only be automatically updated when the user first selects either Passive or Both functionality for the component and the Is this component a Support/Hanger/Snubber? question has not already been answered. This is the expected functionality.

Additionally, check the checkbox next to Multiple Assign; verify that checkboxes appear next to every component. Check several checkboxes next to the components and then choose an option in Assign Functionality; a window called Multiple Assign? will appear, stating Are you sure you want to Assign the selected Component(s); press the Yes button. Verify that the same message appears for the Is component a Support/Hanger/Snubber? option. Click the Commit Changes button; a window titled Success with a description of Saved Successfully should appear. Click the OK button to close the window. Close the window with not all Component IDs assigned; verify that a Warning window appears stating Warning: Not all components have been assigned a functionality. Click the OK button to continue closing the window. Reopen the Assign Component Functionality window. Verify that the Filter by functionality pulldown tab functions for all five (5) filtering options. Make a change to either Assign Functionality or Is component a Support/Hanger/Snubber? for a component without pressing the Commit Changes button. Select a different component after making the change. Close the window; a pop-up window should appear titled Save Changes with the text Would you like to save changes?. Select the Yes option and re-open the Assign Functionality window to verify that the component has been modified. Repeat the same process, except select the No option and verify that the component has not been updated. Modify all of the Component IDs to the determinations in Table 7-3. Close the window and verify that all of the components have been properly updated. 
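The determinations in Table 7-3 follow a simple odd/even rule over the numeric suffix of each Component ID. A hypothetical Python sketch of that rule (the helper name is illustrative, not part of the tool):

```python
def expected_assignment(component_id):
    """Return (functionality, support/hanger/snubber answer) for a TAC
    component per the Table 7-3 scheme: the numeric range sets the
    functionality, odd suffixes answer Yes and even suffixes answer No."""
    n = int(component_id.split("-")[1])
    if 1 <= n <= 40:
        functionality = "Both"
    elif 41 <= n <= 60:
        functionality = "Passive"
    else:  # 61 through 823
        functionality = "Active"
    support = "Yes" if n % 2 == 1 else "No"
    return functionality, support

assert expected_assignment("TAC-0001") == ("Both", "Yes")
assert expected_assignment("TAC-0042") == ("Passive", "No")
assert expected_assignment("TAC-0823") == ("Active", "Yes")
```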
Table 7-3: Component Functionality Assignments

Component ID         | Functionality | Support/Hanger/Snubber
TAC-0001 to TAC-0040 | Both          | Odds = Yes, Evens = No
TAC-0041 to TAC-0060 | Passive       | Odds = Yes, Evens = No
TAC-0061 to TAC-0823 | Active        | Odds = Yes, Evens = No

7.10.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.10.1. The actual results match the expected results and are shown in the database file Component_Functionality.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). The Assign Component Functionality window may incorrectly save a Component ID's support, hanger, or snubber information. A workaround was established in Reference 4, Section 4.3. It can be verified that the information has been correctly saved by re-opening the window. If the information has changed, change it back to the correct answer, hit the Commit

Changes button, and verify that it has been correctly saved by closing and then re-opening the window. This workaround allows the V&V to continue without future test cases being affected.

7.11 Verifying the Passive Stress Model Definition Window

7.11.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Stress_Model_Definition.mdb. Select Process Steps-Passive Component Assessment-Stress Model Definition to open the Stress Model Definition window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the New Stress Model button. Enter the name TSM1 in the Stress Model field, and add Stress Model 1 to the Stress Model Description field. Ensure that the Stress Model ID and Stress Model Description are now presented on the Stress Model Definition window, matching the values entered. Using the same process as above, create the following Stress Models: TSM2, TSM3, TSM4, TSM5, and TSM7. The Stress Model Descriptions, along with the stress models to input, are shown in Appendix A.4. Click the check box adjacent to the newly created TSM1 and select the Delete button at the bottom of the screen; confirm that you want to delete TSM1 using the Yes button. Ensure that TSM1 is now absent from the presented list. Click the New Stress Model button. Enter the name TSM1 in the Stress Model field, and add Test to the Stress Model Description field. Verify that an error appears preventing the insertion of a duplicate stress model. Next, enter the name TSM3 in the Stress Model field, and add Test to the Stress Model Description field. Verify that an error appears preventing the insertion of a duplicate stress model. Click to the left of the box for TSM4. Edit TSM4 to TSM1 and click the Update button. Verify that an update window appears and click the Yes button. Verify that an error will occur preventing the change because TSM1 is already present. 
Next, modify TSM4 to TSM6. Verify that an update window appears and click the Yes button. Verify that an error will occur preventing the change because TSM6 is already present. Click the Restore button at the bottom of the screen and select TSM1, which was recently deleted. Click the check box adjacent to TSM1 and click the Go button at the bottom of the window. When the Undo? window is presented, select the Yes option; ensure that the restored stress model TSM1 reappears in the main window.

7.11.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.11.1. The actual results mostly match the expected results with one exception (described below) and are shown in the

database file Passive_Stress_Model_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). The exception noted in this test case is that the Restore function (and window) does not function correctly on the Stress Model Definition window. The Restore not functioning will not affect the results of the analysis. Additionally, stress models of the same name are allowed to be created (instead of an error appearing to prevent it). Both of these discrepancies are shown in the database named Passive_Stress_Model_Definition_Error.mdb. To ensure these discrepancies do not affect later verification, the database file named Passive_Stress_Model_Definition.mdb was created without deleting TSM1 or renaming stress models; this is the database with which the V&V continues and it will be utilized for the next portion of the V&V.

7.12 Verifying the Picklists-Drawing Window Functions

7.12.1 Test Case Description

Click the Project-Save As window. Save the database as Picklists_Drawing.mdb. Select Picklists-Drawing to open the Drawing Number Picklist window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1. Click the New Drawing button. An Add New Drawing window should open. Verify that the Revision Date shown is automatically selected as the date on the computer. Modify the Revision Date utilizing the calendar picture button located to the right of the date. Select the date as Wednesday, August 24, 2016. The date should appear on the screen. Continue with creating the first drawing; after this, add multiple new drawings with the information shown in Table 7-4. 
Table 7-4: New Drawings in Picklist-Drawing Window

Drawing Number | Drawing Type | Drawing Sheet | Drawing Revision | Revision Date
789            | Type 1       | Sheet 1       | 0                | Wednesday, August 24, 2016
790            | Type 2       | Sheet 1       | 4                | Friday, June 28,
791            | Type 1       | Sheet 4       | 0                | Thursday, March 28,
792            | Type 4       | Sheet         |                  | Tuesday, June 27, 2000

Select the Drawing Numbers 789 and 790 and click the Delete button. Attempt to add another Drawing Number called 789 (with no other information other than the Revision Date being today). An Error window should appear that states: Error in update attempt:

The changes you requested to the table were not successful because they would create duplicate values in the index, primary key, or relationship. Change the data in the field or fields that contain duplicate data, remove the index, or redefine the index to permit duplicate entries and try again. Click the OK button to close the Error window. Attempt to create another Drawing Number called 791. Verify the same Error window appears. Click the OK button to close the Error window, then the Cancel button on the Add New Drawing window. Click the Restore button, select the Drawing Number 789 to restore, and hit the Go button. A window titled Undo? will appear asking the user to verify that they want to restore the selected Drawing(s); click the Yes button. Drawing Number 789 will be restored in the main Drawing Number Picklist window. Click the area to the left of the checkbox to modify Drawing Number 789. Modify the Drawing Number to 790. A warning message will appear; select Yes to the warning message and verify that an error appears because 790 is already present in the Restore window. Now, modify the Drawing Number to 791. A warning message will appear; select Yes to the warning message and verify that an error appears because 791 is already present. Finally, change the information to the following:

Drawing Number: 793
Drawing Type: Type 5
Drawing Sheet: 2
Drawing Revision: 1
Revision Date: Thursday, August 25, 2016

A warning message will appear; select Yes to the warning message and verify that the information has been changed. Verify that Drawing Number 789 is no longer present and is replaced with Drawing Number 793. Click the New Drawing button and enter the information in Table 7-4 for a new Drawing Number 789. 
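The Revision Date strings in this window pair a weekday with a calendar date. A small Python sketch (stdlib datetime; the function name is illustrative) can check that such a string is internally consistent, using the one fully specified date from Table 7-4:

```python
from datetime import datetime

def weekday_consistent(date_string):
    """Check that a 'Weekday, Month Day, Year' string names the weekday
    that actually falls on that calendar date."""
    claimed_day, rest = date_string.split(", ", 1)
    actual = datetime.strptime(rest, "%B %d, %Y").strftime("%A")
    return claimed_day == actual

# June 27, 2000 really was a Tuesday, per the last row of Table 7-4.
assert weekday_consistent("Tuesday, June 27, 2000")
assert not weekday_consistent("Monday, June 27, 2000")
```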
Verify that Drawing Number 789 appears.

7.12.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.12.1. The actual results match the expected results and are shown in the database file Picklists_Drawing.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

7.13 Verifying the Passive Segment Definition Window

7.13.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Segment_Definition.mdb. Select Process Steps-Passive Component Assessment-Segment Definition to open the Segment Definition window.

Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the New Segment button at the bottom of the screen, and ensure that the Plant Name and System Name fields match the selected Plant 1, Unit 1 and Test System (TS), respectively. In the SegmentID and Segment Description fields, enter TS-001 and TS-001 Description, respectively; in the following Comment, Piping Classification, and Containment Performance Impact fields, enter TS-001 Comment, TS-001 Piping Classification, and TS-001 Containment Impact, respectively. In the Stress Model dropdown, select TSM1. Click the Add button. Verify TS-001 appears in the main window. Click the Batch Add button, and ensure that Plant is Plant 1, Unit 1 and System ID is Test System (TS); additionally, verify that the Highest Current Segment Number field reads TS-001. In the Number of First Segment to be Added field enter 002, and in the Number of Segments to be Added field enter 50. Click the Add Segment Identifiers button. Ensure that segments TS-001 through TS-051 are now visible in the listing. TS-002 through TS-051 should have Batch Added-Update Necessary Fields present in the Segment Description column. Click the pencil and notepad icon in the Action column for TS-002. Modify TS-002 with the information shown in Appendix A.5. For the stress model TSM6, choose the New Stress Model option in the Stress Model selection and type in the Stress Model ID of TSM6 and the Stress Model Description of Stress Model 6 (refer to Appendix A.4). Modify TS-002 through TS-051 with the pencil and notepad icon in the Action column to match the information shown in Appendix A.5. Select the segment named TS-001 by clicking on the space to the left of the checkbox. Once selected, the row should be highlighted and a right-facing arrow should be present on the far left. 
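The Batch Add numbering exercised above (first segment number 002, fifty segments added after the existing TS-001) can be sketched in Python; the function name is illustrative only.

```python
def batch_segment_ids(system_id, first_number, count):
    """Generate zero-padded segment identifiers, mirroring the Batch Add
    step: first segment 002 with fifty segments yields TS-002 .. TS-051."""
    return [f"{system_id}-{n:03d}" for n in range(first_number, first_number + count)]

ids = batch_segment_ids("TS", 2, 50)
assert ids[0] == "TS-002"
assert ids[-1] == "TS-051"
assert len(ids) == 50
```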
Click the Other tab on the upper left of the screen, and ensure the upper left-hand side of the screen has the text Segment Definitions for Plant 1, Unit 1: Test System (TS). Verify that TS-001 is listed as the Segment ID and that the Description reads TS-001 Description. Click the Update Other Details button. In the Drawing Number dropdown menu, verify that Drawing Numbers 789, 791, 792, and 793 appear (refer to Section 7.12). Enter 123 in the first row of the Segment Line Size table, 456 in the first row of the Segment Line Number table, and A1 in the Grid Location, and select 789 in the first row of the Drawing Number table. Press the Ok button, and ensure that 123, 456, A1, and 789 are displayed correctly. Click the Delete button and select the check box next to the 123, 456, and A1/789 records. Click the Ok button when prompted. Ensure that the records are no longer displayed in the Other tab. Click the Restore button on the bottom of the screen and select the recently deleted records; press the Ok button to restore this information. Ensure that the records reappear in the Other tab.

In the Segment Definition tab, select the checkbox for TS-001. Click the Delete button at the bottom of the screen, and click Yes on the window that is presented. Ensure TS-001 is no longer listed in the Segment Definition window. Click the New Segment button. Create a Segment ID named TS-001 (Segment Description, etc. do not need to be added). Click the Add button. Verify that an error window appears stating that TS-001 has already been defined. Create a Segment ID named TS-002 (Segment Description, etc. do not need to be added). Click the Add button. Verify that an error window appears stating that TS-002 has already been defined. Click the pencil and notepad icon in the Action column for Segment ID TS-003. Change the Segment ID to TS-001 and click Update. Verify that an error appears stating that TS-001 has already been defined. Change the Segment ID to TS-002 and click Update. Verify that an error appears stating that TS-002 has already been defined. Press the Restore button at the bottom of the screen, and click the checkbox adjacent to the TS-001 record. Press the Go button at the bottom of the screen, and click Yes when prompted; ensure that the previously deleted segment TS-001 is now listed in the Segment Definition window.

7.13.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.13.1. The actual results mostly match the expected results and are shown in the database file Passive_Segment_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). In the Other tab, the Description reads the Segment ID name instead of the Description of the Segment ID shown in the Segment Definition tab. This is a textual display discrepancy and will not affect the results. Currently, creating or editing a Segment ID will not check whether the Segment ID is already present in the main window or the Restore window. 
This will not affect the continuation of the V&V. Passive_Segment_Definition_Error.mdb shows the error of creating a new Segment ID, and of editing an existing Segment ID, with a name that was already present. The Passive_Segment_Definition.mdb file skips these steps (this is the database with which the V&V continues). In the attached V&V test cases, an extra TS-001 was created; once noticed, it was renamed to TS-999 and then deleted in a later section.

7.14 Verifying the Treatment-Picklist Window Functions

7.14.1 Test Case Description

Select Picklists-Treatment to open the Treatment Picklist window. Verify that the information in Appendix A.3 is present. Close the window.

7.14.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.14.1. The actual results match the expected results.

7.15 Verifying the Passive Consequence Definitions Window

7.15.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Consequence_Definition.mdb. Select Process Steps-Passive Component Assessment-Consequence Definition to open the Consequence Definition window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify the Segment Consequences table has the Segment IDs with the correct Segment Description and Comment fields that were defined in the Segment Definition window (refer to Section 7.13). In the Consequence Definitions table, verify that the Treatment column has all of the selections shown in Appendix A.3. In the Consequence Definitions table, enter TC10 in the Conseq ID field and CD 7 in the Consequence Description field; next, select IE in the Treatment column and Conseq in the ConseqCode column. Appendix A.6 displays all of the Consequence IDs and their supporting information. Insert the rest of the Consequence IDs shown in Appendix A.6. In the Operator Actions table, create a new Operator Action ID named OA-001; in the OA Description column enter OA-001 Description. In the Segment Consequences table, highlight Segment ID TS-001 (select the leftmost cell of the row). In the Consequence Definitions table, highlight Conseq ID TC10. Press the Assign Consequence to a Segment button and verify that the Conseq ID field of TS-001 reads TC10. In the Segment Consequences table, highlight Segment ID TS-001 (select the leftmost cell of the row). In the Operator Actions table, highlight Operator ID OA-001. Press the Assign Consequence to a Segment button and verify that the Operator ID field of TS-001 reads OA-001. 
In the Segment Consequences table, highlight Segment ID TS-001 (select the leftmost cell of the row). Click the Clear Operator Action button below the table. Verify that OA-001 is no longer assigned to TS-001. Click the Clear Consequence button below the table. A warning message will appear confirming the action; click Yes to proceed. Verify that TC10 is no longer assigned to TS-001. Reapply TC10 and OA-001 to TS-001. Click the Clear Consequence button below the table. Verify that TC10 and OA-001 are no longer assigned to TS-001.

Reapply TC10 and OA-001 to TS-001, along with assigning the other Segment IDs their appropriate Conseq ID (refer to Appendix A.7).

7.15.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.15.1. The actual results match the expected results and are shown in the database file Passive_Consequence_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

7.16 Verifying the Passive Map Components Window

7.16.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Map_Components.mdb. Select Process Steps-Passive Component Assessment-Map Components to open the Map Components to Segment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Assign Component ID TAC-0001 to Segment ID TS-001 and Segment ID TS-051. To accomplish this, highlight TAC-0001. Select TS-001 in the Passive Segments section of the window. Verify that the correct Description appears (defined in the Segment Definition window). Click Assign Segment. Verify that the Segment ID column in the Map Components to Segments window has been properly modified for TAC-0001. Select TS-051 in the Passive Segments section of the window. Verify that the correct Description appears (defined in the Segment Definition window). Click Assign Segment. Verify that the Segment ID column in the Map Components to Segment window has been properly modified for TAC-0001 (two (2) rows should be present, with one row having TS-001 and the other having TS-051). Additionally, assign Component ID TAC-0002 to Segment ID TS-001. Assign TAC-0003 to TS-003 and TS-004. Utilize the Filter Components feature to see filtering By Assignment and/or By Type. Verify all selection options for filtering components are working as intended. Select the TAC-0003/TS-003 row. Click the Unassign Segment button. 
Verify the TAC-0003/TS-003 row has disappeared and only TAC-0003/TS-004 is present. Select the TAC-0003/TS-004 row. Click the Unassign Segment button. Verify the TAC-0003/TS-004 row has disappeared and has been replaced with a TAC-0003 row with no Segment ID assigned to it. Utilize the Unassign Segment button to unassign all Segment IDs from the Component IDs. Input the Component ID mapping information that is shown in Appendix A.8. Map multiple components at the same time to verify the multiple-selection feature is functioning. Select the Segment ID prior to selecting the Component IDs that will be assigned to it; selecting a new Segment ID unselects the selected Component IDs.
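The mapping behavior described in this test case (a component may be assigned to several segments, and unassigning its last segment leaves the component listed with no Segment ID) can be sketched in Python; the helper names and the dictionary representation are illustrative assumptions, not the tool's implementation.

```python
def assign(mapping, component, segment):
    """Map a component to a segment; a component may carry several rows,
    one per assigned segment (e.g. TAC-0001 -> TS-001 and TS-051)."""
    mapping.setdefault(component, set()).add(segment)

def unassign(mapping, component, segment):
    """Remove one component/segment row; when the last segment is removed,
    the component remains listed with no Segment ID assigned."""
    mapping.get(component, set()).discard(segment)

mapping = {}
assign(mapping, "TAC-0003", "TS-003")
assign(mapping, "TAC-0003", "TS-004")
unassign(mapping, "TAC-0003", "TS-003")
assert mapping["TAC-0003"] == {"TS-004"}
unassign(mapping, "TAC-0003", "TS-004")
assert mapping["TAC-0003"] == set()   # component row remains, unassigned
```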

7.16.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.16.1. The actual results match the expected results and are shown in the database file Passive_Map_Components.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

7.17 Verifying the Copy PRA Data Window

7.17.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Copy_PRA_Data.mdb. Select Tools-Copy PRA Data to open the Copy PRA Data window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1. Click the Select Excel Spreadsheet button. A Windows Explorer window will appear titled Select Spreadsheet. Select Passive_PRA_Runs.xls. Verify that the Passive_PRA_Runs.xls location appears in the textbox below the Select Excel Spreadsheet button. In the Select Sheet Name pulldown menu, select Main. In the All Excel Headings section, click the PRA Run ID item and press the > button to move PRA Run ID to the Selected Excel Headings section. Next, select PRARunID in the Database Heading section and press the Match button; repeat this for PRA CCDP/CCDF, PRA CLERP/CLERF, SurrogateComponents, and PRAComment, respectively. Press the Transfer button, followed by Yes in the Transfer popup window. Verify that the Success window appears and click the OK button. Select Process Steps-Passive Component Assessment-Quantitative Assessment to open the Perform Passive Quantitative Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that the PRA Runs section in the Passive PRA Runs and Consequence Mapping tab displays the imported PRA Run IDs and their relevant data. Select Tools-Copy PRA Data to open the Copy PRA Data window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1. 
Click the Delete button; select the Yes answer in the confirmation pop-up window. Select Process Steps-Passive Component Assessment-Quantitative Assessment to open the Perform Passive Quantitative Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that the PRA Runs section in the Passive PRA Runs and Consequence Mapping tab does not have any PRA Run IDs present.

Select Tools-Copy PRA Data to open the Copy PRA Data window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1. Reimport the data from Passive_PRA_Runs.xls.

7.17.2 Comparison of Test Results with Expected Results

The expected results of this test case are described in Section 7.17.1. The actual results mostly match the expected results and are shown in the database file Passive_Copy_PRA_Data.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). Currently, the Excel file to be imported must have the headers start on row four (4), with the data starting on row five (5). Columns C and D must be filled in with placeholder data. Also, rows one (1) through three (3) must be filled with placeholder data in the columns that will be matched. This Excel file importation will cause a log error file, but the data will be correctly imported, as verified in the Perform Passive Quantitative Assessment window. Refer to Passive_PRA_Runs.xls for an example Excel file.

7.18 Verifying the Passive Quantitative Assessment Window

7.18.1 Test Case Description

Click the Project-Save As window. Save the database as Passive_Quantitative.mdb. Select Process Steps-Passive Component Assessment-Quantitative Assessment to open the Perform Passive Quantitative Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that the PRA Runs table in the Passive PRA Runs and Consequence Mapping tab displays the imported PRA Run IDs and their relevant data; refer to Appendix A.9. Scroll down to the bottom of the PRA Runs table. There will be a blank space present. 
Insert the following data on one line:

PRA Run ID: 26
Surrogate Components: Surrogate Components 26
CCDF/CCDP: 2.5E-05
CLERF/CLERP: 2.5E-06
PRA Comment: PRA Comment 26

After entering the values, press the Enter key. Verify that the information is properly displayed. Next, highlight the PRA Run ID 26 row and click the X button. A warning message will appear asking to confirm the deletion; click Yes. Verify the row has been successfully deleted.

Highlight PRA Run ID 1 and highlight Segment ID TS-001. Click the Assign PRA Run button. Verify PRA Run ID 1 has been successfully assigned to Segment ID TS-001. Click Segment ID TS-001 and click the Unassign PRA Run button; verify PRA Run ID 1 is no longer present in the row containing TS-001.

Section 5.5 of Reference 4 describes the results for determining High, Medium, and Low for the Consequence Category found via the Evaluate button, along with a current workaround for adding Units. Click the table picture button to the right of the Clear button, below the Consequence PRA Run Mapping table. The Units window will appear. Create a new unit name called Annuall. Go to the next line without entering the Unit Value (In Years) for the Annuall row. Correctly enter the Units and Unit Values (In Years) of Table 7-5, except for the Annuall row. Press the OK button. An error window named Uncommitted Selection will appear; click the OK button. Correct Annuall to Annually and insert the Unit Value (In Years) for the row. Click the OK button. Verify all of the Units in Table 7-5 appear, along with Continuous, in the dropdown menu in Consequence PRA Run Mapping. (To verify the dropdown menu contains all the items, select an item with a Treatment of SYS; items with IE or LOCA Treatment values will correctly not allow the dropdown menu to be used.) Each of the Unit Values (In Years) will be checked later.

Table 7-5: Units (Continuous doesn't appear in this window)

  Unit            Unit Value (In Years)
  Annually        1
  Bi-Weekly
  Monthly
  Quarterly       0.25
  Refueling       1.5
  Semi-Annually   0.5
  Weekly
  Test1
  Test2

Assign PRA Run IDs to the relevant Segment ID (and the relevant Consequence ID if the Segment ID has multiple Consequence IDs assigned) per the values in Appendix A.10. Verify that Continuous is automatically selected as the Unit for Initiating Event (IE) and Loss of Coolant Accident (LOCA) treatments. For System (SYS) treatments, select the relevant Unit from the drop-down menu.
For cases in Appendix A.10 that state [none], do not assign a PRA Run ID to them.
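The Unit selected for each mapping determines the exposure time (in years) applied to the PRA run. A minimal sketch of the lookup, using only the Unit Values recoverable from Table 7-5; treating Continuous as one full year of exposure is an illustrative assumption here, not a value documented by the tool.

```python
# Sketch: Unit name -> exposure time in years, per Table 7-5.
# Only values recoverable from the table are listed.
UNIT_VALUE_YEARS = {
    "Annually": 1.0,
    "Quarterly": 0.25,
    "Refueling": 1.5,
    "Semi-Annually": 0.5,
}

def unit_years(treatment, unit=None):
    """IE and LOCA treatments are fixed to Continuous; SYS treatments
    use the Unit selected from the dropdown."""
    if treatment in ("IE", "LOCA"):
        return 1.0  # assumed: continuous exposure over a full year
    return UNIT_VALUE_YEARS[unit]

print(unit_years("SYS", "Quarterly"))  # 0.25
print(unit_years("IE"))                # 1.0
```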

Verify the calculated CCDP=1.01E-04 and CLERP=1.01E-05 for the SYS with Test1 as the Unit. Verify the calculated CCDP=1.01E-04 and CLERP=0 for the SYS with Test2 as the Unit. Open the Evaluate Passive Consequence Category tab. Reselect Test System (TS) from the Select System dropdown. Click the Evaluate All button. Verify that the Consequence Category column matches the Quantitative Consequence Category column in Appendix A.10 and that the Consequence Basis column reads Quantitative. For test cases with [none] as the answer, the entries will remain blank. Additionally, when those blank entries are evaluated, the No Evaluation pop-up window appears to notify the user that no evaluation has occurred; click the OK button to continue with the evaluations of the other Segment IDs.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results with one exception and are shown in the database file Passive_Quantitative.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). Currently, TS-050 and TS-051 will have a High Consequence Category (instead of a None Consequence Category) when evaluated, even though they have a NONE and NOT-USED Treatment, respectively (this occurs if they are assigned PRA runs). To continue with the V&V, TS-050 and TS-051 were highlighted and the Clear Evaluation button was clicked; their PRA Runs were then unassigned in the Passive PRA Runs and Consequence Mapping tab. TS-050 and TS-051 can no longer be evaluated and will remain blank (as desired).

Verifying the Passive Qualitative Assessment Window

Test Case Description

Select Project-Save As. Save the database as Passive_Qualitative.mdb. Select Process Steps-Passive Component Assessment-Qualitative Assessment to open the Qualitative Passive Risk Assessment window.
Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown.

Verify Segment IDs TS-001 through TS-030 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-4.

Figure 7-4: Passive Qualitative Assessment Options for IE Consequences

Verify Segment IDs TS-031 through TS-032 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-5.

Figure 7-5: Passive Qualitative Assessment Options for IE+LOCA Consequences

Verify Segment IDs TS-033 through TS-034 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-6.

Figure 7-6: Passive Qualitative Assessment Options for SYS Consequences

Verify Segment IDs TS-035 through TS-036 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-7.

Figure 7-7: Passive Qualitative Assessment Options for SYS+LOCA Consequences

Verify Segment IDs TS-037 through TS-038 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-8.

Figure 7-8: Passive Qualitative Assessment Options for IE+SYS Consequences

Verify Segment IDs TS-039 through TS-049 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-9.

Figure 7-9: Passive Qualitative Assessment Options for IE+SYS+LOCA Consequences

Verify Segment IDs TS-050 through TS-051 have the lower half of the Qualitative Passive Risk Assessment window the same as Figure 7-10.

Figure 7-10: Passive Qualitative Assessment Options for NONE or NOT-USED Consequences

Verify that when the second selection is chosen in the Combination (IE and SYS) Impact Group selection, the following Containment Performance question appears, as seen in Figure 7-11.

Figure 7-11: Additional Question with Second Answer in Combination Question

Verify that when the third selection is chosen in the Combination (IE and SYS) Impact Group selection, the following Containment Performance question appears, as seen in Figure 7-12.

Figure 7-12: Additional Question with Third Answer in Combination Question

Section 5.6 of Reference 4 describes the possible Qualitative Consequence Categories based on the answers to specific questions. Select options in the Qualitative Passive Risk Assessment window that will yield the Consequence Categories shown in the Qualitative Consequence Category column in Appendix A.11. The Qualitative/Quantitative Consequence Category (higher of two) column in Appendix A.11 will be displayed in the Category column in the Qualitative Passive Risk Assessment window; verify the values match. The Basis will be Qualitative for all Segment IDs because the qualitative analysis was performed last, except for TS-002 (Quantitative, because no qualitative assessment was performed) and for TS-050 and TS-051 (blank, because neither a qualitative nor a quantitative assessment was performed).

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Passive_Qualitative.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Passive Shutdown and External Events Impact Assessment Window

Test Case Description

Select Project-Save As. Save the database as Passive_Shut_EE.mdb. Select Process Steps-Passive Component Assessment-Shutdown and External Events Impact Assessment to open the Shutdown and External Events Impact Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Utilize the information in Appendix A.12 to input the Consequence Category from Shutdown Impact Assessment, Basis for Shutdown Consequence, Consequence Category from External Hazards Impact Assessment, and Basis for External Hazards Impact Assessment. Use the Assign Shutdown Consequence and Assign External Events Consequence buttons to assign them to the Segment ID.
Verify the Resulting Consequence Category and Resulting Basis match the expected values in Appendix A.12.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Passive_Shut_EE.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).
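The "higher of two" selection between qualitative and quantitative Consequence Categories cited above can be sketched as picking the maximum over an assumed category ordering. The ordering below (Low < Medium < Medium/High < High, with blank meaning unassessed) is an illustrative assumption, not taken from Reference 4; applying the same maximum to the shutdown and external hazards categories is likewise a reading of the test steps.

```python
# Sketch: "higher of" selection among Consequence Categories.
# The numeric ranking is an assumed ordering for illustration only.
RANK = {"": 0, "Low": 1, "Medium": 2, "Medium/High": 3, "High": 4}

def resulting_category(*categories):
    """Return the highest Consequence Category among those assigned;
    blank (unassessed) entries do not contribute."""
    return max(categories, key=lambda c: RANK[c])

print(resulting_category("Medium", "High"))  # High
print(resulting_category("Low", ""))         # Low
```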

Verifying the Passive Additional Risk Considerations Window

Test Case Description

Select Project-Save As. Save the database as Passive_ARC.mdb. Select Process Steps-Passive Component Assessment-Additional Risk Consideration to open the Additional Passive Risk Consideration window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Auto Assign RISC Category button. Verify all Segment IDs that have a Consequence Category of Medium/High or High now have HSS for their Risk Classification. For Segment ID TS-003, check only the following Additional Risk Consideration Questions: (1), (2), (3), (6 a), and (6 d), with each Basis being test. Click the Assign RISC Category button. Verify the Risk Classification is HSS for TS-003 (Segment ID TS-003 will be overwritten to LSS in a following paragraph). Select Segment ID TS-003 and click the Clear Assignment button. Verify that the Risk Classification, Risk Classification Basis, and all the ARC columns for Segment ID TS-003 are cleared. Verify the selections in the Additional Risk Considerations menu have been cleared. Utilize Ctrl-Click to select Segment ID TS-003 and all Segment IDs that do not have a Consequence Category of Medium/High or High (except for TS-050 and TS-051, which will be left blank). Check all of the Additional Risk Consideration Questions, with the basis being the letter a. Click the Assign RISC Category button. Verify the Risk Classification is LSS for all of the selected Segment IDs. Verify the Risk Classification column matches the HSS or LSS? column in Appendix A.13. Close the window; verify the Exit? window appears with the warning Some Segments have not been evaluated. Proceed with exit?
Click Yes.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results mostly match the expected results and are shown in the database file Passive_ARC.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). The Basis for Additional Risk Consideration (5) cannot be entered when it is selected. This will not affect the results in future portions of the V&V.
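Two of the classification rules exercised in these test cases can be sketched directly from the test steps: Auto Assign RISC Category marks segments with a Medium/High or High Consequence Category as HSS, and the Sufficient Margin Assessment (the next test case) flips an LSS segment to HSS when the margin question is answered No. This is a reading of the test steps, not the tool's documented algorithm, and reading Medium/High as its own category is an assumption.

```python
# Sketch of two classification rules, as read from the test steps.
HIGH_CONSEQUENCE = {"Medium/High", "High"}

def auto_assign_risc(consequence_category):
    """Auto Assign RISC Category: high-consequence segments are HSS."""
    return "HSS" if consequence_category in HIGH_CONSEQUENCE else "LSS"

def sufficient_margin(prior_classification, margin_answer_yes):
    """Sufficient Margin Assessment applies only to LSS segments;
    answering No to the margin question raises them to HSS."""
    if prior_classification == "HSS":
        return "HSS"
    return "LSS" if margin_answer_yes else "HSS"

print(auto_assign_risc("Medium/High"))  # HSS
print(sufficient_margin("LSS", False))  # HSS (the TS-003 case)
print(sufficient_margin("LSS", True))   # LSS
```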

Verifying the Passive Component Margin Assessment Window

Test Case Description

Select Project-Save As. Save the database as Passive_SM.mdb. Select Process Steps-Passive Component Assessment-Sufficient Margin Assessment to open the Sufficient Margin Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that only Segment IDs with LSS in the Classification Prior column appear. For Segment ID TS-003, answer the question No and click the Commit Changes button. Verify the Success popup window appears stating Save Successful. Click the OK button. Click the Multiple Assign feature (note that TS-003 will disappear, since it has been assigned HSS and clicking the Multiple Assign feature refreshes the window). Utilize the Multiple Assign feature to answer the question Yes for all other displayed Segment IDs in the window (refer to Appendix A.14). Once the Yes button has been clicked, a popup window called Multiple Support/Hanger/Snubber? will appear; click the Yes button. Enter a Basis of Margin Basis. Click the Commit Changes button. Verify the Success popup window appears stating Save Successful. Click the OK button. Verify all answers in the Resulting Classification column match the Resulting Classification column in Appendix A.14.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Passive_SM.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Picklist-IDP Members Functions

Test Case Description

Select Project-Save As. Save the database as Picklists_IDP.mdb. Select Picklists-IDP Members to open the IDP Members window. For the Name, Role, Email, and Phone text inputs, insert 1 for every option.
Click the Save button on the contact information sheet. Click the + button and verify that a new page has been added; insert 2 for all of the information and save the information. Do this three more times, inserting 3, 4, and 5 for the information on the new sheets. Select the 3 and 4 IDP member contact information sheets, click the X button, and click Yes in the confirmation window; verify that they have been deleted. Close and reopen the IDP Members window and verify that the information for 1, 2, and 5 is still present.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Picklists_IDP.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Passive Integrated Decision-Making Panel Window

Test Case Description

Select Project-Save As. Save the database as Passive_IDP.mdb. Select Process Steps-Passive Component Assessment-Integrated Decision-Making Panel to open the Passive Integrated Decision-Making Panel window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify the Segment IDs that are defined as LSS in the Sufficient Margin Assessment window are present in the Segment ID dropdown menu. These Segment IDs are shown as LSS in the Resulting Classification column in Appendix A.14. In the Segment ID dropdown menu, select TS-011. Verify the Segment Description is correctly shown. Check each checkbox in the Review of Risk Info and Review Defense-in-Depth tabs, and in the Assessment tab add Discussion 1, Basis 1, Dissenting 1, and Action 1 in the IDP Discussion, Basis, Dissenting Opinions, and Action Items sections, respectively. Enter a Risk Ranking of HSS. Click the Update button (lower right corner) to apply these changes. In the Segment ID dropdown menu, select TS-004. Check each checkbox in the Review of Risk Info and Review Defense-in-Depth tabs, and in the Assessment tab add Discussion 2, Basis 2, Dissenting 2, and Action 2 in the IDP Discussion, Basis, Dissenting Opinions, and Action Items sections, respectively. Enter a Risk Ranking of LSS. Click the Update button (lower right corner) to apply these changes. Next, select TS-005 in the Segment ID dropdown menu. In the Same As... pulldown menu, select TS-004 and click the Update button next to it.
Repeat this for every other Segment ID located within the Passive Integrated Decision-Making Panel window that has not already been defined. In the Segment ID dropdown menu, select TS-011. Click the Clear button. Verify that all answers and fields are now empty. Reselect the checkboxes for every answer as before, re-enter the text sections and Risk Ranking, and click the Update button (lower right corner). Remove the checks from Questions 1, 3, 5, and 7 in the Review of Risk Info tab, remove the checks from Questions 1, 3, and 5 in the Review Defense-in-Depth tab, and press Update. Verify the rest of the information is retained. Close the Passive Integrated Decision-Making Panel window.
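The Same As... feature used above copies one segment's IDP record onto another. A minimal sketch of that copy, modeling a record as a dictionary; the field names here are illustrative, not the tool's actual schema.

```python
# Sketch of the Same As... copy: one segment's IDP answers
# (checkboxes, text fields, risk ranking) applied to another segment.
from copy import deepcopy

idp_records = {
    "TS-004": {
        "risk_info_checks": [True] * 7,   # Review of Risk Info tab
        "did_checks": [True] * 5,         # Review Defense-in-Depth tab
        "discussion": "Discussion 2",
        "basis": "Basis 2",
        "dissenting": "Dissenting 2",
        "actions": "Action 2",
        "risk_ranking": "LSS",
    }
}

def same_as(source_id, target_id):
    """Copy the source segment's IDP record to the target segment;
    deepcopy so later edits to one segment do not affect the other."""
    idp_records[target_id] = deepcopy(idp_records[source_id])

same_as("TS-004", "TS-005")
print(idp_records["TS-005"]["risk_ranking"])  # LSS
```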

Reopen the Passive Integrated Decision-Making Panel window by selecting Process Steps-Passive Component Assessment-Integrated Decision-Making Panel. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify all information has been properly saved.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results mostly match the expected results and are shown in the database file Passive_IDP.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). The Clear button only clears the sub-tab the user is currently in; this means the Clear button must be clicked three (3) times to fully clear the Segment ID.

Verifying the Passive Stress Model Risk Assessment Window

Test Case Description

Select Project-Save As. Save the database as Passive_Stress_Assessment.mdb. Select Process Steps-Passive Component Assessment-Map Components to open the Map Components to Segment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Select TAC-0005/TS-006 and click the Unassign Segment button. Verify the TAC-0005/TS-006 row has disappeared and only TAC-0005/TS-005 is present. Add the assignment of Segment ID TS-007 to Component ID TAC-0005 (keeping TAC-0005/TS-005). Verify TAC-0005/TS-005 and TAC-0005/TS-007 are present. Select TAC-0006/TS-006 and click the Unassign Segment button. Verify the TAC-0006/TS-006 row has disappeared and only TAC-0006/TS-005 is present. Add the assignment of Segment ID TS-007 to Component ID TAC-0006 (keeping TAC-0006/TS-005). Verify TAC-0006/TS-005 and TAC-0006/TS-007 are present. Close the Map Components to Segment window.
Select Process Steps-Passive Component Assessment-Additional Risk Consideration to open the Additional Passive Risk Consideration window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Auto Assign RISC Category button. Select TS-003; click the Assign RISC Category button. Verify the Risk Classification for TS-003 appears as LSS. Close the Additional Passive Risk Consideration window. When closing the window, verify a pop-up appears stating Some segments have not been evaluated. Proceed with exit?; click the Yes button to close the window.

Select Process Steps-Passive Component Assessment-Sufficient Margin Assessment to open the Sufficient Margin Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Select TS-003 and answer the question Yes. Click the Commit Change button. Verify the Resulting Classification column has LSS for TS-003. Verify TS-003 is still selected and answer the question No. Click the Commit Change button. Verify the Resulting Classification column has HSS for TS-003. Close the Sufficient Margin Assessment window.

Select Process Steps-Passive Component Assessment-Integrated Decision-Making Panel to open the Passive Integrated Decision-Making Panel window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Select TS-007 in the Segment ID dropdown menu. Click the Update button. This is completed so the Component IDs TAC-0005 and TAC-0006 are analyzed correctly in the stress model assessment. This can be done because all of the Segment IDs in Stress Model TSM2 are LSS; otherwise, a complete re-calculation starting at the Additional Risk Considerations window would be necessary (see Reference 4, Sections 5.10 and 5.12 for additional discussion).

Select Process Steps-Passive Component Assessment-Stress Model Risk Assessment to open the Risk Determination for Supports, Hangers, and Snubbers window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Stress Model pulldown.
Verify that all of the Stress Models present in the Stress Model column in Appendix A.15 appear. Investigate every Stress Model. Verify the Resulting Classification in the Stress Model Details section of the window is the same as the Stress Model HSS/LSS column in Appendix A.15. Verify all of the Segment IDs shown in the Segment IDs Mapped to Stress Model column in Appendix A.15 appear for the Stress Model they are assigned to. Verify the Risk column for each Segment ID matches the HSS/LSS Determination Post-IDP (Utilized in Stress Model Assessment) column in Appendix A.16. Highlight each Segment ID row in the Stress Model to show the Component IDs mapped to the Segment ID. Verify all the Component IDs match to the correct Segment IDs as shown in Appendix A.17. Verify the Component Risk Prior to Stress Model Analysis column and the Component Risk After Stress Model Analysis column match the Component HSS/LSS Prior to the Stress Model Analysis column and the Component HSS/LSS After the Stress Model Analysis column in Appendix A.17, respectively. Verify the Support/Hanger/Snubber? column matches the Support, Hanger, or Snubber? column in Appendix A.17.

Select Reports-Consolidated Reports to open the Consolidated Reports window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Select RISC Results from the Choose a report type dropdown. Click Generate Report. In the first table, named Classification in RISC of Components, verify the Component IDs match the RISC (the column with HSS or LSS is the RISC for the Component ID) present in the Passive RISC column in Appendix A.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results mostly match the expected results and are shown in the database file Passive_Stress_Assessment.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). For the tool to calculate the proper results, workarounds (the Additional Risk Considerations window and Sufficient Margin Assessment window portions of the Test Case Description above) were utilized. With these workarounds, the Risk Determination for Supports, Hangers, and Snubbers window correctly displays the results. The generated report does not classify components TAC-0058 and TAC-0006 and instead leaves those rows blank. This is an error with the Classification in RISC of Components report and is not an error with the Risk Determination for Supports, Hangers, and Snubbers window. The HSS/LSS determinations for TAC-0058 and TAC-0006 correctly appear in the Risk Determination for Supports, Hangers, and Snubbers window.

Verifying the Copy Maintenance Rule Data Window

Test Case Description

Select Project-Save As. Save the database as Tools_Maint_Rule.mdb. Select Tools-Copy Maintenance Rule Data to open the Copy Maintenance Rule Data window.
Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Select Excel Spreadsheet button and select Active_Maintenance_Rule.xls (this Excel file was created for the V&V). Verify that the location shown in the window is the correct location of Active_Maintenance_Rule.xls. In the Select Worksheet pulldown menu, select Text. In the All Excel Headings section, click the Function ID item and press the > button; next, select ActiveFunctionID and press the Match button. Repeat this for the other Database Headings. Press the Transfer button. Verify that this is successful when the Success window appears.

Select Process Steps-Active Component Assessment-Active Function Definition to open the Active Function Definitions window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify that the information from Active_Maintenance_Rule.xls has been imported into the Active Functions Definitions window. Return to the Copy Maintenance Rule Data window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Click the Delete button, followed by Yes in the Delete pop-up window. A Success pop-up window should appear stating the data has been successfully deleted. Return to the Active Function Definitions window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify there are no longer any Function IDs present. Repeat importing Active_Maintenance_Rule.xls and verify it has been imported successfully.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Tools_Maint_Rule.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Active Function Definition Window

Test Case Description

Select Project-Save As. Save the database as Active_Function_Definition.mdb. Select Process Steps-Active Component Assessment-Active Function Definition to open the Active Function Definitions window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown.
Verify the Active Function IDs imported in Section 7.26 are still present. Click the New Function button. For the Active Function ID, name it ZZ-001 (no other data inputs need to be entered). Click the Save button and verify it appears in the Active Functions Definitions window.

Verify the pulldown menu option selected for Active Function ID is ZZ-001. Click the Edit button. Change the Active Function ID to AA-001. Click the Save button. Verify an Update window appears. Click the Yes button to continue with the change. Verify that the pulldown menu for Active Function ID no longer has ZZ-001 and now has AA-001. Press the Delete button. Check the checkboxes for TF-001 and AA-001. Click the Go button. Verify that a pop-up window called Delete? appears. Click the Yes button to delete the two Active Function IDs. Verify TF-001 and AA-001 are no longer present in the Active Function ID pulldown menu. Click the New Function button. For the Active Function ID, name it AA-001 (no other data inputs need to be entered). Click the Save button and verify an Error window appears that prevents another AA-001 from being created. Click the OK button. Next, attempt to name it TF-002 (no other data inputs need to be entered). Click the Save button and verify an Error window appears that prevents another TF-002 from being created. Click the OK button and then the Cancel button. Select TF-003 in the Active Function ID pulldown menu. Click the Edit button. Change the Active Function ID to AA-001 (no other data inputs need to be entered). Click the Save button. Verify an Update window appears. Click the Yes button to continue with the change. Verify an Error window appears that prevents another AA-001 from being created. Click the OK button. Next, attempt to name it TF-002 (no other data inputs need to be entered). Click the Save button. Verify an Update window appears. Click the Yes button to continue with the change. Verify an Error window appears that prevents another TF-002 from being created. Click the OK button and then the Save button (clicking the Cancel button makes TF-003 disappear). Press the Restore button at the bottom of the screen. Verify that TF-001 and AA-001 are located within the Restore window.
Click the check box adjacent to the TF-001 function. Press the Go button, followed by the Yes button. Scroll through the Active Function ID dropdown to verify that TF-001 is listed again with the proper information displayed.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results mostly match the expected results and are shown in the database file Active_Function_Definition.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). When attempting to edit TF-003, the error messages properly appear; however, when the Cancel button is clicked, TF-003 disappears from the pulldown menu. A workaround for this (presented above) is to click the Save button instead of the Cancel button while the Active Function ID textbox contains TF-003.

Verifying the Active Map Components Window

Test Case Description

Select Project-Save As. Save the database as Active_Map_Components.mdb.

Select Process Steps-Active Component Assessment-Map Components to open the Map Components to Active Functions window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Assign Component ID TAC-0001 to Function ID TF-001. To accomplish this, click the checkmark next to TAC-0001 and click Assign Function. A window will appear called Assign Components to Active Function. Verify the Plant Name is Plant 1, Unit 1, the System Name is Test System (TS), and the Component ID(s) is TAC-0001. Click the checkmark next to TF-001 and click the Go button. A pop-up should appear asking if the change is OK; click the Yes button. Verify that the Function ID and System ID columns in the Map Components to Active Functions window have been properly modified for TAC-0001. Additionally, assign Component ID TAC-0002 to Function ID TF-001 and Function ID TF-002. Assign TAC-0003 to TF-001. Utilize the Filter Components feature to see filtering By Assignment and/or By Type. Verify all selection options for filtering components are working as intended. Select the TAC-0002/TF-002 row. Click the Unassign Function button. Verify the TAC-0002/TF-002 row has disappeared and only TAC-0002/TF-001 is present. Select the TAC-0002/TF-001 row. Click the Unassign Function button. Verify the TAC-0002/TF-001 row has disappeared and has been replaced with a TAC-0002 row with no Function ID assigned to it. Utilize the Unassign Function button to unassign all Function IDs from the Component IDs. Input the Component ID mapping information that is shown in Appendix A.19.
Map multiple Component IDs at the same time (if they are mapped to the same Function ID) to verify the multiple-selection feature is functioning.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results match the expected results and are shown in the database file Active_Map_Components.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Active Qualitative Assessment Window

Test Case Description

Select Project-Save As. Save the database as Active_Qualitative.mdb. Select Process Steps-Active Component Assessment-Qualitative Assessment to open the Qualitative Active Risk Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown.

When new Component IDs are mapped to Function IDs in the Map Components to Active Function window, verify the warning in Figure 7-13 appears. Click the OK button to continue.

Figure 7-13: Active Qualitative Default Assignment Window

For Component IDs TAC-0061 through TAC-0064, verify the Candidate for Low Safety Significant or Safety Significant? column functions when modifying the options in the flowchart. Click the Save button after modifying a Component ID and click the Refresh button (below the Components table) to re-calculate the Candidate for Low Safety Significant or Safety Significant? column and the Identify Safety Significant Attributes of the Components column. Verify the flowcharts function for every Select Assessment option. Make TAC-0004, TAC-0011, TAC-0012, TAC-0030 through TAC-0035, and TAC-0061 through TAC-0064 Candidate Safety Significant and write the text a in the Identify Safety Significant Attributes of the Components textbox. Click the Refresh button (below the Components table) in every Select Assessment to update the Components textbox. Close the window. Select Process Steps-Active Component Assessment-Qualitative Assessment to open the Qualitative Active Risk Assessment window. Verify the Select Plant and Unit dropdown is selected to the default plant: Plant 1, Unit 1, then select Test System (TS) from the Select System dropdown. Verify the Component IDs defined above are Safety Significant. Utilize the Filter By pulldown, selecting SS, to view all of the Component IDs that are Safety Significant.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the Test Case Description above. The actual results mostly match the expected results and are shown in the database file Active_Qualitative.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).
If multiple assessments (Fire, Seismic, Shutdown, and/or External Hazards) are undertaken, the All selection in Filter By will not show the correctly stored information in the Candidate for Low Safety Significant or Safety Significant? column in the Components table for that assessment. To see the correct HSS/LSS determination for the assessment selected, verify the assessment has been refreshed and use the SS (SS is the same as HSS for determinations in the tool) and/or LSS Filter By selections. These two filter selections will display the correct information for the assessment. Each assessment will show different SS or LSS values for Component IDs (based on the determination from each assessment). The overall HSS/LSS determination for the qualitative assessment for a Component ID will be utilized to determine whether a Function ID is HSS/LSS.

TAC-0064 was not modified to be SS in this window for the attached test case. Because of this, TAC-0064 will appear in the Defense-in-Depth Assessment window in the attached test case file. It will not be classified in the Defense-in-Depth Assessment window, and therefore its RISC will be blank.

Verifying the Creation and Modification of the Excel PRA Templates

Test Case Description

Select Project-Save As. Save the database as Tools_PRA_Templates.mdb. Select Tools-Create Excel PRA Templates to open the Create Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. In the Select PRA Type section, check Internal, Fire, Seismic, Shutdown, and External Hazards. Click the Create Excel File(s) button. A pop-up window should appear, as shown in Figure 7-14.

Figure 7-14: Pop-up Window for Creating Excel PRA Templates

Click the OK button. A Windows Explorer window named Save Internal PRA Excel File As will appear with the default file name pre-written (this will differ from Figure 7-14 due to a textual discrepancy; the default file name in the Windows Explorer window is correct). Choose the location where to create the file and click the Save button. There will be a green progress bar in the Tools-Create Excel PRA Templates window.
Figure 7-15: Pop-up Window after Creating the Internal Excel PRA Template

Once the green progress bar is filled, a window should pop up, as shown in Figure 7-15. Click the OK button and complete the rest of the Excel PRA template files for the other PRA Types. Verify the Excel PRA template files are named:

1. (Plant 1-Unit-1)(TS)(Internal).xls
2. (Plant-1-Unit-1)(TS)(Fire).xls
3. (Plant-1-Unit-1)(TS)(Seismic).xls
4. (Plant-1-Unit-1)(TS)(Shutdown).xls
5. (Plant-1-Unit-1)(TS)(External).xls

Open the (Plant-1-Unit-1)(TS)(Internal).xls file in Excel. Verify that all of the components that have been mapped to a Function ID are present. Right-click on the row where TAC-0001 is located and click the Copy Component option. This should create another row for TAC-0001. Right-click on the newly created row with TAC-0001 and click the Delete Component option. This second row should now be deleted. This process will be utilized to add and delete Component IDs in the Excel files.

The five (5) attached files (see Table 7-6 for the list of electronically attached files) show the appearance of the inputted PRA values for each case; replicate them into the created Excel PRA template files. These values are not reflective of actual plant data and instead are utilized to test the threshold values.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results mostly match the expected results. The Excel PRA template files created (with data inputted) are electronically attached to this document (see Table 7-6 for a list of electronically attached files). They are:

1. (Plant 1-Unit-1)(TS)(Internal).xls
2. (Plant-1-Unit-1)(TS)(Fire).xls

3. (Plant-1-Unit-1)(TS)(Seismic).xls
4. (Plant-1-Unit-1)(TS)(Shutdown).xls
5. (Plant-1-Unit-1)(TS)(External).xls

The PRA Excel file names differ from what is shown in Figure 7-14. The correct naming is the one written in the Windows Explorer window.

Verifying the Active Load Component PRA Files Window

Test Case Description

Select Project-Save As. Save the database as Active_Load_PRA.mdb. Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section.

Press the Load Excel File(s) button; a Windows Explorer window named Open Internal PRA File opens. Select the Excel file named (Plant 1-Unit-1)(TS)(Internal).xls and click the Open button. A green progress bar will appear showing the data importation. Once the data importation has been completed, a window named Success will appear stating Data for PRA Type: Internal Transferred Successfully; click the OK button.

A Windows Explorer window named Open Fire PRA File opens. Select the Excel file named (Plant-1-Unit-1)(TS)(Fire).xls and click the Open button. A green progress bar will appear showing the data importation. Once the data importation has been completed, a window named Success will appear stating Data for PRA Type: Fire Transferred Successfully; click the OK button.

A Windows Explorer window named Open Seismic PRA File opens. Select the Excel file named (Plant-1-Unit-1)(TS)(Seismic).xls and click the Open button. A green progress bar will appear showing the data importation.
Once the data importation has been completed, a window named Success will appear stating Data for PRA Type: Seismic Transferred Successfully; click the OK button.

A Windows Explorer window named Open Shutdown PRA File opens. Select the Excel file named (Plant-1-Unit-1)(TS)(Shutdown).xls and click the Open button. A green progress bar will appear showing the data importation. Once the data importation has been completed, a window named Success will appear stating Data for PRA Type: Shutdown Transferred Successfully; click the OK button.

A Windows Explorer window named Open External PRA File opens. Select the Excel file named (Plant-1-Unit-1)(TS)(External).xls and click the Open button. A green progress bar will appear showing the data importation. Once the data importation has been completed, a window named Success will appear stating Data for PRA Type: External Transferred Successfully; click the OK button.

Select Process Steps-Active Component Assessment-Quantitative Assessment to open the Perform Active Components Quantitative Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section. Press the Evaluate System(s) button. Verify the green progress bar occurs and, once complete, verify the pop-up Assessment Complete window appears stating Assessment for PRA Type: Internal complete. Click the OK button. Verify this process repeats for the Fire, Seismic, Shutdown, and External Hazards evaluations.

Select Process Steps-Active Component Assessment-Integrated Assessment to open the Active Components Integrated Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the Perform Integrated Assessment button. Verify the pop-up window Success appears and click the OK button. Verify the Components Summary has information present.

Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section. Click the Delete Data button. Verify that a window comes up that states Success with the description Data for PRA Type: Internal Successfully Deleted. This will repeat four (4) times for the PRA types of Fire, Seismic, Shutdown, and External Hazards.

Select Process Steps-Active Component Assessment-Integrated Assessment to open the Active Components Integrated Assessment window.
Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the Perform Integrated Assessment button. Verify that the window named Insufficient Data appears. Click the OK button.

Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section. Reload the Excel files.
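The Excel PRA template file names used in the two preceding test cases follow a fixed pattern. The sketch below is an illustrative helper, not part of the tool, and the function names are hypothetical; note that the attached Internal template carries the "Plant 1-Unit-1" (space) variant noted in the text, while the other four files use "Plant-1-Unit-1".

```python
# Hypothetical sketch: build the expected Excel PRA template file names.
# The Internal file reproduces the naming discrepancy noted in the text
# (a space after "Plant" instead of a hyphen).

PRA_TYPES = ["Internal", "Fire", "Seismic", "Shutdown", "External"]

def template_name(plant, system, pra_type):
    """Build one Excel PRA template file name: (Plant)(System)(PRA type).xls."""
    return f"({plant})({system})({pra_type}).xls"

def expected_template_names():
    """Return the five expected file names for Plant 1, Unit 1, system TS."""
    names = []
    for pra_type in PRA_TYPES:
        # Only the attached Internal template uses the space variant.
        plant = "Plant 1-Unit-1" if pra_type == "Internal" else "Plant-1-Unit-1"
        names.append(template_name(plant, "TS", pra_type))
    return names
```

A verifier could compare these expected names against the files actually produced by the Create Excel PRA Templates window.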

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results match the expected results and are shown in the database file Active_Load_PRA.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

Verifying the Active Quantitative Assessment Window

Test Case Description

Select Project-Save As. Save the database as Active_Quantitative.mdb. Select Process Steps-Active Component Assessment-Quantitative Assessment to open the Perform Active Components Quantitative Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the checkboxes next to each of the following PRA Types in the Select PRA Type section: Internal, Fire, Seismic, Shutdown, and External Hazards. Press the Evaluate System(s) button; a green progress bar will occur for every assessment. Verify that the Assessment Complete window appears after each assessment; click OK to continue the evaluation to the next assessment.

Select Process Steps-Active Component Assessment-Integrated Assessment to open the Active Components Integrated Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the Perform Integrated Assessment button. Verify the pop-up window Success appears and click the OK button. Verify the Components Summary has information present.

Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown.
Click the checkboxes next to each of the following PRA Types in the Select PRA Type section: Internal, Fire, Seismic, Shutdown, and External Hazards. Click the Delete Data button. Verify the Success window appears; click OK. The Success window will appear five (5) times, once for each PRA Type deleted.

Select Process Steps-Active Component Assessment-Quantitative Assessment to open the Perform Active Components Quantitative Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown.

Click the checkboxes next to each of the following PRA Types in the Select PRA Type section: Internal, Fire, Seismic, Shutdown, and External Hazards. Press the Evaluate System(s) button; a green progress bar will occur for every assessment. Verify that the Assessment Complete window appears after each assessment; click OK to continue the evaluation to the next assessment.

Select Process Steps-Active Component Assessment-Integrated Assessment to open the Active Components Integrated Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the Perform Integrated Assessment button. Verify that the window named Insufficient Data appears. Click the OK button.

Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section. Reload the Excel files.

Select Process Steps-Active Component Assessment-Quantitative Assessment to open the Perform Active Components Quantitative Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown.

Select Process Steps-Active Component Assessment-Load Active Component PRA Files to open the Load Active Components Plant PRA Excel Files window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Select the Internal, Fire, Seismic, Shutdown, and External Hazards checkboxes in the Select PRA Type section.
Reevaluate the PRA Type files.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results match the expected results and are shown in the database file Active_Quantitative.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).
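The quantitative assessment screens each component's importance measures against threshold values. The sketch below is illustrative only: this section does not state the tool's actual thresholds, so the values shown (F-V 0.005, RAW 2, CCF RAW 20, typical of risk-informed categorization guidance such as NEI 00-04) and the function name are assumptions.

```python
# Illustrative sketch only: screen one component's importance measures
# for a single PRA type. The threshold values below are assumed typical
# values and are NOT taken from this document.

FV_THRESHOLD = 0.005     # assumed Fussell-Vesely screening value
RAW_THRESHOLD = 2.0      # assumed Risk Achievement Worth screening value
CCF_RAW_THRESHOLD = 20.0 # assumed common-cause-failure RAW screening value

def screen_component(fv, raw, ccf_raw):
    """Return the candidate classification for one PRA type's results."""
    exceeds = (fv >= FV_THRESHOLD
               or raw >= RAW_THRESHOLD
               or ccf_raw >= CCF_RAW_THRESHOLD)
    return "Candidate SS" if exceeds else "Candidate LSS"
```

In the test cases above, the inputted PRA values are chosen to exercise both sides of such thresholds rather than to represent plant data.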

Verifying the Active Integrated Assessment Window

Test Case Description

Select Project-Save As. Save the database as Active_Integrated_Assessment.mdb. Select Process Steps-Active Component Assessment-Integrated Assessment to open the Active Components Integrated Assessment window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Press the Perform Integrated Assessment button. The integrated assessment will appear for Component IDs present in all five (5) PRA Types. Most Component IDs are only present in either the qualitative assessment or a single PRA type in the quantitative assessment.

Verify the MAX values for F-V, RAW, and CCF RAW. Verify the PRA Type of the MAX and whether it is from the CDF or LERF assessment. The expected results are shown in Appendix A.20. Verify only Component IDs TAC-0803 through TAC-0821 appear in the Components Summary table. Verify the values of the Integrated F-V, Integrated RAW, and Integrated CCF RAW are the same as the ones shown in Appendix A.20.

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results mostly match the expected results and are shown in the database file Active_Integrated_Assessment.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files). The MAX values and other information shown in Appendix A.20 do not match the results in the trial. This appears to be a textual error because the Integrated F-V, Integrated RAW, and Integrated CCF RAW were calculated correctly.

Verifying the Active Defense-in-Depth Assessment Window

Test Case Description

Select Project-Save As. Save the database as Active_Defense_in_Depth.mdb. Select Process Steps-Active Component Assessment-Defense-in-Depth Assessment to open the Defense-In-Depth Assessment window.
Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Click the Find Low Safety Significant Functions button.

The Function IDs that are shown in this window will be given a categorization of HSS or LSS from the information provided in the Defense-in-Depth Assessment column in Appendix A.22 (utilize Appendix A.23 instead, as explained in the Comparison of Test Results discussion below). The Function IDs that have a "-" will not be present in the Defense-in-Depth window.

Modify the CDF Assessment tab and the Containment Assessment tab to change the HSS/LSS categorization. Verify that the CDF Assessment tab and the Containment Assessment tab yield the correct HSS/LSS categorization (for the HSS/LSS determination, utilize Section 6.7 in Reference 4). Press the Update button to confirm a selection. Verify the Category and Basis match based on the selections. The Basis will be the tab that yields the higher Category (if the HSS/LSS determination is the same for both tabs, the Containment Defense-in-Depth Assessment will be displayed). Verify the Clear button clears the information for the shown tab (along with the Category and Basis if both tabs have been cleared). Have the Function IDs match the desired HSS/LSS determinations as shown in the Defense-in-Depth Assessment column in Appendix A.22 (utilize Appendix A.23 instead, as explained in the Comparison of Test Results discussion below).

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results mostly match the expected results and are shown in the database file Active_Defense_in_Depth.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

The Defense-in-Depth Assessment window does not properly display the Function IDs. There appears to be a discrepancy in the Sensitivity Studies. Function IDs that are mapped to Component IDs with Sensitivity Studies that exceed the quantitative thresholds should be classified as HSS for this window and not appear in the Defense-in-Depth Assessment window; but they do. Refer to Reference 4, Section 6.5 for more information.
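The Category/Basis rule exercised in this test case (the Basis is the tab that yields the higher Category, with the Containment Defense-in-Depth Assessment displayed on a tie) can be sketched as follows; the function name is hypothetical and the sketch is not the tool's implementation.

```python
# Minimal sketch of the Category/Basis rule: HSS outranks LSS, and on a
# tie the Containment Defense-in-Depth Assessment is displayed as Basis.

RANK = {"LSS": 0, "HSS": 1}

def did_category_and_basis(cdf_result, containment_result):
    """Combine the CDF and Containment tab determinations into (Category, Basis)."""
    if RANK[cdf_result] > RANK[containment_result]:
        return cdf_result, "CDF Defense-in-Depth Assessment"
    # Containment wins ties as well as strictly higher determinations.
    return containment_result, "Containment Defense-in-Depth Assessment"
```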
Based on the above discrepancies, different Function IDs are shown in the Defense-in-Depth Assessment window compared to the results in Appendix A.22. Therefore, utilize Appendix A.23. Refer to the Comparison of Test Results discussion for the Active Qualitative Assessment window for the reasoning behind TAC-0064 appearing in this window for the attached test case file. In another test case trial (not attached), TAC-0064 was validated to not appear in this window.

Verifying the Active Integrated Decision-Making Panel Window

Test Case Description

Select Project-Save As. Save the database as Active_IDP.mdb. Select Process Steps-Active Component Assessment-Integrated Decision-Making Panel to open the Active Integrated Decision-Making Panel window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown.

Verify the Function IDs that are defined as LSS in the Defense-in-Depth Assessment window are present in the Function ID dropdown menu. These Function IDs are shown as either HSS or LSS in the Integrated Decision-making Panel column in Appendix A.22 (utilize Appendix A.23 instead, as explained in the Comparison of Test Results discussion for the Defense-in-Depth Assessment window).

In the Function ID dropdown menu, select TF-005. Verify the Function Description is correctly shown. Check each checkbox in the Review of Risk Info and Review Defense-in-Depth tabs, and in the Assessment tab enter Discussion 1, Basis 1, Dissenting 1, and Action 1 in the IDP Discussion, Basis, Dissenting Opinions, and Action Items sections, respectively. Enter a Risk Ranking of HSS. Click the Update button (lower right corner) to apply these changes.

In the Function ID dropdown menu, select TF-001. Check each checkbox in the Review of Risk Info and Review Defense-in-Depth tabs, and in the Assessment tab enter Discussion 2, Basis 2, Dissenting 2, and Action 2 in the IDP Discussion, Basis, Dissenting Opinions, and Action Items sections, respectively. Enter a Risk Ranking of LSS. Click the Update button (lower right corner) to apply these changes.

Next, select TF-003 in the Function ID dropdown menu. In the Same As... pulldown menu, select TF-001 and click the Update button next to it. Repeat this for every other Function ID located within the Active Integrated Decision-Making Panel window that has not already been defined, utilizing TF-001 for LSS and TF-005 for HSS. The HSS/LSS determinations for every Function ID can be found in Appendix A.22 (utilize Appendix A.23 instead, as explained in the Comparison of Test Results discussion for the Defense-in-Depth Assessment window).

Verify the red highlight appears in the upper portion of the table where Risk Significant, In PRA, etc. appear for Function ID TF-001. These red highlights will appear if the checkmarks do not match what is present in the Active Function Definitions window. Choose the correct selections for TF-001 and click the Update button.
Switch to another Function ID and then come back to TF-001. Verify the red highlights disappear. Go to TF-005 and keep nothing checked (Risk Significant, In Maintenance Rule, In EOPS, and Fail Safety Component will be highlighted red).

In the Function ID dropdown menu, select TF-005. Click the Clear button. Verify that all answers and fields are now empty. Reselect the checkboxes for every answer as before, as well as the text sections and Risk Ranking; click the Update button (lower right corner). Remove the checks from Questions 1, 3, 5, and 7 in the Review of Risk Info tab, and in the Review Defense-in-Depth tab remove the checks from Questions 1, 3, and 5; press Update. Verify the rest of the information stays. Close the Active Integrated Decision-Making Panel window.

Reopen the Active Integrated Decision-Making Panel window by selecting Process Steps-Active Component Assessment-Integrated Decision-Making Panel. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown. Verify all information has been properly saved.
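The red-highlight rule exercised above (a field is flagged when its checkmark disagrees with the value stored in the Active Function Definitions window) amounts to a field-by-field comparison. The sketch below is a hypothetical illustration; the function and field names are assumptions, not the tool's internals.

```python
# Hypothetical sketch of the red-highlight rule: return the fields whose
# IDP checkmarks disagree with the Active Function Definitions window.

def mismatched_fields(idp_checks, function_definition):
    """Return the field names that should be highlighted red."""
    return [field for field, checked in idp_checks.items()
            if function_definition.get(field) != checked]
```

For TF-005 with nothing checked, such a comparison would flag exactly the fields the test case expects to be highlighted (Risk Significant, In Maintenance Rule, In EOPS, and Fail Safety Component).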

Comparison of Test Results with Expected Results

The expected results of this test case are described in the preceding Test Case Description. The actual results mostly match the expected results and are shown in the database file Active_IDP.mdb, which is electronically attached to this document (see Table 7-6 for a list of electronically attached files).

The Clear button only clears the sub-tab the user is located within; this means the Clear button will need to be clicked three (3) times to clear the Function ID. Function IDs that are defined HSS from a Component ID(s) that did not exceed thresholds in the integrated assessment have no way to be classified down to LSS (currently these Function IDs do not appear in the Active Integrated Decision-Making Panel window). Refer to Reference 4, Section 6.8 for more information. The Normal or Standby selection will be highlighted red unless all selections match what is present in the Active Functions Definition window. None of these affects the final RISC categorization of Component IDs for the test cases described.

Verifying the Creation and Contents of Reports

Test Case Description

For each of the following reports, verify that the description provided matches the report provided by the program.

Systems and Components in Systems Reports

Reports-Consolidated Reports-Systems

This report generates a table of all of the System IDs and System Names that are created for the plant/unit combination. The information for this report is from Process Steps-Scope Definition-System Definition.

Reports-Consolidated Reports-Components in System

This report generates the Component ID, Component Description, Component Type, Safety Class, Component Functionality, and whether the Component is a support, hanger, or snubber for the plant/unit/system combination. The information for this report is from Process Steps-Scope Definition-Assign Component Functionality.
In the Reports-Consolidated Reports-Components in System report, the column describing whether a Component ID is a support, hanger, or snubber has Yes displayed for No, and No displayed for Yes.

Active Reports

Reports-Consolidated Reports-Active-Properties of Functions

This report generates the Function ID, Function Description, and other properties defined for the functions for the plant/unit/system combination. The information for this report is from Process Steps-Active Component Assessment-Properties of Functions.

Reports-Consolidated Reports-Active-Component Mapping to Functions

This report generates what Function IDs each of the Component IDs are mapped to. Several tables are provided in this report. First, there is a table that shows each Component ID and what Function ID it is mapped to. The tables succeeding the first table list the Component IDs mapped to a specific Function ID. The information for this report is from Process Steps-Active Component Assessment-Map Components.

Reports-Consolidated Reports-Active-Integrated Assessment

This report generates the results of the integrated assessment for every Component ID that is present in the quantitative active assessment. The information for this report is from Process Steps-Active Component Assessment-Integrated Assessment.

Reports-Consolidated Reports-Active-Defense-in-Depth Assessment

This report generates the results of the defense-in-depth assessment for every Function ID that is a candidate for LSS based on the quantitative and qualitative results. The information for this report is from Process Steps-Active Component Assessment-Defense-in-Depth Assessment.

Reports-Consolidated Reports-Active-Integrated Decision-making Panel

This report generates the results of the IDP for every Function ID that was reviewed. The information for this report is from Process Steps-Active Component Assessment-Integrated Decision-making Panel.
Currently, the table does not properly display the Risk Significance, In PRA, Normal or Standby, In Maintenance Rule, In EOPS, Causes Trip, Fail Safety Component, and Used to Mitigate selections from the Process Steps-Active Component Assessment-Integrated Decision-making Panel window.

Passive Reports

Reports-Consolidated Reports-Passive-Properties of Segment

This report generates the Segment ID and Segment Description, along with other properties of the Segment ID. The information for this report is from the Other tab of the Process Steps-Passive Component Assessment-Segment Definition window.

Reports-Consolidated Reports-Passive-Consequences and Operator Actions

This report generates the information of which Consequence ID(s) and Operator Action ID(s) are assigned to specific Segment IDs. The information for this report is from Process Steps-Passive Component Assessment-Consequence Definition. Currently, the Operator Action Description column does not display the information that is present in the Tool for Risk-Informed Operations and is blank.

Reports-Consolidated Reports-Passive-Component Mapping to Segments

This report generates the information of which Segment ID(s) a Component ID is mapped to. First, there is a table that shows each Component ID and what Segment ID(s) it is mapped to. The tables succeeding the first table list the Component IDs mapped to a specific Segment ID. The information for this report is from Process Steps-Passive Component Assessment-Map Components.

Reports-Consolidated Reports-Passive-External Events and Shutdown

This report generates the information on the Segment IDs' Consequence Categories for External Hazards and Consequence Categories for Shutdown. The External Hazards table is shown first in the creation of the report, with the Shutdown table succeeding it. The information for this report is from Process Steps-Passive Component Assessment-Shutdown and External Events Impact Assessment. The option to create this report is shown twice in the Reports-Consolidated Reports selection; either of these options generates the report. Currently, there is no table in the Reports-Consolidated Reports-Properties-External Events and Shutdown report to display the Resulting Consequence Category and the Resulting Basis columns from Process Steps-Passive Component Assessment-Shutdown and External Events Impact Assessment (refer to Section 7.20).

Reports-Consolidated Reports-Passive-Additional Risk Considerations

This report generates the information on the Additional Risk Considerations for every Segment ID.
If a Segment ID is classified as HSS by its Consequence Category, the Classification Basis will be the Consequence Evaluation. The information for this report is from Process Steps-Passive Component Assessment-Additional Risk Considerations. Currently, the Reports-Consolidated Reports-Passive-Additional Risk Considerations table will display a Risk Classification of HSS for Component IDs that were later made HSS in the Process Steps-Passive Component Assessment-Sufficient Margin Assessment window (refer to Section 7.22).

Reports-Consolidated Reports-Passive-Sufficient Margin Assessment

This report generates the information on whether the Segment IDs classified as LSS in the Process Steps-Passive Component Assessment-Additional Risk Considerations window have sufficient margin to maintain the Segment as LSS. The information for this report is from Process Steps-Passive Component Assessment-Sufficient Margin Assessment. For Segment IDs that were classified as HSS in the Process Steps-Passive Component Assessment-Additional Risk Considerations window (refer to Section 7.21), the Reports-Consolidated Reports-Passive-Sufficient Margin Assessment report will have the Risk Classification HSS and the Basis column blank for that Segment ID.

Reports-Consolidated Reports-Passive-Stress Model Assessment

This report generates the information on the Stress Model Assessment that is performed in the Tool for Risk-Informed Operations. The first table shown will have the Stress Model along with the associated Risk Classification Pre-IDP and Risk Classification Post-IDP. Even though the Stress Models themselves do not have a risk classification, these columns represent the highest risk classification of the Segment IDs that are present in the stress model. This allows for easier reporting on which supports, hangers, and snubbers will be classified as HSS, along with describing the changes to them based on the IDP. Next, each Stress Model has its own table which shows the Segment IDs that are mapped to that Stress Model. To review which Components are mapped to each Segment ID, refer to Reports-Consolidated Reports-Component Mapping to Segments. The information for this report is from the Process Steps-Passive Component Assessment-Stress Model Definition window and the Process Steps-Passive Component Assessment-Stress Model Risk Assessment window.
Currently, the Risk Classification Post-IDP column for the Stress Model Assessment report is blank and does not display any information Reports-Consolidated Reports-Passive-Integrated Decision-making Panel This report generates the results of the IDP for every Segment ID that was reviewed. The information for this report is from Process Steps-Passive Component Assessment- Integrated Decision-making Panel Reports-Consolidated Reports-RISC Results This report generates the results of the final RISC categorization of the components for the specific plant/unit/system combination. The first table that is shown describes the classification of each of the components in RISC-1, RISC-2, RISC-3, or RISC-4. If a component is in RISC-1, it will have HSS appear in the RISC-1 column and N/A appear on all of the other columns. If a component is in RISC-2, it will have

HSS appear in the RISC-2 column and N/A appear in all of the other columns. If a component is in RISC-3, it will have LSS appear in the RISC-3 column and N/A appear in all of the other columns. If a component is in RISC-4, it will have LSS appear in the RISC-4 column and N/A appear in all of the other columns. The next four (4) tables show each of the RISC categories and the components that are present within them. The final table shows a summary of the number of components classified in each of the RISC categories. The information for this report is from the final HSS/LSS determination of the components. This is found by determining the highest risk determination for the passive assessment and the highest risk determination for the active assessment, and then determining the overall highest risk determination by comparing the two. For the passive assessment, the Process Steps-Passive Component Assessment-Stress Model Risk Assessment window displays the information required to determine the highest risk determination for the Component IDs. For the active assessment, the Process Steps-Active Component Assessment-Defense-in-Depth Assessment will screen out Function IDs that are classified as HSS from either the qualitative or quantitative assessment, along with further classifying Function IDs as HSS based on the responses to the questions asked. The Process Steps-Active Component Assessment-Integrated Decision-making Panel will determine whether the Function IDs that are candidates for LSS from the Process Steps-Active Component Assessment-Defense-in-Depth Assessment will be LSS or HSS. All of the Component IDs present in a Function ID have the same risk classification as the Function ID. The highest risk classification of a Component ID (if it appears in more than one Function ID) will be taken.
The final determination of the Risk Categorization is made by taking the higher determination between the passive assessment and the active assessment for the Component ID (if it is present in both the passive assessment and the active assessment). The determination of the safety class of the component is displayed in the Process Steps-Scope Definition-Assign Component Functionality window. Currently, the RISC categorization does not provide the user with the ability to look at the final classification of a component that is located within multiple systems. For components that are located within multiple systems, this will need to be completed manually by the user (i.e., by generating the reports for each system and manually comparing them). The report has a minor typographical issue in the printout: in the RISC-1, RISC-2, RISC-3, and RISC-4 tables, a second title with the same name may appear in the middle of those tables.
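The two-step determination described above (take the higher of the passive and active determinations, then bin by safety class) can be sketched as below. This is an illustrative reconstruction using the standard 10 CFR 50.69 binning convention (RISC-1: safety-related HSS; RISC-2: non-safety-related HSS; RISC-3: safety-related LSS; RISC-4: non-safety-related LSS); the function names are hypothetical and the tool's internal implementation may differ.

```python
# Illustrative sketch (not the tool's source code) of the overall RISC
# determination: the higher of the passive and active determinations is
# taken, then the component is binned by its safety class.

def overall_determination(passive, active):
    """Higher of the two determinations (HSS outranks LSS); a component that
    appears in only one assessment passes None for the other."""
    values = [v for v in (passive, active) if v is not None]
    return "HSS" if "HSS" in values else "LSS"

def risc_category(determination, safety_related):
    """Standard 10 CFR 50.69 binning convention."""
    if determination == "HSS":
        return "RISC-1" if safety_related else "RISC-2"
    return "RISC-3" if safety_related else "RISC-4"

# A safety-related component that is LSS passively but HSS actively is RISC-1.
print(risc_category(overall_determination("LSS", "HSS"), safety_related=True))
```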

Comparison of Test Results with Expected Results

The expected results of this test case are described in Section . The actual results mostly match the expected results and are shown in the report .pdf and .xls files, which are electronically attached to this document (see Table 7-6 for a list of electronically attached files). There are some minor formatting issues with the results. Some reports may display items that are located in a Restore folder. In the Reports-Consolidated Reports-Components in System report (refer to ), the column describing whether a Component ID is a support, hanger, or snubber has Yes displayed for No, and No displayed for Yes. The Active Integrated Decision-making Panel table (refer to ) does not properly display the Risk Significance, In PRA, Normal or Standby, In Maintenance Rule, In EOPS, Causes Trip, Fail Safety Component, and Used to Mitigate selections from the Process Steps-Active Component Assessment-Integrated Decision-making Panel window. The Operator Action Description column in the Consequences and Operator Actions table (refer to Section ) does not display the information that is present in the Tool for Risk-Informed Operations and currently is blank. Currently, there is no table in the Reports-Consolidated Reports-Properties-External Events and Shutdown report (refer to Section ) to display the Resulting Consequence Category and the Resulting Basis columns from Process Steps-Passive Component Assessment-Shutdown and External Events Impact Assessment (refer to Section 7.20). Currently, the Reports-Consolidated Reports-Passive-Additional Risk Considerations table (refer to Section ) will display a Risk Classification of HSS for Component IDs that were later made HSS in the Process Steps-Passive Component Assessment-Sufficient Margin Assessment window (refer to Section 7.22).
For Segment IDs that were classified as HSS in the Process Steps-Passive Component Assessment-Additional Risk Considerations window (refer to Section 7.21), the Reports-Consolidated Reports-Passive-Sufficient Margin Assessment report (refer to Section ) will have the Risk Classification HSS and the Basis column blank for that Segment ID. Currently, the Risk Classification Post-IDP for the Stress Model Assessment report (refer to Section ) is blank and does not display any information. Additionally, the RISC Results table (refer to Section ) has some discrepancies with the expected results. The TAC-0058 row is blank for a RISC Results table completed after Section 7.35 (the results that should be displayed are in the Overall RISC column in Appendix A.18). The TAC-0006 and TAC-0058 rows are blank for a RISC Results table completed after Section 7.24 (the results that should be displayed are in the Passive RISC column in Appendix A.18, because the active assessment has not been completed yet).

Refer to Section for the reasoning behind TAC-0064 being blank in the RISC Results table. TAC-0064 was validated to appear correctly in another test case trial (not attached). The RISC Results report has some issues with how component RISC values are displayed. This report generates a total of 14 tables. The functionality of these tables (presented sequentially as they appear in the report) is described below:

1. This table correctly displays the RISC value for each component other than TAC . A value of HSS appears in the column corresponding to the RISC value of the component. A value of N/A appears in the other columns.
2. This table incorrectly displays TAC-0058 as a RISC-1 component.
3. This table correctly displays all components classified as RISC-1.
4. This table displays an incorrect result. All components classified as RISC-1, plus some additional components, are displayed.
5. This table incorrectly displays TAC-0058 as a RISC-2 component.
6. This table correctly displays all components classified as RISC-2.
7. This table displays an incorrect result. All components classified as RISC-2, plus some additional components, are displayed.
8. This table correctly displays TAC-0058 as a RISC-3 component; however, this is not the desired result because TAC-0058 should show up in the next table and this table should not be printed.
9. This table correctly displays all components classified as RISC-3.
10. This table displays an incorrect result. All components classified as RISC-3, plus some additional components, are displayed.
11. This table incorrectly displays TAC-0058 as a RISC-4 component.
12. This table correctly displays all components classified as RISC-4.
13. This table displays an incorrect result. All components classified as RISC-4, plus some additional components, are displayed.
14. This table correctly displays the number of components classified in each RISC bin.

Verifying the Copy RI-ISI Data Window

Test Case Description

Open the System_Definition.mdb.
Utilize the Save As feature to create a copy of the file named Copy_RI_ISI.mdb. Select Tools-Copy RI-ISI Data to open the Copy RI-ISI Data window. Verify the Select Plant and Unit dropdown is set to the default plant (Plant 1, Unit 1), then select Test System (TS) from the Select System dropdown.

Click the Browse for RI-ISI button, and select the RI-ISI database RI-ISI_to_RISC_DB. Select one of the Select Unit from RI-ISI Database options and one of the Select Systems from RI-ISI Database options, then press the Load Data button. A green progress bar displays the progress of the importation. After the importation is complete, verify that a pop-up window titled Complete appears. Click the OK button to close the window. Verify that the data has been imported in the Segment Definition window, the Consequence Definition window, and the Perform Passive Quantitative Assessment window. Return to the Copy RI-ISI Data window and select the same options as before. Click the Delete button. Verify that a warning window titled Delete appears; click the Yes button to continue with the deletion. A green progress bar displays the progress of the deletion. After the deletion is complete, verify that a pop-up window titled Complete appears. Click the OK button to close the window. Verify that the relevant data has been deleted in the Segment Definition window, the Consequence Definition window, and the Perform Passive Quantitative Assessment window. The Copy RI-ISI Data window is in the process of being optimized for all .mdb formats of RI-ISI data. Therefore, functionality may vary depending on the structure of the RI-ISI data in the .mdb file, and an RI-ISI file that isn't recognized could be rejected by the Tool for Risk-Informed Operations.

Comparison of Test Results with Expected Results

This test case was performed with a sample RI-ISI database. The import was not fully successful. Several columns (e.g., the Function ID column) were not properly modified.
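A simple way to flag this kind of partial import is a column-by-column comparison between a source RI-ISI record and the corresponding imported row. The sketch below is a hypothetical illustration only (the field names and values are invented), not part of the validated test case.

```python
# Hypothetical illustration: flag columns whose imported value differs from
# the source RI-ISI record. Field names and values here are invented.

def diff_columns(source_row, imported_row, columns):
    """Return the columns whose imported value does not match the source."""
    return [c for c in columns if source_row.get(c) != imported_row.get(c)]

source   = {"SegmentID": "TS-01", "FunctionID": "F-12", "CCDP": 1.0e-5}
imported = {"SegmentID": "TS-01", "FunctionID": None,   "CCDP": 1.0e-5}

# The Function ID column was not properly carried over in this example.
print(diff_columns(source, imported, ["SegmentID", "FunctionID", "CCDP"]))
# ['FunctionID']
```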
Due to the nature of the ongoing optimization of this window, the test file will not be attached, to avoid confusion on the layout of the RI-ISI importation.

Verifying the Exit Functionality

Test Case Description

Open the Project menu, and select the Exit button; ensure that the program is now closed.

Comparison of Test Results with Expected Results

The expected results of this test case are described in Section . The actual results match the expected results.

Listing of Runs Made in Validation

Table 7-6 Electronically Attached File Listing

Run No.:
Test Case No.: to 7.38
Computer Run Description: All test runs completed in Section 7.0.
Machine Name: HP-2UA445275T.w-intra.net
Run Date/Time: March 16, 2017 to March 22, 2017
File Type: .zip
EDMS File Name or File Location: V_V_Attachments.zip_ _CN-RAM

Known Problems

1. As of Version 1.0 of the Tool for Risk-Informed Operations, there are no CAPAL issues opened.
2. Crystal Reports 64-bit currently does not function with the Tool for Risk-Informed Operations, Version 1.0. Crystal Reports 32-bit should be utilized.
3. There are workarounds and other limitations noted in the V&V and detailed in the User's Manual (Reference 4). Limitations are being investigated for possible improvement in a later version of the Tool for Risk-Informed Operations.
4. Several log files were created during the V&V. None of these log files affect the overall results of the V&V; they are electronically attached to this document (see Section 7.39).

Adequacy and/or Limits of the Validation

9.1 Adequacy

The V&V demonstrates that the Tool for Risk-Informed Operations, Version 1.0 is capable of performing the major functions of categorization, including:

Scope Definition
Passive Categorizations
Active Categorizations
Comparison of Active and Passive
Reports

The Tool for Risk-Informed Operations, Version 1.0 was developed to operate in a Microsoft Access environment. The Tool for Risk-Informed Operations, Version 1.0 is concluded to be acceptable for release.

9.2 Limits of Validation

1. The Add Column(s) button in the Copy Customer Component List window (see Section 7.7) was not validated in this document. Even though columns can be matched properly, the button currently has no functionality beyond matching additional columns, which will not be displayed in the tool.
2. The size of the database is limited by Microsoft Access in the Tool for Risk-Informed Operations, Version 1.0.
3. Reports are available as detailed in the User's Manual (Reference 5).
4. Additional limits of the validation are shown in Section 7.0 of this document.
5. Limits to the Tool for Risk-Informed Operations are shown in the User's Manual (Reference 5).

Glossary

AOT - Allowed Outage Time
CAPAL - Corrective Action, Prevention, and Learning
CCF - Common Cause Failure
CCFRAW - Risk Achievement Worth (just CCF basic events)
CCDF - Conditional Core Damage Frequency
CCDF/CCDP - CCDF or CCDP
CCDP - Conditional Core Damage Probability
CDF - Core Damage Frequency
CDP - Core Damage Probability
CFR - Code of Federal Regulations
CLERF - Conditional Large Early Release Frequency
CLERF/CLERP - CLERF or CLERP
CLERP - Conditional Large Early Release Probability
COMB - A combination event (IE+SYS)
DB Cat. - Design Basis Category
DID - Defense-in-Depth
EDMS - Enterprise Document Management System
EOPS - Emergency Operating Procedures
EPRI - Electric Power Research Institute
FL - Flooding
FV - Fussell-Vesely
F-V - Fussell-Vesely
HSS - High Safety Significant
IDP - Integrated Decision-making Panel
IE - Initiating Event
IFV - Integrated Fussell-Vesely
IRAW - Integrated Risk Achievement Worth (no CCF basic events)
ICCFRAW - Integrated Risk Achievement Worth (just CCF basic events)
JI/S - Jet Impingement or Spray

LERF - Large Early Release Frequency
LERP - Large Early Release Probability
LLOCA - Large LOCA
LOCA - Loss of Coolant Accident
LSS - Low Safety Significant
MFLB - Main Feedwater Line Breaks
MFW - Main Feedwater
MLOCA - Medium LOCA
MR - Maintenance Rule
N/A - Not Applicable
NEI - Nuclear Energy Institute
NRC - Nuclear Regulatory Commission
NSR - Non-Safety Related
PRA - Probabilistic Risk Analysis
PW - Pipe Whip
RAW - Risk Achievement Worth (no CCF basic events)
RI-ISI - Risk-Informed In-Service Inspection
RISC - Risk-Informed Safety Class
SGTR - Steam Generator Tube Rupture
SLB - Steam Line Break
SLOCA - Small LOCA
SR - Safety Related
SSCs - Systems, Structures, and Components
SYS - System Failure
THA - High Temperature/Humidity (Large Area)
THL - High Temperature/Humidity (Localized)
V&V - Verification and Validation

Appendix A: Supporting Documentation

A.1 Software License Agreement

This appendix shows the Software License Agreement:

TOOL FOR RISK-INFORMED OPERATIONS

This software is the property of Westinghouse Electric Company LLC and/or its affiliates. It is transmitted to you in confidence and trust, and you agree to treat this software in strict accordance with the terms and conditions of the agreement under which it was provided to you. Any unauthorized use of this software is prohibited.

SOFTWARE LICENSE AGREEMENT

This is an agreement between the recipient of the Tool for Risk-Informed Operations software (each a "Licensee") and Westinghouse Electric Company LLC ("Westinghouse"). By using any portion of the Software delivered by Westinghouse as set forth herein, Licensee agrees to become bound by the terms of this Software License Agreement (hereinafter "Agreement").

WHEREAS, Westinghouse intends to supply its Tool for Risk-Informed Operations software (Version 1.0), including related documentation (collectively the "Software"), to Licensee in fulfillment of its obligations under PWROG project PA-RMSC-1515 (the "Project"); and

WHEREAS, Licensee requires a license to use the Software provided under this Agreement; and

WHEREAS, Westinghouse is willing to grant Licensee a license in such Software under the terms and conditions set forth herein.

NOW THEREFORE, the parties agree as follows:

GRANT OF LICENSE: Westinghouse, as Licensor, grants to Licensee a nonexclusive right to use the Software at each of the Licensee's nuclear power plant sites, at the Licensee's corporate headquarters, and at other off-site locations which will use the Software on Licensee owned and maintained computers (each a "Location"). Westinghouse reserves all rights not expressly granted to Licensee.
OWNERSHIP OF WESTINGHOUSE SOFTWARE: As a Licensee, you own the physical media on which the Software is originally distributed, but Westinghouse or its suppliers retain title to and ownership of the Software and all copies of the Software, regardless of the form of media used and other copies which may exist. This Agreement, or the license granted hereunder, is not a sale of


More information

This document covers the most frequently used procedures in ClearCase. It contains the following sections:

This document covers the most frequently used procedures in ClearCase. It contains the following sections: ClearCase is a software configuration management system. It is also the tool Concur uses for documentation management and version control. All historical versions are located in a Versioned Object database

More information

ASSEMBLER USER GUIDE. Developed and published by Expedience Software Copyright Expedience Software

ASSEMBLER USER GUIDE. Developed and published by Expedience Software Copyright Expedience Software ASSEMBLER USER GUIDE Developed and published by Expedience Software Copyright 2012-2017 Expedience Software User Guide Contents About this Guide... 1 The User Guide 1 Install Expedience Ribbons... 2 Open

More information

Excel 2010 Level 1: The Excel Environment

Excel 2010 Level 1: The Excel Environment Excel 2010 Level 1: The Excel Environment Table of Contents The Excel 2010 Environment... 1 The Excel Window... 1 File Tab... 1 The Quick Access Toolbar... 4 Access the Customize the Quick Access Toolbar

More information

PigCHAMP Knowledge Software. Enterprise Edition Installation Guide

PigCHAMP Knowledge Software. Enterprise Edition Installation Guide PigCHAMP Knowledge Software Enterprise Edition Installation Guide PIGCHAMP, LLC Enterprise Edition Installation Guide JUNE 2016 EDITION PigCHAMP Knowledge Software 1531 Airport Rd Suite 101 Ames, IA 50010

More information

Standard CIP Cyber Security Critical Cyber Asset Identification

Standard CIP Cyber Security Critical Cyber Asset Identification Standard CIP 002 1 Cyber Security Critical Cyber Asset Identification Standard Development Roadmap This section is maintained by the drafting team during the development of the standard and will be removed

More information

PrimoPDF User Guide, Version 5.0

PrimoPDF User Guide, Version 5.0 Table of Contents Getting Started... 3 Installing PrimoPDF... 3 Reference Links... 4 Uninstallation... 5 Creating PDF Documents... 5 PrimoPDF Document Settings... 6 PDF Creation Profiles... 6 Document

More information

FileLoader for SharePoint

FileLoader for SharePoint Administrator s Guide FileLoader for SharePoint v. 2.0 Last Updated 6 September 2012 Contents Preface 3 FileLoader Users... 3 Getting Started with FileLoader 4 Configuring Connections to SharePoint 8

More information

Installing and Configuring Worldox/Web Mobile

Installing and Configuring Worldox/Web Mobile Installing and Configuring Worldox/Web Mobile SETUP GUIDE v 1.1 Revised 6/16/2009 REVISION HISTORY Version Date Author Description 1.0 10/20/2008 Michael Devito Revised and expanded original draft document.

More information

DiskBoss DATA MANAGEMENT

DiskBoss DATA MANAGEMENT DiskBoss DATA MANAGEMENT Disk Change Monitor Version 9.3 May 2018 www.diskboss.com info@flexense.com 1 1 Product Overview DiskBoss is an automated, policy-based data management solution allowing one to

More information

Electronic Owner s Manual User Guide

Electronic Owner s Manual User Guide Electronic Owner s Manual User Guide I. Getting Started.... 1 Logging In.... 2 Additional Information... 2 II. Searching for an Existing EOM Form... 5 III. Creating a New EOM Form.. 5 IV. Modifying an

More information

Standard CIP Cyber Security Critical Cyber Asset Identification

Standard CIP Cyber Security Critical Cyber Asset Identification Standard CIP 002 1 Cyber Security Critical Cyber Asset Identification Standard Development Roadmap This section is maintained by the drafting team during the development of the standard and will be removed

More information

Building Standards Department Markham eplan Applicant Handbook For Building Permits, Sign Permits and Zoning Preliminary Review

Building Standards Department Markham eplan Applicant Handbook For Building Permits, Sign Permits and Zoning Preliminary Review Markham eplan Applicant Handbook For Building Permits, Sign Permits and Zoning Preliminary Review In addition to this user manual, please refer to the instructions provided in the electronic forms (eforms)

More information

Installation and User Guide Worksoft Certify Content Merge

Installation and User Guide Worksoft Certify Content Merge Installation and User Guide Worksoft Certify Content Merge Worksoft, Inc. 15851 Dallas Parkway, Suite 855 Addison, TX 75001 www.worksoft.com 866-836-1773 Worksoft Certify Content Merge Installation and

More information

HP Project and Portfolio Management Center

HP Project and Portfolio Management Center HP Project and Portfolio Management Center Software Version: 8.00 Generating Fiscal Periods Document Release Date: July 2009 Software Release Date: July 2009 Legal Notices Warranty The only warranties

More information

PharmaReady Version 5.1 SPL Workflow Operational Qualification Complete Installation Software Release Date 28 Jan 2013

PharmaReady Version 5.1 SPL Workflow Operational Qualification Complete Installation Software Release Date 28 Jan 2013 PharmaReady Version 5.1 SPL Workflow al Qualification Complete Installation Software Release Date 28 Jan 2013 Doc Version 1.0 Jan 28, 2013 Company Name: Address: Project Name: Server Name: Server Location:

More information

Colligo Briefcase for Mac. Release Notes

Colligo Briefcase for Mac. Release Notes Colligo Briefcase for Mac Release Notes Contents Technical Requirements... 3 Release 7.5 06 Oct0ber 2017... 4 New in this Release... 4 Release 7.5 18 May 2017... 4 New in 7.5... 4 Issues 7.5... 5 Known

More information

ODBC DOCUMENTATION UPDATES

ODBC DOCUMENTATION UPDATES DOCUMENTATION UPDATES Date Description Where Changed 5/16/03 New upgrading instructions have been added to upgrade OpenLink to version 4.1. Getting Started chapter, in the Upgrading OpenLink section (page

More information

ODBC. Getting Started OpenLink Server Software Using ODBC

ODBC. Getting Started OpenLink Server Software Using ODBC Getting Started OpenLink Server Software Using The documentation in this publication is provided pursuant to a Sales and Licensing Contract for the Prophet 21 System entered into by and between Prophet

More information

CRITERION Vantage 3 Admin Training Manual Contents Introduction 5

CRITERION Vantage 3 Admin Training Manual Contents Introduction 5 CRITERION Vantage 3 Admin Training Manual Contents Introduction 5 Running Admin 6 Understanding the Admin Display 7 Using the System Viewer 11 Variables Characteristic Setup Window 19 Using the List Viewer

More information

DataCollect Administrative Tools Supporting DataCollect (CMDT 3900) Version 3.0.0

DataCollect Administrative Tools Supporting DataCollect (CMDT 3900) Version 3.0.0 Administrator Manual DataCollect Administrative Tools Supporting DataCollect (CMDT 3900) Version 3.0.0 P/N 15V-090-00054-100 Revision A SKF is a registered trademark of the SKF Group. All other trademarks

More information

PART 7. Getting Started with Excel

PART 7. Getting Started with Excel PART 7 Getting ed with Excel When you start the application, Excel displays a blank workbook. A workbook is a file in which you store your data, similar to a three-ring binder. Within a workbook are worksheets,

More information

Editing Course Tools and Properties to 8.4.1

Editing Course Tools and Properties to 8.4.1 Editing Course Tools and Properties 8.3.0 to 8.4.1 User Guide Revised April 16, 2009 Contents Editing course properties The Course Offering Information page Setting course colors Setting the course language

More information

EnGenius Mesh AP Reset Tool Quick Guide

EnGenius Mesh AP Reset Tool Quick Guide EnGenius Mesh AP Reset Tool Quick Guide Revision : 1.1 Table of Contents EnGenius MESH AP Reset Tool Quick Guide 1. Overview...3 2. Installation Procedure...3 3. Uninstallation Procedure...3 4. Tool Layout...4

More information

Test Information and Distribution Engine

Test Information and Distribution Engine SC-Alt Test Information and Distribution Engine User Guide 2018 2019 Published January 14, 2019 Prepared by the American Institutes for Research Descriptions of the operation of the Test Information Distribution

More information

Filing Electronically With the IRS FIRE System and Pro1099

Filing Electronically With the IRS FIRE System and Pro1099 Filing Electronically With the IRS FIRE System and Pro1099 SoftPro Select 4.0 Tax Year 2015 January 20, 2016 4800 Falls of Neuse Road, Suite 400 Raleigh, NC 27609 p (800) 848-0143 f (919) 755-8350 www.softprocorp.com

More information

Lasso Continuous Data Protection Lasso CDP Client Guide August 2005, Version Lasso CDP Client Guide Page 1 of All Rights Reserved.

Lasso Continuous Data Protection Lasso CDP Client Guide August 2005, Version Lasso CDP Client Guide Page 1 of All Rights Reserved. Lasso CDP Client Guide August 2005, Version 1.6.8 Lasso CDP Client Guide Page 1 of 32 Copyright Copyright 2005 Lasso Logic, LLC. All Rights Reserved. No part of this publication may be reproduced, stored

More information

NERC Transmission Availability Data System (TADS): Element Identifier Data Submission Addendum

NERC Transmission Availability Data System (TADS): Element Identifier Data Submission Addendum Transmission Availability Data System (TADS) Element Identifier Data Submission Addendum May 28, 2013 3353 Peachtree Road NE NERC Transmission Availability Data System (TADS): Element Identifier Data Submission

More information

Integrating Word with Excel

Integrating Word with Excel Integrating Word with Excel MICROSOFT OFFICE Microsoft Office contains a group of software programs sold together in one package. The programs in Office are designed to work independently and in conjunction

More information

Quick Start Guide. Table of contents. Browsing in the Navigator... 2 The Navigator makes browsing and navigation easier.

Quick Start Guide. Table of contents. Browsing in the Navigator... 2 The Navigator makes browsing and navigation easier. Table of contents Browsing in the Navigator... 2 The Navigator makes browsing and navigation easier. Searching in Windchill... 3 Quick and simple searches are always available at the top of the Windchill

More information

Outlook Web Access Exchange Server

Outlook Web Access Exchange Server Outlook Web Access Exchange Server Version 2.0 Information Technology Services 2008 Table of Contents I. INTRODUCTION... 1 II. GETTING STARTED... 1 A. Logging In and Existing Outlook Web Access... 1 B.

More information

Software Release Memo

Software Release Memo Siemens Industry, Inc. Software Release Memo Product Involved SR15939-80-8 Rev 1 March 2012 i config Graphical Configuration Utility, Version 4.01 i config Graphical Configuration Utility, version 4.01,

More information

PROMISE ARRAY MANAGEMENT ( PAM) USER MANUAL

PROMISE ARRAY MANAGEMENT ( PAM) USER MANUAL PROMISE ARRAY MANAGEMENT ( PAM) USER MANUAL Copyright 2002, Promise Technology, Inc. Copyright by Promise Technology, Inc. (Promise Technology). No part of this manual may be reproduced or transmitted

More information

HR-Lite Database & Web Service Setup Guide

HR-Lite Database & Web Service Setup Guide HR-Lite Database & Web Service Setup Guide Version: 1.00 HR21 Limited All rights reserved. No part of this document may be reproduced or transmitted in any form or by any means, electronic or mechanical,

More information

ProductCenter Database Merge Utility Installation Guide

ProductCenter Database Merge Utility Installation Guide ProductCenter Database Merge Utility Installation Guide Release 8.4.0 January, 2006 NorthRidge Software, LLC www.nridge.com (603) 434-2525 CONTENTS Introduction... 3 Planning Your Installation... 3 Location...

More information

Technical Publications 1View

Technical Publications 1View Technical Publications 1View 1View User Guides Overview... 2 Homepage... 4 Subscription Management... 5 Table of Contents (TOC)... 6 Navigation... 8 Search... 11 Linking... 13 Order List... 14 Graphic

More information

Online Backup Manager v7 Quick Start Guide for Synology NAS

Online Backup Manager v7 Quick Start Guide for Synology NAS Online Backup Manager v7 Quick Start Guide for Synology NAS Copyright Notice The use and copying of this product is subject to a license agreement. Any other use is prohibited. No part of this publication

More information

Adobe Document Cloud esign Services. for Salesforce Version 17 Installation and Customization Guide

Adobe Document Cloud esign Services. for Salesforce Version 17 Installation and Customization Guide Adobe Document Cloud esign Services for Salesforce Version 17 Installation and Customization Guide 2015 Adobe Systems Incorporated. All rights reserved. Last Updated: August 28, 2015 Table of Contents

More information

Great Start to Quality STARS Quality Improvement Consultants User Manual STARS - Systematic Tiered Assessment and Rating Solution

Great Start to Quality STARS Quality Improvement Consultants User Manual STARS - Systematic Tiered Assessment and Rating Solution STARS Quality Improvement Consultants User Manual STARS - Systematic Tiered Assessment and Rating Solution Table of Contents 1 Great Start to Quality... 4 1.1 Welcome... 4 1.2 Introduction to the Great

More information

Navigator Software User s Manual. User Manual. Navigator Software. Monarch Instrument Rev 0.98 May Page 1 of 17

Navigator Software User s Manual. User Manual. Navigator Software. Monarch Instrument Rev 0.98 May Page 1 of 17 User Manual Navigator Software Monarch Instrument Rev 0.98 May 2006 Page 1 of 17 Contents 1. NAVIGATOR SOFTWARE 2. INSTALLATION 3. USING NAVIGATOR SOFTWARE 3.1 STARTING THE PROGRAM 3.2 SYSTEM SET UP 3.3

More information

Roxen Content Provider

Roxen Content Provider Roxen Content Provider Generation 3 Templates Purpose This workbook is designed to provide a training and reference tool for placing University of Alaska information on the World Wide Web (WWW) using the

More information

Instruction Guide COMMERCIAL EEPM 2.0 ENERGY EFFICIENCY PROGRAM

Instruction Guide COMMERCIAL EEPM 2.0 ENERGY EFFICIENCY PROGRAM Instruction Guide COMMERCIAL EEPM 2.0 ENERGY EFFICIENCY PROGRAM 3 2 TABLE OF CONTENTS About EEPM 2.0...3 Program Participation...4 Service Provider Dashboard...5. Program Option Activity...5 Eligible Programs...5

More information

EXCEL CONNECT USER GUIDE

EXCEL CONNECT USER GUIDE USER GUIDE Developed and published by Expedience Software Copyright 2012-2017 Expedience Software Excel Connect Contents About this Guide... 1 The Style Palette User Guide 1 Excel Connect Overview... 2

More information

University of North Carolina at Charlotte

University of North Carolina at Charlotte University of North Carolina at Charlotte Facilities Management Procedures Manual v1.0 Delivered by PMOLink, LLC December 15-16, 2009 2009 All rights reserved. No part of this publication may be reproduced

More information

ArtfulBits Web Part

ArtfulBits  Web Part ArtfulBits Email Web Part for Microsoft SharePoint User Guide Overview... 2 Feature List... 3 Why ArtfulBits Email Web Part?... 3 How to Use... 3 How to Use Email Web Part... 3 Enabling to Send E-mail

More information

Equitrac Embedded for Sharp OSA

Equitrac Embedded for Sharp OSA Equitrac Embedded for Sharp OSA 1.4 Setup Guide 2014 Equitrac Embedded for Sharp OSA Setup Guide Revision Date Revision List September, 2014 Updated for Equitrac Office/Express 5.4 April 16, 2013 Updated

More information

Cobra Navigation Release 2011

Cobra Navigation Release 2011 Cobra Navigation Release 2011 Cobra Navigation - Rev.0.2 Date: November 27 2012 jmaas@flowserve.com Page 1 of 34 Contents Contents 1 Revision History... 5 2 Introduction.... 6 3 Cobra Login... 7 3.1 Initial

More information

Backup App v7. Quick Start Guide for Windows

Backup App v7. Quick Start Guide for Windows Backup App v7 Quick Start Guide for Windows Revision History Date Descriptions Type of modification 30 Jun 2016 First Draft New 25 Nov 2016 Added Restore Options to Ch 8 Restore Data; Combined Technical

More information

WLAN MIERUZZO BASIC SOFTWARE

WLAN MIERUZZO BASIC SOFTWARE DK-5000 Series WLAN MIERUZZO BASIC SOFTWARE USER S MANUAL DK-5005A, DK-5010A, DK-5030A DK-5005B, DK-5010B, DK-5030B DK-5005C, DK-5010C, DK-5030C DK-5005D, DK-5010D, DK-5030D This manual was last revised

More information

PROMISE ARRAY MANAGEMENT ( PAM) FOR FastTrak S150 TX2plus, S150 TX4 and TX4000. User Manual. Version 1.3

PROMISE ARRAY MANAGEMENT ( PAM) FOR FastTrak S150 TX2plus, S150 TX4 and TX4000. User Manual. Version 1.3 PROMISE ARRAY MANAGEMENT ( PAM) FOR FastTrak S150 TX2plus, S150 TX4 and TX4000 User Manual Version 1.3 Promise Array Management Copyright 2003 Promise Technology, Inc. All Rights Reserved. Copyright by

More information

Learning Worksheet Fundamentals

Learning Worksheet Fundamentals 1.1 LESSON 1 Learning Worksheet Fundamentals After completing this lesson, you will be able to: Create a workbook. Create a workbook from a template. Understand Microsoft Excel window elements. Select

More information