Efficient SAS Quality Checks: Unique Error Identification And Enhanced Data Management Analysis

Jim Grudzinski, Biostatistics Manager of SAS Programming, Covance Periapproval Services Inc., Radnor, PA

ABSTRACT

The pressures of protocol timelines require an efficient process of quality checks and data error identification. Previously, at Covance, quality checks were run cumulatively and/or over a set time period. The data managers had to review reams of hard copy reports, manually check the issued Data Clarification Forms (DCFs) for previously identified errors, identify new quality check errors, and generate the necessary DCFs. The quality check errors that could not be corrected would still appear on the hard copy reports, creating redundant review effort. This was a time consuming and inefficient process. This paper offers one way of providing an efficient SAS quality check process, leading to unique error identification and analysis. The process integrates SAS and Oracle in a Unix environment. The SAS programming code detailed in the paper is intentionally simplistic for demonstration purposes.

INTRODUCTION

The purpose of quality check diagnostics is to indicate data that are invalid or missing, to ensure there are no keying problems, and to retrieve missing or invalid information from the site via data clarification forms (DCFs). The initial case report form (CRF) data are imported into an Oracle database. The data are then accessed by SAS via PROC SQL, and the quality checks are programmed in SAS. The cumulative quality check errors are identified and compared to previously identified errors. The new quality check errors are then tagged with a DCF description and a unique tracking id number. The new quality check errors are stored in a SAS data set and appended to an Oracle table using PROC ACCESS and PROC APPEND. The data are then available for data management analysis. Upon data management review, DCFs are generated using a Microsoft Access environment.

REVIEW QUALITY CHECKS

Efficient quality check programming starts before you program the first diagnostic. You should review the protocol and, if relevant, the proposed table shells. You should have an awareness of what the protocol is all about and a sense of what data are critical to the final analysis. You should review the Oracle database. Check for names that exceed 8 characters. Check for continuity of names across tables; for example, patient number is always PTNO for all tables. Also, check that similarly named variables are of the same data type, e.g. PTNO is defined as numeric across all tables. You should review the proposed SAS quality checks for effectiveness and accuracy, and familiarize yourself with the test data.
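Part of this database review can be automated. The sketch below is hypothetical and not part of the original paper: it assumes read access to Oracle's ALL_TAB_COLUMNS dictionary view, reuses the paper's placeholder connection values, and treats DB999 as a placeholder schema name. It flags column names longer than 8 characters and column names defined with more than one data type across tables.

***hypothetical review helper: flag long names and inconsistent data types***;
proc sql;
   connect to oracle as db (user=kengriffrey orapw=seattle path="xxx");

   ***column names longer than 8 characters***;
   create table longnames as
      select * from connection to db
         (select table_name, column_name
            from all_tab_columns
            where owner = 'DB999'              /* placeholder schema */
              and length(column_name) > 8);

   ***same column name defined with more than one data type across tables***;
   create table typeconflicts as
      select * from connection to db
         (select column_name, count(distinct data_type) as ntypes
            from all_tab_columns
            where owner = 'DB999'
            group by column_name
            having count(distinct data_type) > 1);

   disconnect from db;
quit;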
DATA CLARIFICATION FORM TRACKING ORACLE TABLE

The data clarification form tracking Oracle table is critical. This is the table to which all quality check errors will be appended, and the table that data management will use for analysis and generation of DCFs. You and all other relevant project members must agree upon all data fields and their use. Start by identifying the key primary fields; these are the fields that make a record unique. Also identify the fields that will be populated by you, the SAS programmer, and those populated by other members of the project team, i.e. the data managers. Listed below is a typical DCF Oracle tracking table. Remember that the actual components of your table will vary depending upon the protocol and your specific needs.

Figure 1

Name       Null?      Type
--------   --------   --------
TRACKNO    NOT NULL   NUMBER
CENTRE     NOT NULL   VARCHAR2
PATIENT    NOT NULL   NUMBER
VISIT      NOT NULL   VARCHAR2
ERRORCD    NOT NULL   VARCHAR2
PTINIT                VARCHAR2
TABNAME               VARCHAR2
ERRORTXT              VARCHAR2
CRFDATA1              VARCHAR2
CRFDATA2              VARCHAR2
DCFGEN                VARCHAR2
DATEGEN               DATE
DATERES               DATE
USERID                VARCHAR2
ADDDTE                DATE

CENTRE, PATIENT, VISIT and PTINIT are Oracle table data fields and have been determined to be key fields (fields that help define uniqueness). TRACKNO is created by the SAS program and is a unique sequence number. TABNAME is created by the SAS program and represents the Oracle table name. ERRORCD is the unique quality check query identifier and is defined in the SAS program. ERRORTXT is defined in the DCF Error Text table (see Figure 2). CRFDATA1 and CRFDATA2 are assigned by the SAS program and represent what is actually on the CRF; there are two of these because some quality check queries compare two data fields, and the DCF is clearer if both are included on it. The data managers control the remaining data fields. DCFGEN is assigned a default value of N in the SAS program; this field indicates whether a DCF has been generated for the quality check error. Since the SAS program appends only new quality check errors, the N represents NO DCF GENERATED. Once a DCF has been generated, this value changes to Y. DATEGEN indicates the date when the DCF query was generated. DATERES indicates when the DCF was resolved; the resolution status could be Y = resolved, N = not resolved, with a potential third option, I = ignore. The ignore option can be used if the query cannot be resolved, for instance if the site has closed out, or if the query resolution is not critical to the overall protocol. USERID and ADDDTE can be standard identification parameters.
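For readers who prefer code to a column listing, the Figure 1 layout can also be expressed as a pass-through CREATE TABLE. This is only a hypothetical sketch, not code from the paper: the VARCHAR2 lengths are placeholders the paper does not specify, the connection values are the paper's placeholders, and your own field list should follow whatever your project team agrees on.

***hypothetical DDL for the DCF tracking table (lengths are placeholders)***;
proc sql;
   connect to oracle as db (user=kengriffrey orapw=seattle path="xxx");
   execute (
      create table dcftrack (
         trackno   number        not null,
         centre    varchar2(10)  not null,
         patient   number        not null,
         visit     varchar2(10)  not null,
         errorcd   varchar2(8)   not null,
         ptinit    varchar2(4),
         tabname   varchar2(8),
         errortxt  varchar2(200),
         crfdata1  varchar2(20),
         crfdata2  varchar2(20),
         dcfgen    varchar2(1),
         dategen   date,
         dateres   date,
         userid    varchar2(20),
         adddte    date
      )
   ) by db;
   disconnect from db;
quit;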

DATA CLARIFICATION FORM ERROR TEXT ORACLE TABLE

The data clarification form error text Oracle table simply contains the unique error code identifier (ERRORCD) and its related error text field (ERRORTXT). Each quality check, when originally defined, should have a unique error code assigned to it. The error code not only helps establish the uniqueness of the quality check error, but also allows easy appending of the related error text to the record. The error code definition need not be complicated; it could be the first few letters of the Oracle table being queried plus the number of the query. USERID and ADDDTE can be standard identification parameters. It is very important to confirm that all error codes on the quality check specifications have a corresponding error code on the error text table.

Figure 2

Name       Null?      Type
--------   --------   --------
ERRORCD    NOT NULL   VARCHAR2
ERRORTXT              VARCHAR2
USERID                VARCHAR2
ADDDTE                DATE

Be very careful about which personnel you assign insert (append) and/or delete rights to these tables.

SAS QUALITY CHECK PROGRAM STEPS

For our example we will be processing quality checks on the DEMOG (demography) table. The SAS program is qualdiag.sas.

IMPORT NECESSARY ORACLE TABLES DATA

Identify the Oracle tables needed for your quality check diagnostics. Also, since you will need the DCF tracking table and the DCF error text table, it could be efficient to import these data at this point. To do this, you will use the PROC SQL procedure. Figure 3 is an example of the procedure creating SAS data sets from Oracle tables. This example also includes a macro routine that enables you to access multiple Oracle tables in a single routine.

Figure 3

%macro retrieve(tbname,dset);
   proc sql;
      connect to oracle as db (user=kengriffrey orapw=seattle path="xxx");
      create table &dset as
         select * from connection to db
            (select * from &tbname);
      disconnect from db;
   quit;
%mend retrieve;

%retrieve(db999.demog,demog);
%retrieve(db999.dcftrack,dcftrack);
%retrieve(db999.errortxt,errortxt);

Normally, the more efficient method is to create SAS views at the start of the study and access the Oracle data through these views. Remember that views give you real-time data; you must determine whether you can use SAS views or should stay with SAS data sets. To create a SAS view, replace the word table with the word view.
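Following that table-to-view substitution, here is a minimal sketch of a view-building version of the Figure 3 macro. The macro name and the demogv output name are illustrative only and are not part of the original program.

***hypothetical view-building variant of the Figure 3 macro***;
%macro retrieve_view(tbname,dset);
   proc sql;
      connect to oracle as db (user=kengriffrey orapw=seattle path="xxx");
      ***CREATE VIEW instead of CREATE TABLE: the Oracle data are read whenever the view is referenced (real-time data, as noted above)***;
      create view &dset as
         select * from connection to db
            (select * from &tbname);
      disconnect from db;
   quit;
%mend retrieve_view;

%retrieve_view(db999.demog,demogv);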
DEMOGRAPHY TABLE QUALITY CHECK

The following quality check (Figure 4) is a simple example, but it is good enough for our purposes. The protocol requires that the SEX variable in the DEMOG table must be M or F. It is also the first quality check query written for the DEMOG table. The ERRORCD variable has been defined as the first initials of the table plus the number of the quality check query; therefore the value of ERRORCD in this example is DEM01. The TABNAME variable has been determined to be DEMOG (remember the 8-character limit). The quality check query is structured so that you have the option of printing the hard copy results. This is the process that would produce reams of paper, mentioned in the ABSTRACT portion of this paper. At the same time, you are collecting the data in the format necessary for matching against the DCF TRACKING table and the DCF ERROR TEXT table, and for appending the new quality check errors back to the DCF TRACKING Oracle table.

Figure 4

***DEM01***;
data dem01(keep=centre ptno ptinit tabname errorcd crfdata1 crfdata2 visit sex test01);
   set demog;
   format crfdata1 crfdata2 $20. tabname $8.;
   ***data field tabname not on Oracle table***;
   tabname='DEMOG';
   ***activate error code***;
   errorcd='DEM01';
   ***CRF data activation***;
   crfdata1=sex;
   crfdata2='NOT USED';
   ***crfdata2 is often used with cross-table queries and date comparisons***;
   ***actual quality check query***;
   if sex in('M','F') then test01='PASS';
   else test01='FAIL';
run;

***do a frequency on test01 and print to qualdiag.lst a detail listing of the failures***;
proc freq data=dem01;
   table test01 / list missing;
   title1 'Quality Checks for Protocol X';
   title2 'Table Demog DEM01';
   title3 'Sex must be M or F';
run;

proc print data=dem01;
   var errorcd tabname centre ptno ptinit visit crfdata1 crfdata2 sex;
   where test01='FAIL';
run;

***keep only failures and the data you absolutely need***;
data dem01(drop=sex test01);
   set dem01;
   if test01='FAIL';
run;

This process repeats itself, based on the number of quality checks required for the DEMOG table. For our example, let us assume one more quality check for DEMOG; you would end up with a SAS data set dem02 with all the failures for that particular test. You now concatenate all DEMOG error data sets into one. The quality check process prior to this concatenation represents the traditional process: at that point, all you have available is a hard copy report.

Figure 5

data demogall;
   set dem01 dem02;
run;

IDENTIFY THE LAST TRACKNO VALUE

If you remember, the value TRACKNO is created by the SAS program. Its purpose is to attach a unique sequence number to the quality check errors being appended to the DCF TRACKING table. Since, over the course of the study, it is logical to expect additions and deletions of entries made directly to the DCF TRACKING table by data managers, you cannot simply count the number of observations and expect that value to be the last TRACKNO used. The following example shows one way of getting the correct last TRACKNO in use. It also shows a use for a %GLOBAL macro variable.

Figure 6

***use the DCFTRACK data set created earlier***;
data tracksum(keep=count trackno);
   set dcftrack nobs=n;
   format count 5.;
   count=n;
run;

***sort data set***;
proc sort data=tracksum;
   by count trackno;
run;

***identify the last observation***;
data tracksum;
   set tracksum;
   by count;
   ***keep the last observation only***;
   if last.count;
run;

***create global macro variable***;
data tracksum;
   set tracksum;
   %global totnum;
   call symput('totnum',trim(left(trackno)));
run;

CONCATENATE ALL QUALITY CHECK ERRORS FROM ALL TABLES

Our example has only one data set of quality check errors, but more than likely you will have several. The data set ALL truly has all the known quality check errors.

Figure 7

data all;
   set demogall;
   ***here is a good place to set default values***;
   dcfgen='N';
run;
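As an aside, the last-TRACKNO lookup in Figure 6 can be collapsed into a single PROC SQL step. This is a hypothetical alternative, not part of the original program; it assumes the DCFTRACK extract created in Figure 3, and COALESCE guards against an empty tracking table at the start of a study.

***one-step alternative to Figure 6: put the highest TRACKNO into a macro variable***;
proc sql noprint;
   select coalesce(max(trackno),0) into :totnum
   from dcftrack;
quit;

***reassigning the value strips the leading blanks that INTO: leaves on a numeric result***;
%let totnum = &totnum;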

MERGE SAS DATA SET ALL WITH DCF TRACKING TABLE

The data set ALL, from Figure 7, contains all the quality check errors for all the tables. The DCF TRACKING table has all the quality check errors that have been identified from previous runs. You must have both data sets sorted by all of the primary keys. The primary keys are the data fields that make the record unique; for our example the primary keys are centre, ptno, visit, ptinit, tabname, crfdata1, crfdata2, and errorcd. Figure 8 displays the merge logic.

Figure 8

data all;
   merge all(in=in1) dcftrack(in=in2);
   ***the key variable names must match in both data sets: the quality check data sets use PTNO while the Oracle tracking table uses PATIENT, so rename one to the other before this merge***;
   by centre patient visit ptinit tabname crfdata1 crfdata2 errorcd;
   ***already on dcftrack***;
   if in1 and in2 then delete;
   ***placed on dcftrack independently of the SAS quality check programs***;
   if in1=0 and in2=1 then delete;
   ***new quality check errors***;
   if in1=1;
run;

You now have the data set ALL with only the new quality check errors. A quality check error is considered new even if only one of the primary keys is different.

ASSIGN THE NEW QUALITY CHECKS A UNIQUE TRACK NUMBER

You now have to assign a sequential tracking number to the data field TRACKNO. The tracking number serves as a reference point for the DCFs that are generated and sent to the sites; you can keep track of the DCFs that have been returned and those that are still outstanding.

Figure 9

data all;
   set all;
   retain count;                  ***you need the retain***;
   ***&totnum refers to figure 6***;
   if _n_=1 then count=&totnum;
   count=count+1;
   trackno=count;
run;

***do not need the data field count***;
data all(drop=count);
   set all;
run;

MERGE SAS DATA SET ALL WITH DCF ERROR TEXT TABLE

The only data needed to make the data set ALL complete is the error text data. The error text is the description of the quality check error. You should take great care in creating the error text data; the error text description will appear verbatim on the DCFs. In practice, for quality check errors where the data manager knows in advance that a DCF will not be generated, a portion of the error text is often "No DCF to be issued." Based on our example of a quality check on the demographic variable SEX, an appropriate error text would be "Mandatory field left blank, please clarify patient's gender." The Figure 10 example assumes that the data sets have now been sorted by ERRORCD.

Figure 10

data all;
   merge all(in=in1) errortxt(in=in2);
   by errorcd;
   if in1 and in2;
run;
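Before moving on to the printing options, it can be worth cross-checking the Figure 8 "new errors" logic independently. The sketch below is a hypothetical PROC SQL equivalent, not part of the original program: it copies ALL before the Figure 8 merge, uses the key variables as named in Figure 8, and writes the new errors to a separate data set so the result can be compared with the Figure 8 output (for example, with PROC COMPARE).

***take a copy of ALL before the Figure 8 merge***;
data all_pre;
   set all;
run;

***hypothetical cross-check: keep rows with no matching key on DCFTRACK***;
proc sql;
   create table new_errors as
      select a.*
      from all_pre as a
      where not exists
         (select 1
          from dcftrack as d
          where a.centre   = d.centre
            and a.patient  = d.patient
            and a.visit    = d.visit
            and a.ptinit   = d.ptinit
            and a.tabname  = d.tabname
            and a.crfdata1 = d.crfdata1
            and a.crfdata2 = d.crfdata2
            and a.errorcd  = d.errorcd);
quit;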
PRINTING OPTIONS

Our example program is named qualdiag.sas. The traditional quality check process (Figure 4) prints out under qualdiag.lst. You may not want to print these data because of their size and the printing time required; also, you usually only want to see what you have just appended to the DCF TRACKING table. A plain PROC PRINT would also go to qualdiag.lst, so the SAS procedure you want to use is PROC PRINTTO, which redirects the output to a file of your choosing. You have the option of sorting the data for quick review; two ways to sort the data are by TRACKNO or by ERRORCD. A printout sorted by TRACKNO would be a sequential listing, with centre and patient also grouping the quality check errors, which is the way they appear on the DCFTRACK Oracle table. A separate printout by error code should include a PROC FREQ, which gives a quick snapshot of which quality checks are failing and how often. This review is good for debugging the quality checks, aids in validation, and can give the data manager an indication of a serious problem if certain quality checks are failing in large numbers. Figure 11 shows only the PROC PRINTTO option, with the data sorted by ERRORCD.

Figure 11

***select directory and rename the printout***;
proc printto print='/db999/quality/newsort.lst' new;
run;

proc freq data=all;
   table errorcd / list missing;
   title1 'Project db999';
   title2 'Frequency of ERROR CODES';
   footnote1 'Printout is /db999/quality/newsort.lst';
run;

proc print data=all noobs;
   title2 'Listing of New Quality Check Diagnostic Errors';
   title3 'Sorted by ERRORCD';
   title4 "As Of &sysdate";
   var trackno errorcd centre patient visit ptinit tabname errortxt crfdata1 crfdata2;
run;
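One housekeeping step the figure leaves implicit: once the listing has been written, PROC PRINTTO with no options points subsequent procedure output back to the default destination. A minimal sketch:

***restore the default print destination after the new-errors listing***;
proc printto;
run;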

APPEND QUALITY CHECK ERRORS TO DCF TRACKING ORACLE TABLE

You are now ready to append the quality check errors from the SAS data set ALL to the Oracle table DCFTRACK. The data types must be the same, and the SAS data set fields cannot have a length longer than their corresponding fields in the Oracle table; if this condition exists, no data will be appended to the Oracle table. The primary keys, as defined on the Oracle table, cannot have any NULL values on the SAS data set; if this condition exists, the SAS data will be appended to the Oracle table only until the condition is encountered. You should definitely check your log when executing this quality check process.

The SAS procedures involved in this process are PROC ACCESS and PROC APPEND. The PROC ACCESS procedure allows you to create an ACCESS DESCRIPTOR and a VIEW DESCRIPTOR. The ACCESS DESCRIPTOR describes how the Oracle table will be accessed and allows the creation of a VIEW DESCRIPTOR. The VIEW DESCRIPTOR acts as the window to the DCFTRACK Oracle table, and it is the VIEW DESCRIPTOR that serves as the base data set in the PROC APPEND procedure. The PROC APPEND procedure adds the data from the SAS data set ALL to the DCFTRACK Oracle table via the VIEW DESCRIPTOR. Figure 12 is an example of this combined PROC ACCESS and PROC APPEND procedure.

Figure 12

***access the Oracle database***;
proc access dbms=oracle;
   ***create the access descriptor***;
   create work.qchecks.access;
   user=kengriffrey;
   orapw=seattle;
   path='xxx';
   table=db999.dcftrack;
   ***create the view descriptor***;
   create work.qchecks.view;
run;

***the FORCE option allows the appending of data even if the variables do not match exactly***;
proc append base=qchecks data=all force;
run;

The new quality checks are now appended to the DCFTRACK Oracle table. You should check the SAS log and make sure the number of records appended is correct.

CONCLUSION

This quality check process has reduced the time that data managers need to review the quality check diagnostics. You no longer have to manually review hard copy output and then manually update Microsoft Access tables or, worse yet, manually generate Data Clarification Forms. By identifying only new quality check errors and loading these errors into an Oracle table, the whole process has been streamlined and is ultimately less costly. While this presentation offered a simplified example, the whole process would remain the same regardless of the number of quality checks required and the number of tables involved (except that the SAS program(s) would be written with more efficiencies). Table 1 at the end of this paper is a flowchart of the process reviewed in this paper.

REFERENCES

SAS Institute Inc. (1998), Processing Database and Spreadsheet Data with SAS/ACCESS Software, Cary, NC: SAS Institute Inc.

TRADEMARKS

SAS is a registered trademark or trademark of SAS Institute Inc. in the USA and other countries. ® indicates USA registration.

CONTACT INFORMATION

If you have additional questions, or would like an electronic copy of the code, feel free to contact me.

Jim Grudzinski, Biostatistics Manager of SAS Programming
Covance Periapproval Services Inc.
One Radnor Corporate Center
Radnor, PA 19087
(610) 975-3823
james.grudzinski@covance.com

Table 1: Quality Check Diagnostic Process

1. qualdiag.sas, the SAS quality check diagnostic program(s), runs the checks and builds the SAS data set "ALL" of cumulative QC errors. Optionally, qualdiag.lst provides the traditional data listing of all cumulative diagnostic errors.
2. The DCF Tracking Table (Oracle) contains the cumulative listing of all identified QC errors, keyed by Centre, Ptno, Errorcd, Crfdata, and the other key identifiers.
3. The DCF Tracking Table is merged with the SAS data set "ALL" by the key identifiers. If a quality check error was previously reported, no action is taken; otherwise it is treated as new.
4. The DCF Error Text Table (error code/error text Oracle table) supplies the error text: the "new" quality check errors are matched to their unique error text by error code.
5. PROC ACCESS and PROC APPEND append the "new" quality check errors to the DCF Tracking Oracle table.
6. PROC PRINTTO produces the data listing of the "new" diagnostic errors.