eSource Initiative ISSUES RELATED TO NON-CRF DATA PRACTICES

ISSUES RELATED TO NON-CRF DATA PRACTICES

Introduction

Non-Case Report Form (CRF) data are defined in Optimizing the Use of Electronic Data Sources in Clinical Trials, The Landscape, Part 1 [1] as data "which include collection and transfer of data in electronic format from internal clinical trial sponsor sources (e.g., specialty laboratories) or external vendors (e.g., laboratory results, imaging, ECG, randomization, drug accountability) into clinical research data repositories/warehouses without entering the data on a CRF". Although sponsors must collect and retain these data, there is currently no applicable industry guidance. Well-thought-out practices and procedures for the collection and retention of non-CRF data are crucial to maintaining data accuracy and integrity. To guide the development of such practices and procedures, this paper outlines certain high-level best practices related to handling and managing non-CRF data. These are high-level suggestions, as the purpose is to share learnings to enable more efficient and effective use and/or adoption of eSource. The suggestions can and should be tailored to each company's individual circumstances. These considerations can also help companies to address or reduce challenges that they have encountered and/or to initiate discussions to help improve processes.

1. Considering Availability of Standard Specifications & File Formats

a. Benefits
i. Predefines the format of incoming data
ii. Facilitates integration with other systems
iii. Can enable automation, which increases the likelihood of discovering errors
iv. Aids interpretation and understanding of expected sponsor requirements
v. Enables development of standard preprocessing procedures needed when receiving files
vi. Creates efficiencies with vendors as they set up their systems in ways that satisfy specifications across sponsors
vii. Can be used across vendors to help with interoperability
viii. Utilizing standards helps achieve efficiencies by minimizing time and expenses spent on non-CRF activities

b. Potential approaches/options
i. Setting up Standard Data Transfer Specifications for non-CRF data by data type (e.g., pharmacokinetic (PK), safety, and biomarker) and/or therapeutic area can generate efficiencies
ii. If possible, harmonize standards company-wide
iii. Consider using existing industry standards as a starting point for non-CRF data to create a template for data specifications and file formats (e.g., CDISC's Study Data Tabulation Model [SDTM])
iv. Explore whether the vendor is capable of providing data according to industry standards or has a standardized/validated export data format
v. A sponsor might also create alternative processes for those vendors that are not capable of providing data in standard format. Discuss with the vendor what they can do and what could be done by the sponsor/company (e.g., new programming may accommodate vendor-specific variables, but conversion to required controlled terminology is not necessary if the vendor can adhere to industry standards)
vi. Consider including all transfer details in the Data Transfer Specifications, such as but not limited to:
1. File naming conventions
2. File format (e.g., Excel, CSV, text)
3. Metadata for content
4. Variable names/labels
5. Allowable values and controlled terminology
6. Transfer frequency
7. Whether the file should be cumulative or incremental
8. Blinding instructions
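The transfer details listed above can be captured in machine-readable form so that one specification drives both vendor setup and sponsor-side validation. The following is a minimal sketch; every field name and example value is an illustrative assumption, not part of any industry standard.

```python
from dataclasses import dataclass

# Hypothetical machine-readable Data Transfer Specification.
# Field names and example values are illustrative assumptions.

@dataclass
class TransferSpec:
    data_type: str          # e.g., "PK", "safety", "biomarker"
    file_name_pattern: str  # regex for the agreed naming convention
    file_format: str        # e.g., "CSV"
    variables: dict         # variable name -> label
    controlled_terms: dict  # variable name -> allowable values
    transfer_frequency: str # e.g., "weekly"
    cumulative: bool        # cumulative vs. incremental transfers
    blinded: bool           # whether blinding instructions apply

pk_spec = TransferSpec(
    data_type="PK",
    file_name_pattern=r"^STUDY123_PK_\d{8}\.csv$",
    file_format="CSV",
    variables={"SUBJID": "Subject identifier", "PKCONC": "Analyte concentration"},
    controlled_terms={"PKSTAT": ["DONE", "NOT DONE"]},
    transfer_frequency="weekly",
    cumulative=True,
    blinded=False,
)

print(pk_spec.data_type, pk_spec.cumulative)  # prints: PK True
```

A specification object like this could be harmonized company-wide and instantiated per study and data type, in line with the approaches above.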

2. Considering Standard Processes for Handling Non-CRF Data

a. Benefits
i. Consistency reduces the potential for error and re-work
ii. Can reduce timelines
iii. Can reduce resources needed
iv. Reduces programming effort and the chance of errors
v. Reduces training (especially for new employees)
vi. Meets regulatory and other stakeholders' expectations
vii. Provides clarity for study teams and during formal inspections

b. Potential approaches
i. Consider defining the data flow process first, if possible (see Defining Data Flow Process, section 3 below), and defining data handling processes that reflect and align with the expected data flow
ii. Consider clearly defining the expected use of standard processes so it is understood by the appropriate functional roles (e.g., provide training to those roles)
iii. If a Data Standard Specification template does not exist, consider using existing study specifications and work to create a standard template
iv. Discuss setting data cleaning processes and standard checks during study start-up
v. Consider using a software tool that can standardize checks, reports, etc., by data type
vi. Think about utilizing preferred vendors that have experience working with sponsors and skilled personnel
vii. As part of the data transfer specifications, have the vendor specify which checks they are performing on the data and on the data load
viii. Specify which checks will be performed by the sponsor on the data load (e.g., that the number of observations is higher in each subsequent cumulative data transfer)
ix. Suggest archiving non-CRF data specifications in the Trial Master File
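One of the sponsor-side checks mentioned above, that a cumulative data transfer should never contain fewer records than the previous one, can be sketched as a small standard check. The function name and input shape are illustrative assumptions.

```python
# Minimal sketch of a standard sponsor-side check on cumulative transfers:
# each successive file should contain at least as many records as the last.
# Function and variable names are illustrative assumptions.

def check_cumulative_growth(record_counts):
    """Return (transfer index, previous count, current count) tuples
    for every cumulative transfer that unexpectedly shrank."""
    issues = []
    for i in range(1, len(record_counts)):
        prev, curr = record_counts[i - 1], record_counts[i]
        if curr < prev:
            issues.append((i, prev, curr))
    return issues

# Record counts per successive transfer; the third transfer lost records.
counts = [120, 180, 150, 210]
print(check_cumulative_growth(counts))  # -> [(2, 180, 150)]
```

A check like this could be one of many in a standard, data-type-specific check library, as suggested in approach v above.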

3. Defining Data Flow Process

a. Benefits
i. Outlines the process definition
ii. Identifies the sender and receiver as well as the data originator and source data location
iii. Provides data lineage, integrity, and clarity (e.g., the ability to limit access and allow version control)
iv. Ensures audit and inspection readiness
v. Helps ensure IT resources are adequately prepared (e.g., System Owner, Data Owner, server storage, security, and compliance)
vi. Enables alignment with stakeholders' process flow
vii. Provides structure and a framework for non-CRF data
viii. Helps to identify areas for improvement and innovation

b. Potential approaches/options
i. Consider creating a standard flowchart and a list of roles to notify when data come in, by study
ii. Consider having an automated method for when files are posted, retrieved, and placed in a company's system platform (e.g., receive an email alert when a file is posted, have scripts that automatically retrieve files and place them in the system platform under the corresponding study directory, and receive an email alert once completed)
iii. Communicate the data flow process to the team, and solicit feedback and comments
iv. Think about revisiting and updating the data flow process at an agreed-upon frequency (and whenever processes fail or improvements can be made)
v. Design an agile approach that keeps the flow consistent but includes flexibility
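The automated retrieve-and-place step described above might be sketched as a small routing script that matches posted files to a study by filename and moves them into that study's directory. The naming convention and directory layout are illustrative assumptions; in practice this would be paired with the email alerts mentioned above.

```python
import re
import shutil
import tempfile
from pathlib import Path

# Sketch of automated file routing: files posted to an inbox are matched
# to a study by filename and moved into that study's directory.
# The naming convention and layout are illustrative assumptions.

STUDY_PATTERN = re.compile(r"^(STUDY\d+)_.*\.(csv|txt)$")

def route_incoming(inbox: Path, root: Path):
    """Move each recognized file into <root>/<study id>/ and report the moves."""
    moves = []
    for posted in sorted(inbox.iterdir()):
        match = STUDY_PATTERN.match(posted.name)
        if not match:
            continue  # leave unrecognized files for manual review
        dest_dir = root / match.group(1)
        dest_dir.mkdir(parents=True, exist_ok=True)
        shutil.move(str(posted), str(dest_dir / posted.name))
        moves.append((posted.name, match.group(1)))
    return moves

# Demonstration with a throwaway directory standing in for the transfer inbox.
base = Path(tempfile.mkdtemp())
inbox = base / "inbox"
root = base / "studies"
inbox.mkdir()
root.mkdir()
(inbox / "STUDY123_PK_20240101.csv").write_text("SUBJID,PKCONC\n")
moves = route_incoming(inbox, root)
print(moves)  # -> [('STUDY123_PK_20240101.csv', 'STUDY123')]
```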

c. An example data flow is shown in the diagram below; the numbers identify the non-CRF data flow processes listed in this document.

[Diagram: non-CRF data is transferred from an external vendor / non-CRF data provider (4) to the sponsor's non-CRF data team and statistical programming team, via a transfer method such as a web application or sFTP (5), against agreed specifications (1), with separate TEST and PRODUCTION paths.]

*Data Flow Processes
1. Considering Availability of Standard Specifications & File Formats
2. Considering Standard Processes for Handling Non-CRF Data
3. Defining Data Flow Process
4. Using a Test Transfer File
5. Using a Secure Transfer Method
6. Data Review, Cleaning and Reconciliation

4. Using a Test Transfer File

a. Benefits
i. Ensures the transfer file process is adhered to and functioning as planned (e.g., access to FTP); end-to-end testing ensures the system component is in place
ii. Enables the sponsor to verify that the file format complies with the sponsor's specifications (e.g., file naming conventions are in place as expected)
iii. Also enables the team to refine the specifications in case the format of the collected data differs from what was anticipated during the design phase
iv. Ensures the provider and sponsor are aligned prior to moving forward with production transfers
v. Ensures study teams and other relevant sponsor functions agree with expectations and what should be collected per protocol
vi. Demonstrates that the non-CRF data file passes conformance checks and is available for processing

b. Potential approaches/options (if test data are relevant, think about including end users and their requirements)
i. Test data can be created (i.e., "dummy data") to conform to the file specifications provided. However, when possible, it is generally preferable that the test data comprise actual study data to ensure compliance with specifications
ii. If using automated transfers, including monitoring, testing this part should also be considered
iii. It is also recommended to use the same process on test and production files to ensure there are no gaps and/or issues (e.g., the same process for receiving, retrieving, storing, and checking the file, using the same scripts, programs, and/or tools)
iv. In addition, both provider and sponsor should use robust test data that emphasize real production scenarios as much as possible
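The conformance checks that a test transfer enables could look like the following sketch, which validates a file's name against the agreed convention and its header against the required variables. The naming convention and column names are illustrative assumptions.

```python
import csv
import io
import re

# Sketch of sponsor-side checks on a test transfer file before approving
# production transfers. The convention and columns are assumptions.

FILE_NAME_RE = re.compile(r"^STUDY\d+_[A-Z]+_\d{8}\.csv$")
REQUIRED_COLUMNS = {"SUBJID", "VISIT", "COLLECTION_DATE"}

def validate_test_file(file_name: str, content: str):
    """Return a list of human-readable findings; an empty list means pass."""
    findings = []
    if not FILE_NAME_RE.match(file_name):
        findings.append(f"file name '{file_name}' violates naming convention")
    header = next(csv.reader(io.StringIO(content)), [])
    missing = REQUIRED_COLUMNS - set(header)
    if missing:
        findings.append(f"missing required columns: {sorted(missing)}")
    return findings

print(validate_test_file("STUDY123_PK_20240101.csv",
                         "SUBJID,VISIT,COLLECTION_DATE\n001,V1,20240101\n"))
# -> []
```

Running the same validation on both test and production files supports the recommendation above to keep the two processes identical.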

5. Using a Secure Transfer Method

a. Benefits
i. Required as part of a validated system
ii. Maintains consistency in the file exchange process for all vendors
iii. Ensures the security and data integrity of proprietary data
iv. Complies with laws and regulations
v. Helps ensure sensitive or unblinded data are accessed only by the appropriate personnel, via access controls/permissions in the secure transfer method

b. Potential approaches/options
i. Use a secure data transfer method that works for all vendors
1. Consider encryption methods during data transfer, and at rest if data are ultimately stored in cloud-based services
2. Consider having and maintaining a read-only copy of the file with read-only access
3. It may be necessary to have a back-up option for vendors that cannot use the standard process (e.g., CD or flash drive)
ii. If possible, think about using one method to exchange files for all vendors. Using many different vendor web portals, FTP servers, etc., may increase confusion and errors
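The transfer mechanism itself (sFTP, managed file transfer, vendor portal) is product-specific, but the data integrity aspect above can be illustrated with a checksum comparison. This is a minimal sketch assuming the sender supplies a SHA-256 checksum alongside each file; the file names are hypothetical.

```python
import hashlib
import tempfile
from pathlib import Path

# Sketch of a transfer-integrity step: compare a sender-provided SHA-256
# checksum against one computed on receipt. The transfer mechanism itself
# is out of scope here.

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(path: Path, sender_checksum: str) -> bool:
    """True if the received file matches the checksum the sender provided."""
    return sha256_of(path) == sender_checksum

# Demonstration: a received file verifies against its own checksum.
received = Path(tempfile.mkdtemp()) / "STUDY123_PK_20240101.csv"
received.write_bytes(b"SUBJID,PKCONC\n001,1.23\n")
checksum = sha256_of(received)
print(verify_transfer(received, checksum))  # -> True
```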

6. Data Review, Cleaning and Reconciliation

a. Benefits
i. Ensures data integrity and completeness

b. Potential approaches/options
i. Consider creating a non-CRF Data Handling Plan template with a section for each non-CRF data type; this template can be used at the study level and modified as needed, so the plan can define what to expect and how to handle unexpected scenarios for a specific study
ii. Set up standard checks to be performed on the data transfer files for non-CRF data (e.g., creating checks by therapeutic area and/or data type, such as PK, safety, and biomarker)
iii. Set up standard reports by therapeutic area and/or data type. Medical monitors and/or bioanalysis groups may need to review the actual data and provide feedback/assessments. Keep in mind that some functional roles may be blinded or unblinded depending on the company's organizational structure, so think carefully about this approach
iv. Think about adding conditional and data integrity checks along with formatting checks (e.g., check that the date of visit is within the window expected from the protocol and/or against the visit date in the electronic data capture (EDC) system; have checks that identify file records that may be missing or overdue based on the pre-defined schedule of events)
v. Consider requesting an inventory file from vendors that confirms what has been transferred (e.g., an inventory notification with the size of the file, number of records, etc.)
vi. Think about defining a process with the vendor for how to handle missing records, how to manage changes if a file is incremental, or whether the file will be cumulative
vii. Perform ongoing non-CRF discrepancy management. This may include executing the check file and conducting data content checks every time a file is received (i.e., test and production) to check for consistency and identify existing and/or new file and data discrepancies. Here are some examples of points to consider when reviewing and reconciling data:
1. Received data are compliant with the specifications
2. A complete set of data was received (all patients and their visits expected per protocol, where possible); this should be performed for each data transfer by the study group (DM responsible)
3. Check for duplicate and obsolete records
4. Ensure data values match when more than one source collects the same information (e.g., subject identifiers, date of visit, subject initials, birth date, dates of collection, PK records and samples received in the data transfer file compared to data in the EDC system)
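Two of the reconciliation points above, flagging duplicate records and checking the received record count against the vendor's inventory file, can be sketched as follows. The record layout and key fields are illustrative assumptions.

```python
# Sketch of two reconciliation checks: duplicate-record detection and
# record-count reconciliation against a vendor inventory file.
# Record layout and key fields are illustrative assumptions.

def find_duplicates(records, key_fields=("SUBJID", "VISIT")):
    """Return the keys that appear more than once in the transfer."""
    seen, dups = set(), set()
    for rec in records:
        key = tuple(rec[k] for k in key_fields)
        if key in seen:
            dups.add(key)
        seen.add(key)
    return sorted(dups)

def matches_inventory(records, inventory_count):
    """True if the transfer holds the number of records the vendor declared."""
    return len(records) == inventory_count

records = [
    {"SUBJID": "001", "VISIT": "V1"},
    {"SUBJID": "001", "VISIT": "V2"},
    {"SUBJID": "001", "VISIT": "V2"},  # duplicate record
]
print(find_duplicates(records))       # -> [('001', 'V2')]
print(matches_inventory(records, 2))  # -> False
```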

viii. Perform ongoing non-CRF discrepancy management. This may consist of outlining how queries get resolved (who processes the queries and what happens during the resolution process)
ix. Think about clearly defining the roles and responsibilities for the tasks of data review, cleaning, and reconciliation. This ensures all stakeholders are clear on their duties and that nothing is left out or duplicated
x. Identify timelines with each vendor. Clearly discuss and document the frequency with which transfers are expected, how long the reconciliation process will take after each transfer, when the vendor should expect data reconciliation feedback after each transfer, and how long the vendor has to make requested corrections/updates before the next transfer is delivered

Acknowledgements

Thank you to the following non-CRF team members from the eSource Initiative who contributed to this document: Aleny Caban (Medical & Development - Data Science, Astellas Pharma, Northbrook, IL, USA), Mari Clovis (Clinical Data Integration and Technology, Bristol-Myers Squibb, Hopewell, NJ, USA), William Kesil (Clinical Data Management, Roche, New York, NY, USA), Michelle M. Lewis (Clinical Trial Solutions Center of Excellence, Pfizer Inc., Groton, Connecticut, USA), Patricia McIntire (Clinical Development & Operations, Pfizer Inc., New York, New York, USA), Gerardo Ortiz (Safety and Biomarkers Laboratory, Pfizer Clinical Research Unit, New Haven, CT, USA), Suma Suresh (Clinical Trial Solutions Center of Excellence, Pfizer Inc., Groton, Connecticut, USA). In addition, the non-CRF team members gratefully thank all of the TransCelerate BioPharma Member Companies, which provided helpful input to this document.

References

1. TransCelerate BioPharma. Optimizing the Use of Electronic Data Sources in Clinical Trials, The Landscape, Part 1. http://journals.sagepub.com/doi/pdf/10.1177/2168479016670689. Accessed June 2017.