Study Data Technical Conformance Guide (TCG)


Study Data Technical Conformance Guide (TCG) Webinar Series 2017 Center for Biologics Evaluation and Research (CBER) Center for Drug Evaluation and Research (CDER)

The TCG The TCG provides recommendations on how to submit required standardized study data (October 2017, Version 4.0). The TCG is non-binding, but adherence to its recommendations facilitates regulatory review. The TCG is the guide to use for study data submissions to CBER and CDER. It is prepared by an interdisciplinary team from CBER and CDER and is updated at least twice a year, in March and October.

TCG Webinar Series The webinar series enhances FDA's outreach activities to industry by providing regular updates and clarification on study data standards. It provides FDA with industry feedback through the webinar Q&A session, and it informs FDA on TCG updates as well as topics for future webinars.

Clinical Outcome Assessments: QS Domain (October 2017 TCG Update) Sara Jimenez, PhD, Office of Biostatistics, CDER

Outline
1) Legislative background
2) Clinical outcome assessment (COA) introduction
3) Logically skipped items
4) Importance of data from logically skipped items
5) Conclusion

Legislative Background

Legislative Background The Prescription Drug User Fee Act (PDUFA) VI was signed on August 18, 2017, and took effect on October 1, 2017. A key part of PDUFA VI is the enhanced incorporation of the patient's voice in drug development and decision making.

Legislative Background (cont.) The 21st Century Cures Act was signed on December 13, 2016. Title III of the Act, Subtitle A, is entitled Patient-Focused Drug Development.

COA Introduction

COA Introduction Patient experience data have become more important in the drug development process. Patient experience data can be captured in a COA.

COA Introduction (cont.) A COA is an assessment of a clinical outcome that describes or reflects how an individual feels, functions, or survives. The assessment can be made through report by a clinician, a patient, or a non-clinical observer, or through a performance-based assessment.

COA Introduction (cont.) There are 4 types of COAs:
1) patient-reported outcome (PRO)
2) clinician-reported outcome (ClinRO)
3) observer-reported outcome (ObsRO)
4) performance outcome (PerfO)
This talk focuses on PROs.

COA Introduction (cont.) A PRO is a measurement based on a report that comes directly from the patient about the status of the patient's health condition, without amendment or interpretation of the patient's response by a clinician or anyone else.

COA Introduction (cont.) The outcome can be measured in absolute terms (e.g., severity of a symptom, sign, or state of a disease) or as a change from a previous measure. PRO measures include rating scales and counts of events.

COA Introduction (cont.) PROs are a key part of patient-focused drug development (PFDD). An increasing number of drug submissions have used PROs for efficacy in their primary and secondary endpoints.

COA Introduction (cont.) Examples of PRO instruments are the Patient Global Impression of Severity (PGIS) and the Short Form-36 (SF-36).

Logically Skipped Items

Logically Skipped Items A PRO instrument may have logically skipped items. This occurs when an instrument item is asked conditionally, based on the response to a previous item in the instrument.

Logically Skipped Items (cont.) One PRO instrument that has logically skipped items is the Work Productivity and Activity Impairment Questionnaire: General Health V2.0 (WPAI:GH).

Logically Skipped Items (cont.) [Slides showing the WPAI:GH instrument items, including its logically skipped items.]

Logically Skipped Items (cont.) Another PRO instrument that has logically skipped items is the Patient-Reported Outcomes Version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE).

Logically Skipped Items (cont.) Per the scoring instructions on the National Cancer Institute's website: Conditional branching should be employed for electronic administration of PRO-CTCAE symptom terms that have two or more items.

Logically Skipped Items (cont.) The logic branches from frequency to severity, then to interference. For example, if frequency is greater than never, you next pose the severity question; and if severity is greater than none, you pose the interference question.
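As a rough sketch, the branching described above could be expressed as follows. The function and response labels ("Never", "None") are illustrative, not taken from the PRO-CTCAE electronic specification:

```python
def items_to_administer(frequency, severity=None):
    """Return which PRO-CTCAE-style items to pose for a symptom term
    with frequency, severity, and interference items: pose severity
    only if frequency is greater than "Never", and interference only
    if severity is greater than "None". Hypothetical sketch."""
    items = ["frequency"]
    if frequency != "Never":
        items.append("severity")
        if severity is not None and severity != "None":
            items.append("interference")
    return items
```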

Logically Skipped Items (cont.) [Slides showing the PRO-CTCAE frequency, severity, and interference items.]

Logically Skipped Items (cont.) Data from questionnaires are included in the QS (Questionnaires) SDTM dataset. Per the CDISC SDTM Implementation Guide v3.2, the QS dataset is in the findings dataset class.

Logically Skipped Items (cont.) Data from logically skipped instrument items have been inconsistently included in regulatory submissions, or not included at all. There was a need to create data standards for logically skipped items in PRO instruments. This data standard was defined in the updated October 2017 TCG v4.0.

Logically Skipped Items (cont.) Section 4.1.1.3, SDTM Domain Specifications, in the updated TCG says that data from logically skipped items are to be included in QS as follows:

Logically Skipped Items (cont.) Some items in an instrument may be logically skipped per the instrument's instructions. Responses for logically skipped items should be 1) recorded and/or scored according to the instructions provided in the instrument's user manual, scoring manual, or other documentation provided by the instrument developer, and 2) included in the submission dataset.

Logically Skipped Items (cont.) Case #1: If instructions on how to record and/or score responses to logically skipped items are available from the instrument developer, then records for logically skipped items should be included in the submission dataset with the following:

Logically Skipped Items (cont.)
QSSTAT = NOT DONE
QSREASND = LOGICALLY SKIPPED ITEM
QSORRES, QSSTRESC, and QSSTRESN would be assigned according to the instrument's instructions.

Logically Skipped Items (cont.) Case #2: If instructions on how to record and/or score responses to logically skipped items are not available from the instrument developer, then records for logically skipped items should be included in the submission dataset with the following:

Logically Skipped Items (cont.)
QSSTAT = NOT DONE
QSREASND = LOGICALLY SKIPPED ITEM
QSORRES, QSSTRESC, and QSSTRESN all set to null.
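The two cases can be sketched as record construction. This is only an illustration of the convention above: a real QS record carries many more variables (STUDYID, USUBJID, QSTESTCD, VISIT, and so on), and the test code "ITEM05" is hypothetical:

```python
def qs_logically_skipped(base, assigned_score=None):
    """Build a QS record for a logically skipped item.

    Case #1 (developer provides scoring instructions): pass the
    assigned score in assigned_score. Case #2 (no instructions):
    omit it, and the result variables stay null. Sketch only."""
    rec = dict(base)
    rec["QSSTAT"] = "NOT DONE"
    rec["QSREASND"] = "LOGICALLY SKIPPED ITEM"
    if assigned_score is None:
        rec["QSORRES"] = rec["QSSTRESC"] = rec["QSSTRESN"] = None
    else:
        rec["QSORRES"] = rec["QSSTRESC"] = str(assigned_score)
        rec["QSSTRESN"] = float(assigned_score)
    return rec
```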

Logically Skipped Items (cont.) Records from logically skipped instrument items are to be included in SDTM QS to allow for traceability to the source data (i.e., CRF data, ediary data). Data capture mechanisms may need to be modified to allow for the collection of records for logically skipped items.

Logically Skipped Items (cont.) We strongly recommend that records from logically skipped items in ADaM datasets be carried forward from the corresponding records in QS.

Logically Skipped Items (cont.) From TCG Section 8.3.1: An important component of a regulatory review is an understanding of the provenance of the data (i.e., traceability of the sponsor's results back to the CRF data). If the reviewer is unable to trace study data from the data collection of subjects participating in a study to the analysis of the overall study data, then the regulatory review of a submission may be compromised.

Importance of Data from Logically Skipped Items

Importance of Data from Logically Skipped Items Per 21 CFR 314.126, an instrument must be well-defined and reliable: "(b) An adequate and well-controlled study has the following characteristics: ... (6) The methods of assessment of subjects' response are well-defined and reliable."

Importance of Data from Logically Skipped Items (cont.) This includes being able to distinguish between missing values and logically skipped items in an instrument, since a missing value for an item is different from the item being logically skipped.

Importance of Data from Logically Skipped Items (cont.) If that distinction cannot be made, the submitted data are not accurate, and the sponsor is not in compliance with what the FDA requires.

Importance of Data from Logically Skipped Items (cont.) Thus, it is critical for a sponsor to be able to distinguish between logically skipped items in an instrument and items with truly missing values. Those records should be distinguishable in the submitted QS dataset per the data standards defined in TCG Section 4.1.1.3, as well as in corresponding ADaM datasets.
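A reviewer-side check for this distinction could look like the sketch below, assuming the TCG convention that skipped items carry QSREASND = "LOGICALLY SKIPPED ITEM" (record fields beyond those two variables are omitted for brevity):

```python
def classify_qs_record(record):
    """Classify a QS-like record as observed, logically skipped,
    or truly missing. Checks QSREASND first, since Case #1 skipped
    items may carry a developer-assigned result. Sketch only."""
    if record.get("QSREASND") == "LOGICALLY SKIPPED ITEM":
        return "logically skipped"
    if record.get("QSSTRESC") in (None, ""):
        return "truly missing"
    return "observed"
```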

Importance of Data from Logically Skipped Items (cont.) Case #1: There are numerous methods for handling missing endpoint data in a clinical trial. Those missing data methods can directly impact a clinical trial's efficacy endpoint results, as well as sensitivity analysis results.

Importance of Data from Logically Skipped Items (cont.) Case #2: Issues can arise if data collected from an instrument have many missing values. Sometimes patients fail to report for visits, fail to complete questionnaires, or withdraw from a clinical trial before its planned completion.

Importance of Data from Logically Skipped Items (cont.) Those missing data can introduce bias and interfere with the ability to compare effects in the test group with the control group, because only a subset of the initially randomized population contributes. These patient groups may no longer be comparable.

Importance of Data from Logically Skipped Items (cont.) Case #3: Item-level analyses are conducted during PRO instrument development. During instrument development, it is necessary to know whether items have truly missing values or are logically skipped.

Conclusion

Conclusion PROs have become a key part of PFDD. An increasing number of drug submissions have used PROs for efficacy in their primary and secondary endpoints.

Conclusion (cont.) Recently passed legislation, as well as FDA regulations, call for patient experience data that are well-defined and reliable. Those well-defined and reliable data should be included in submissions to the FDA. The FDA will expect inclusion of records from logically skipped instrument items as a data standard.

Thank you

SEND TCG Updates Elaine Thompson, PhD, Sr. Staff Fellow, FDA CDISC-SEND Liaison, CDER/Office of Computational Sciences

Major SEND TCG Updates
Study Type Section
SEND Domain Specifications: general considerations, MI Domain, CL Domain, PC Domain, Custom Domains
Tumor Datasets
Legacy Data

Section 4.1.3.1 Study Types The Standard for Exchange of Nonclinical Data (SEND) provides the organization, structure, and format of standard nonclinical (animal toxicology study) tabulation datasets for regulatory submission. The SEND Implementation Guide (SENDIG) v3.0 supports single-dose general toxicology, repeat-dose general toxicology, and carcinogenicity studies. SENDIG v3.1 additionally supports respiratory and cardiovascular safety pharmacology studies. The addition of SEND 3.1 to the Data Standards Catalog was announced in the Federal Register in August 2017, and sunset dates for SEND 3.0 were also established. The SEND 3.1 requirement includes respiratory and cardiovascular safety pharmacology studies.

Section 4.1.3.2 Definitions Sponsors should use the VISITDY or --NOMDY variable appropriate to the selected SENDIG version if findings that were intended to be analyzed together were collected across multiple study days. SENDIG 3.1 uses the --NOMDY variable to group measurements collected on grace days. For SEND 3.0, continue to use VISITDY to indicate grouping; for SEND 3.1, use --NOMDY.
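The version-dependent choice of grouping variable can be sketched as a small helper; the version strings and the startswith check are illustrative assumptions, not part of the SENDIG:

```python
def grouping_variable(sendig_version, domain):
    """Pick the nominal-day grouping variable per the note above:
    VISITDY for SENDIG 3.0, the domain-prefixed --NOMDY variable
    for SENDIG 3.1 (e.g. LBNOMDY for the LB domain). Sketch only."""
    if sendig_version.startswith("3.0"):
        return "VISITDY"
    return domain.upper() + "NOMDY"
```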

MI Domain When histopathology severity data are collected on a severity scale that cannot be represented using the CDISC MISEV codelist without a loss of scientific accuracy (e.g., data were collected on 3 levels or 4 levels but MISEV specifies 5 levels), severity scores may be represented in MISEV as "1 of 4", "2 of 4", or "1 of 3" as appropriate, where the first number is the score and the second is the number of available severities in the scale. A score of 1 should be the least severe finding. Extend the non-extensible MISEV codelist with the necessary terms to describe the alternative severity scores, include these extended values in the define.xml and nsdrg, and explain any resulting validation error(s) in the nsdrg. CDISC provides a 5-level severity scale: MINIMAL, MILD, MODERATE, MARKED, SEVERE. Some organizations have trouble mapping data collected on a 6-level or 4-level scale into the CDISC 5-level scale. When data are difficult to map, this provides an alternative.

MI Example

Study Report:
Severity   Control   20 mg/kg
Minimal    1         2
Slight     0         3
Mild       0         5

SEND Problem:
Severity   Control   20 mg/kg
MINIMAL    1         2
MILD       0         3
MODERATE   0         5

New TCG Alternative:
Severity   Control   20 mg/kg
1 of 6     1         2
2 of 6     0         3
3 of 6     0         5
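The alternative representation in the example can be sketched as a formatting helper over the study's 6-level collection scale (the scale list below is inferred from the example and is an assumption):

```python
# 6-level collection scale assumed from the MI example above.
SIX_LEVEL_SCALE = ["Minimal", "Slight", "Mild", "Moderate", "Marked", "Severe"]

def alternative_misev(term, scale=SIX_LEVEL_SCALE):
    """Format a collected severity as the TCG alternative "n of m"
    string, where n is the score (1 = least severe) and m is the
    number of severities in the collection scale. Sketch only."""
    score = scale.index(term) + 1  # raises ValueError for unknown terms
    return f"{score} of {len(scale)}"
```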

CL Domain The information in CLTEST and CLSTRESC, along with CLLOC and CLSEV when appropriate, should be structured to permit grouping of similar findings and thus support the creation of scientifically interpretable incidence tables. Differences between the representation in CL and the presentation of clinical observations in the Study Report that impact traceability, to the extent that terms or counts in incidence tables created from CL cannot be easily reconciled to those in the Study Report, should be mentioned in the nsdrg. The section was updated to explain that FDA uses the data for incidence tables. Differences in terms that toxicologists can understand are acceptable (e.g., "seizures" in the Study Report vs. "convulsions observed post dose" in SEND). Differences in numbers of animals should be explained (e.g., 12 animals with seizures in the Study Report vs. 15 in SEND).
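The grouping that underlies such an incidence table could be sketched as below. The record layout (dicts keyed by USUBJID, ARMCD, CLSTRESC) is a simplifying assumption; a fuller version would also group on CLLOC and CLSEV:

```python
from collections import defaultdict

def cl_incidence(records):
    """Count distinct animals per (finding, dose group) from CL-like
    records -- the kind of grouping an incidence table is built from.
    Sketch only."""
    animals = defaultdict(set)
    for rec in records:
        animals[(rec["CLSTRESC"], rec["ARMCD"])].add(rec["USUBJID"])
    return {key: len(subjects) for key, subjects in animals.items()}
```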

PC Domain If the nominal times are provided in PCELTM, nulls should be avoided for plasma concentrations used to calculate a profile. PCDTC and PCDY variables should be populated with actual/collected information when it is available; however, for GLP single-dose, repeat-dose, or carcinogenicity studies where actual/collected information is documented on paper and not available electronically, these variables may be left null or populated with calculated or nominal dates/times. The use of calculated or nominal dates and times should be mentioned in the nsdrg. FDA recognizes that providing exact plasma collection times in PCDTC can be burdensome. For GLP studies, when it is not feasible to provide PCDTC, the column may be: populated with only the collection date; populated with the collection date and protocol time; or null. PCDY should still be provided and may be imputed.
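The three allowed PCDTC options can be sketched as a composition helper; the function and parameter names are illustrative assumptions:

```python
def pcdtc_value(collection_date=None, protocol_time=None):
    """Compose an ISO 8601 PCDTC value under the options above for a
    GLP study whose exact times exist only on paper: date only,
    date plus nominal protocol time, or null. Sketch only."""
    if collection_date is None:
        return None
    if protocol_time is None:
        return collection_date
    return f"{collection_date}T{protocol_time}"
```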

Custom Domains To provide study data that do not fit into an existing SEND domain, draft SEND domain, or published SDTM domain, consider creating a custom dataset aligned with the Study Data Tabulation Model (SDTM). Questions about custom domains should be addressed in pre-submission meetings and documented in the SDSP. CDISC provides a CoDEx document that describes which data can be confidently exchanged in SEND. Sometimes it is necessary to provide additional data to fully represent a study, particularly for nonclinical efficacy studies. This paragraph provides FDA preferences about how to represent data that are not modeled in the SENDIG.

Tumor Datasets Carcinogenicity studies should include an electronic dataset of tumor findings to allow for a complete review. At this time, sponsors should continue to include the tumor.xpt and associated define.pdf files regardless of whether or not the study is in SEND format (see the tumor.xpt file specification and mappings to the SEND standard available in the SENDIG). When both tumor.xpt and SEND are submitted, the sponsor should ensure that data are traceable between tumor.xpt and the SEND datasets. Any information needed to establish traceability should be presented in the nsdrg. FDA still requires tumor.xpt. Discrepancies between the SEND datasets, the Study Report, and the tumor.xpt dataset are impossible for reviewers to interpret, and thus review cannot be performed. Ensure that there is consistency across all information submitted for carcinogenicity assessment.
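A minimal traceability probe between tumor.xpt and SEND could be sketched as below. Comparing on USUBJID alone, and flagging neoplastic MI records via MIRESCAT = "NEOPLASTIC", are simplifying assumptions for this illustration; a real reconciliation would compare findings, organs, and counts:

```python
def reconcile_tumor_subjects(tumor_rows, mi_rows):
    """Cross-check animals with tumor findings between tumor.xpt-like
    rows and SEND MI rows, reporting subjects present on one side
    only. Sketch under the assumptions stated above."""
    tumor_ids = {row["USUBJID"] for row in tumor_rows}
    mi_ids = {row["USUBJID"] for row in mi_rows
              if row.get("MIRESCAT") == "NEOPLASTIC"}
    return {"only_in_tumor": tumor_ids - mi_ids,
            "only_in_mi": mi_ids - tumor_ids}
```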

Section 8.3.2 Legacy Data For nonclinical studies where data are converted to SEND from a previously established collection system, instances may arise where it is not possible to represent a collected data element as a standardized data element. In these cases, there should be an explanation in the nsdrg as to why certain data elements could not be fully standardized or were otherwise not included in the standardized data submission. As the Study Report should contain a complete representation of the study data in the individual animal listings, no non-standardized electronic study data should be submitted. The PDF Study Report should contain all of the collected data in the Individual Animal Listings. When data elements cannot be provided electronically in SEND, explain this in the nsdrg (Nonclinical Study Data Reviewer's Guide). No non-standardized study data should be submitted unless requested by a review division.

Section 8.3.2 Legacy Data Submission of a Legacy Data Conversion Plan and Report is not expected for nonclinical studies where data were collected in a previously established data collection system. Legacy Data Conversion Plans are not expected for nonclinical studies, and mappings from an established data collection system to SEND are not expected unless requested by a review division.

Summary The TCG has been updated for SEND 3.1, with implementation details for MI, CL, and PC. Custom domains should be discussed pre-submission and documented in the SDSP (Study Data Standardization Plan). Ensure consistency among the SEND data, tumor.xpt, and the Study Report in carcinogenicity submissions. No non-standardized electronic data or Legacy Data Conversion Plans should be included with nonclinical toxicology studies in SEND unless requested by a review division.

Resources:
CFR 21 Part 314
PDUFA VI
FDA PRO Guidance for Industry
BEST (Biomarkers, EndpointS, and other Tools)
FDA COA Qualification Program
FDA TCG
Work Productivity and Activity Impairment Questionnaire: General Health V2.0
NCI PRO-CTCAE
Email any remaining questions to: CDERSBIA@fda.hhs.gov