FDA Study Data Technical Conformance Guide v4.2

November 2018
Helena Sviglin, MPH, Regulatory Information Specialist, Computational Science Center, CDER

Topics Covered in this Webinar
- New content for v4.2: appendices, new parameter codes, new TAUGs, protocol deviations
- Updates: SEND

Revision History

Study Data Technical Conformance Guide v4.2 (October 2018): NEW CONTENT

Appendix B: Trial Summary (TS) Parameters for Submission (Clinical)
- 60 parameters, using CDISC controlled terminology
- These parameters are desired when they make sense for the trial; otherwise, follow the appropriate implementation guide

Appendix C: Trial Summary (TS) Parameters for Submission (Nonclinical)
- 41 parameters, using CDISC controlled terminology
- These parameters are desired when they make sense for the trial; otherwise, follow the appropriate implementation guide

Appendix D: Additional Documents Evaluated by FDA
- The Agency encourages use of the documents listed in this appendix
- Example: Confirmed Data Endpoints for Exchange (CoDEx) for SENDIG v3.0 data

Appendix E: Example Study Data Folder Structure
- Detailed example of file folder structures for nonclinical datasets in both standardized and legacy formats

Appendix E: Example Study Data Folder Structure
- Even when SEND is required and submitted, tumor.xpt should be placed in analysis/dataset/legacy (see Section 4.1.3.3)

Appendix F: Example of Simplified TS Dataset for Clinical Data
- Used to establish the study start date during eCTD validation

Appendix G: Example of Simplified TS Dataset for Nonclinical Data
- Used to establish the study start date during eCTD validation (a sketch follows below)
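Appendices F and G describe only a very small TS structure, so a short sketch may help. The example below is a hypothetical Python/pandas illustration, assuming the simplified dataset is reduced to STUDYID, TSPARMCD, and TSVAL with a single SSTDTC (study start date) record; the study identifier and date are invented, and the actual appendix examples in the TCG remain authoritative.

```python
import pandas as pd

# Minimal sketch of a simplified TS dataset (Appendix F/G style).
# Assumption: only STUDYID, TSPARMCD, and TSVAL are carried, with SSTDTC
# (study start date) as the single parameter; all values are hypothetical.
simplified_ts = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123",      # hypothetical study identifier
            "TSPARMCD": "SSTDTC",      # study start date parameter code
            "TSVAL": "2017-06-01",     # ISO 8601 date used for eCTD validation
        }
    ]
)

print(simplified_ts)
# In practice this table would be written out as ts.xpt (SAS XPORT v5) with an
# appropriate transport-file writer before inclusion in the submission package.
```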

Section 4.1.1.3: New Parameter Codes
- Allows the Agency to understand which Therapeutic Area User Guides (TAUGs), Technical Specifications, Implementation Guides, and Models were utilized for a particular study
- The Agency worked with EVS on these terminology extensions for the variables TSPARM and TSPARMCD
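To make the idea concrete, the sketch below shows how a TS record might document that a particular TAUG was used. The parameter code "STAUG" and all values are hypothetical placeholders, not the actual EVS terminology extensions; the real TSPARMCD/TSPARM terms should be taken from current CDISC controlled terminology.

```python
import pandas as pd

# Hypothetical TS record documenting standards usage for a study.
# "STAUG" is a placeholder parameter code, not the actual EVS extension term;
# consult current CDISC controlled terminology for the real TSPARMCD values.
ts_standards = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123",
            "DOMAIN": "TS",
            "TSSEQ": 1,
            "TSPARMCD": "STAUG",                           # placeholder code
            "TSPARM": "Therapeutic Area User Guide Used",  # placeholder label
            "TSVAL": "Major Depressive Disorder TAUG v1.0",
        }
    ]
)

print(ts_standards)
```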

Section 5.2: Therapeutic Area User Guides (TAUGs), 1 of 2
- Added version numbers to TAUGs
- Added trial summary parameters to capture TAUG use

Section 5.2: Therapeutic Area User Guides (TAUGs), 2 of 2
New TAUGs for the October TCG:
- Major Depressive Disorder Therapeutic Area User Guide v1.0
- Duchenne Muscular Dystrophy Therapeutic Area User Guide v1.0
- Traumatic Brain Injury Therapeutic Area User Guide v1.0
- Vaccines Therapeutic Area User Guide v1.1

Section 4.1.1.3: DV Domain (Protocol Deviations), 1 of 2
- The DV domain should be included in your submission.
- Reviewers will use it to examine protocol deviation trends across study sites in order to facilitate the Bioresearch Monitoring Program (BIMO) clinical investigator site selection process and, once FDA tools are developed to extract and format the needed data from SDTM, to populate line listings used by the Office of Regulatory Affairs (ORA) investigators during inspections.

Section 4.1.1.3: DV Domain (Protocol Deviations), 2 of 2
- In addition to the CDISC-required variables, the following variables should be included in the DV domain when submitting DV data: DVSPID, DVTERM, DVDECOD, DVCAT, DVSCAT, DVSTDTC, DVENDTC, and EPOCH (see the sketch below).
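As an illustration of the requested DV content, the sketch below builds a single hypothetical record in Python/pandas with the CDISC-required identifiers (STUDYID, DOMAIN, USUBJID, DVSEQ, DVTERM) plus the variables called out on the slide; every value is invented.

```python
import pandas as pd

# Hypothetical DV (Protocol Deviations) record with the variables the slide
# requests in addition to the CDISC-required identifiers. Values are invented.
dv = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123",
            "DOMAIN": "DV",
            "USUBJID": "ABC-123-0001",
            "DVSEQ": 1,
            "DVSPID": "PD-0042",                    # sponsor-defined identifier
            "DVTERM": "Visit 3 performed outside protocol window",
            "DVDECOD": "VISIT SCHEDULE DEVIATION",  # hypothetical dictionary term
            "DVCAT": "PROTOCOL DEVIATION",
            "DVSCAT": "MINOR",
            "DVSTDTC": "2017-08-14",
            "DVENDTC": "2017-08-14",
            "EPOCH": "TREATMENT",
        }
    ]
)

print(dv)
```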

Study Data Technical Conformance Guide v4.2 (October 2018): SEND UPDATES
Elaine E. Thompson, PhD, Senior Staff Fellow, High-performance Integrated Virtual Environment (HIVE), Office of Biostatistics and Epidemiology, CBER, FDA

Summary of major SEND updates
- Place tumor.xpt in analysis/dataset/legacy
- Include a simplified ts.xpt with tumor.xpt
- Categorical data in LB should be placed in LBSTRESC
- Detailed instructions for LOQ and --CALCN have been provided
- BG is not required for CDER submissions
- CO should be limited to relevant content
- Use of the CoDEx is encouraged

Appendix E: Example Study Data Folder Structure
- Even when SEND is required and submitted, tumor.xpt should be placed in analysis/dataset/legacy (see Section 4.1.3.3)

Section 4.1.3.3: LB Domain
- Categorical, noncontinuous results reported as incidence counts rather than summary statistics (e.g., mean and standard deviation) should be placed in LBSTRESC, even if the categories are numbers; LBSTRESN should be null. This specifically includes urinalysis tests where the results are values on a scale. For example, if the allowable values for a urine glucose dipstick test are NEGATIVE, 100, 250, 500, 1000, >2000, the results should be placed only in LBSTRESC. Placing categorical results in LBSTRESC allows straightforward creation of incidence tables on LBSTRESC.
- Some tests, such as dipsticks, give categorical results (e.g., a scale of negative, 100, 250, 500, >1000): place all results in LBSTRESC and leave LBSTRESN null (see the sketch below).
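A short sketch of the dipstick case may help; it is a hypothetical Python/pandas illustration (subject, category values, and sequence numbers invented) showing the scale results carried only in LBSTRESC with LBSTRESN left null.

```python
import pandas as pd

# Hypothetical urine glucose dipstick results: the scale values, including the
# numeric-looking ones, go only in LBSTRESC, and LBSTRESN is left null.
lb = pd.DataFrame(
    [
        {"STUDYID": "ABC-123", "DOMAIN": "LB", "USUBJID": "ABC-123-0001",
         "LBSEQ": 1, "LBTESTCD": "GLUC", "LBTEST": "Glucose",
         "LBCAT": "URINALYSIS", "LBORRES": "NEGATIVE",
         "LBSTRESC": "NEGATIVE", "LBSTRESN": None},
        {"STUDYID": "ABC-123", "DOMAIN": "LB", "USUBJID": "ABC-123-0001",
         "LBSEQ": 2, "LBTESTCD": "GLUC", "LBTEST": "Glucose",
         "LBCAT": "URINALYSIS", "LBORRES": "250",
         "LBSTRESC": "250", "LBSTRESN": None},  # numeric-looking category, still character
    ]
)

print(lb[["LBTESTCD", "LBORRES", "LBSTRESC", "LBSTRESN"]])
```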

Section 4.1.3.3: LB Domain
- When a laboratory test result is above or below the limit of quantification (LOQ) for the measurement method and that result was used in the calculation of group means in the study report, the value used for the calculation should be submitted using the supplemental qualifier variable --CALCN.
- It may not be clear what number was used in the study report group means for tests outside the LOQ: place the numbers necessary to reproduce the group means in SUPPLB under QNAM=LBCALCN (see the sketch below).
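The sketch below shows, with invented values, what such a supplemental qualifier record might look like in SUPPLB; the standard SUPP-- structure (STUDYID, RDOMAIN, USUBJID, IDVAR, IDVARVAL, QNAM, QLABEL, QVAL) is assumed, and both the label text and the value used in place of the out-of-LOQ result are hypothetical.

```python
import pandas as pd

# Hypothetical SUPPLB record carrying the number actually used in the study
# report's group-mean calculation for an out-of-LOQ result (QNAM = LBCALCN).
supplb = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123",
            "RDOMAIN": "LB",
            "USUBJID": "ABC-123-0001",
            "IDVAR": "LBSEQ",
            "IDVARVAL": "3",                            # points at the related LB record
            "QNAM": "LBCALCN",
            "QLABEL": "Value Used for Calculation",     # hypothetical label text
            "QVAL": "0.05",                             # value needed to reproduce group means
        }
    ]
)

print(supplb)
```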

Section 4.1.3.3: PC Domain
When a test result is below the LOQ, it should be submitted using the following instructions:
- PCORRES should contain the actual/verbatim test result.
- The value BLQ should be placed in PCSTRESC to signify that the result is below the LOQ; PCSTRESN should be blank.
- Standardized units for the LOQ should be in PCSTRESU, and PCLLOQ should be populated.
- When a numeric value has been assigned to a result below the LOQ for the purpose of group summary statistics, that value should be submitted in SUPPPC as QNAM=PCCALCN so that the group statistics presented in the study report can be reproduced. When a value below the lower LOQ is excluded from group statistics, no PCCALCN entry is needed.
Different implementations of results below the limit of quantitation are complicated to use, so more detail about how to handle <LOQ results has been added (see the sketch below).
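Putting the PC instructions together, the sketch below shows one hypothetical below-LOQ concentration record and its companion SUPPPC entry; the analyte, units, LOQ, label text, and the half-LOQ value assigned for summary statistics are all invented for illustration.

```python
import pandas as pd

# Hypothetical PC record for a result below the LOQ: verbatim result in PCORRES,
# "BLQ" in PCSTRESC, PCSTRESN blank, units in PCSTRESU, and PCLLOQ populated.
pc = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123", "DOMAIN": "PC", "USUBJID": "ABC-123-0001",
            "PCSEQ": 1, "PCTESTCD": "DRUGX", "PCTEST": "Drug X",  # hypothetical analyte
            "PCORRES": "<0.050", "PCORRESU": "ng/mL",
            "PCSTRESC": "BLQ", "PCSTRESN": None, "PCSTRESU": "ng/mL",
            "PCLLOQ": 0.050,
        }
    ]
)

# Companion SUPPPC record giving the number assigned to this below-LOQ result
# for group summary statistics (here LOQ/2, purely as an example convention).
supppc = pd.DataFrame(
    [
        {
            "STUDYID": "ABC-123", "RDOMAIN": "PC", "USUBJID": "ABC-123-0001",
            "IDVAR": "PCSEQ", "IDVARVAL": "1",
            "QNAM": "PCCALCN",
            "QLABEL": "Value Used for Calculation",  # hypothetical label text
            "QVAL": "0.025",
        }
    ]
)

print(pc[["PCTESTCD", "PCORRES", "PCSTRESC", "PCSTRESN", "PCLLOQ"]])
print(supppc[["QNAM", "QVAL"]])
```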

Section 4.1.3.3: BG Domain
- It is not necessary to include a BG domain in CDER submissions.
- CDER is currently only using the BW domain to review changes in body weight.
- Other centers are still evaluating BG.

Section 4.1.3.3: CO Domain
- Comments submitted in the CO domain should be relevant to study interpretation.
- Review software displays all comments submitted, sometimes making it difficult to locate those that are relevant.
- Limit comments to those relevant to the study.

Q&A and Resources
- Links provided: Study Data Resources website, Study Data Technical Conformance Guide v4.2, and a PDF of today's slides
- Additional questions for CDER? Email: cder-edata@fda.hhs.gov
- Additional questions for CBER? Email: cber.cdisc@fda.hhs.gov
- Open Q&A begins shortly; type in your questions now.
- An evaluation and certificate are available, and other resources from CDER Small Business & Industry Assistance are described on its website.