Beyond the Assumption of Constant Hazard Rate in Estimating Incidence Rate on Current Status Data with Applications to Phase IV Cancer Trial


1 Beyond the Assumption of Constant Hazard Rate in Estimating Incidence Rate on Current Status Data with Applications to Phase IV Cancer Trial. Deokumar Srivastava, Ph.D., Member, Department of Biostatistics, St. Jude Children's Research Hospital. Society for Clinical Trials, May 17-20, 2015. Joint work with Liang Zhu, Melissa Hudson, Jianmin Pan, and Shesh N. Rai.
2 Outline Motivating Example Statistical Methodology Simulation and Application Conclusion
3 Motivating Example. Hudson et al. (2007), JCO: Noninvasive evaluation of late anthracycline cardiac toxicity in childhood cancer survivors (CCS). CCS treated with anthracyclines and cardiac radiation are at increased risk for late-onset cardiac toxicity. It is not feasible to evaluate patients frequently on a long-term basis to obtain exact onset times; only the current status of each patient is observed, i.e., whether onset occurred prior to the survey time. Risk Groups: At Risk: treated with cardiotoxic therapy. Not at Risk: did not receive cardiotoxic therapy.
4 Description of Data
5 Cardiac Toxicity Measures. Fractional Shortening (FS): left ventricular systolic performance; abnormal if FS < 0.28. Afterload (AF): left ventricular end-systolic wall stress; abnormal if AF > 74 g/cm².
6 Statistical Problem To estimate the cumulative incidence and confidence intervals.
7 Interval Censored Data. Interval-censored data: instead of observing the exact time (T) of the event, it is known only that the event occurred in the interval (L, R]. Current Status Data (Case I ICD): either L = 0 or R = ∞, that is, every observation is either left- or right-censored. General or Case II ICD: at least one observation has both L and R in (0, ∞), that is, at least one finite interval away from zero.
8 General Model. X(t): state occupied by a survivor at time t. T: observation time to death, cardiac failure, or survey. U: time to cardiac abnormality. Three-state model: from "Alive with no cardiac abnormality" a survivor moves to "Alive with cardiac abnormality" with intensity λ1(u), or directly to "Death/Cardiac Failure" with intensity λ2(u); from "Alive with cardiac abnormality" a survivor moves to "Death/Cardiac Failure" with intensity λ3(t − u). The intensities λ1(u), λ2(u), and λ3(t − u) are transition rates.
9 Pseudo Survival Functions. The pseudo survival functions corresponding to λ1(u), λ2(u), and λ3(t − u) are:
Q_i(t) = exp{ −∫_0^t λ_i(v) dv } for i = 1, 2
Q_3(t | u) = exp{ −∫_0^t λ_3(v − u) dv }
Q(t) = exp{ −∫_0^t [λ_1(v) + λ_2(v)] dv } = Q_1(t) Q_2(t)
Q(t) is the probability that the time to the first event (alive with abnormal value, or death with normal value) exceeds t.
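For constant intensities the integrals collapse to λt, and the factorization Q(t) = Q_1(t)Q_2(t) can be checked numerically. A minimal sketch (the intensity values here are hypothetical, chosen only for illustration):

```python
import math

def Q_const(lam, t):
    # Pseudo-survival under a constant intensity lam:
    # Q(t) = exp(-integral_0^t lam dv) = exp(-lam * t)
    return math.exp(-lam * t)

# Hypothetical intensities for illustration only
lam1, lam2, t = 0.15, 0.05, 4.0
Q1 = Q_const(lam1, t)
Q2 = Q_const(lam2, t)

# Overall event-free probability uses the summed intensity ...
Q = math.exp(-(lam1 + lam2) * t)

# ... and factorizes into the two pseudo-survival functions
assert abs(Q - Q1 * Q2) < 1e-12
```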
10 Likelihood Construction. Interest is in estimating Λ_1(t) = ∫_0^t λ_1(u) du, the cumulative incidence function.
11 Likelihood Function. The log-likelihood function is
log L(θ) = Σ_{i=1}^n { a_i log L_1(t_i) + b_i log L_2(t_i) + c_i log L_3(t_i) + d_i log L_4(t_i) }
where a_i = δ_i (1 − γ_i), b_i = (1 − δ_i)(1 − γ_i), c_i = δ_i γ_i, and d_i = (1 − δ_i) γ_i are the indicators corresponding to observation types 1 to 4.
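The indicator algebra above can be sketched directly in code; for every (δ_i, γ_i) combination exactly one of the four observation-type indicators equals 1. A minimal illustration (the function name is ours):

```python
def obs_type_indicators(delta, gamma):
    """Map a subject's (delta_i, gamma_i) pair to the four
    observation-type indicators (a_i, b_i, c_i, d_i) from the
    log-likelihood on the slide."""
    a = delta * (1 - gamma)
    b = (1 - delta) * (1 - gamma)
    c = delta * gamma
    d = (1 - delta) * gamma
    return a, b, c, d

# Each subject falls into exactly one of the four observation types
for delta in (0, 1):
    for gamma in (0, 1):
        assert sum(obs_type_indicators(delta, gamma)) == 1
```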
12 Analysis using Piecewise Exponential Model. Rai et al. (2013) conducted the analysis assuming exponential and piecewise exponential distributions. λ_1 was assumed piecewise exponential with parameter λ_11 for t < t_c (2 years or 5 years) and λ_12 for t ≥ t_c. One has to know the number and location of the change points. One has to assume that the intensity function is constant within each period. These assumptions are hard to justify in practice, and an alternative approach free of them would be preferable.
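Under a piecewise-constant intensity, the cumulative hazard is a sum of rate-times-duration terms, and one natural estimate of the cumulative incidence is 1 − exp(−Λ_1(t)). A minimal sketch with a single change point, using the λ_11 = 0.15, λ_12 = 0.4, t_c = 2 values from the simulation setting (the function name is ours):

```python
import math

def cum_incidence_pw(t, lam11, lam12, t_c):
    """Cumulative incidence 1 - exp(-Lambda_1(t)) for a piecewise-
    constant intensity: lam11 on [0, t_c), lam12 on [t_c, infinity)."""
    cum_haz = lam11 * min(t, t_c) + lam12 * max(0.0, t - t_c)
    return 1.0 - math.exp(-cum_haz)

# Parameters from the simulation setting: lam11 = 0.15, lam12 = 0.4, t_c = 2
print(round(cum_incidence_pw(1.0, 0.15, 0.4, 2.0), 3))  # -> 0.139
```

The hazard jumps at t_c, but the cumulative incidence itself stays continuous and increasing, which the sketch makes easy to verify numerically.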
13 Analysis using Weibull Model. For the special case λ_2 = λ_3 = 0 and a_i = c_i = 0, the log-likelihood based on the Weibull distribution can be written as
log L(λ_1, α_1) = Σ_{i=1}^n b_i ( −λ_1 t_i^{α_1} ) + Σ_{i=1}^n d_i log{ 1 − exp( −λ_1 t_i^{α_1} ) }
The maximum likelihood estimates are obtained by setting the derivatives of the log-likelihood with respect to the parameters to zero and solving the resulting likelihood equations.
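This log-likelihood is easy to code directly; in place of solving the score equations analytically, a crude grid search over (λ_1, α_1) can stand in as a sketch. The dataset and the grid ranges below are hypothetical, chosen only to make the example self-contained:

```python
import math

def loglik_weibull(lam, alpha, times, d):
    """Current-status Weibull log-likelihood for the special case
    lam2 = lam3 = 0: a subject still normal at t_i (d_i = 0)
    contributes -lam * t_i**alpha; a subject already abnormal
    (d_i = 1) contributes log(1 - exp(-lam * t_i**alpha))."""
    ll = 0.0
    for t, di in zip(times, d):
        h = lam * t ** alpha
        ll += math.log1p(-math.exp(-h)) if di else -h
    return ll

# Tiny hypothetical dataset: survey times and abnormality indicators
times = [1.0, 2.0, 3.0, 4.0, 5.0]
d = [0, 0, 1, 1, 1]

# Crude grid search in place of solving the likelihood equations
ll, lam_hat, alpha_hat = max(
    (loglik_weibull(l / 100, a / 10, times, d), l / 100, a / 10)
    for l in range(1, 100) for a in range(5, 30)
)
```

In practice one would use a numerical optimizer (e.g. Newton-type methods on the score equations) rather than a grid, but the grid keeps the sketch dependency-free.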
14 Simulation Study. Setting I: Data generated from a piecewise exponential distribution; estimates obtained using both piecewise exponential and Weibull distributions. Setting II: Data generated from two different Weibull distributions, mimicking a slowly increasing hazard and a rapidly increasing hazard; estimates of CI again obtained using Weibull and piecewise exponential distributions. Setting III: Data generated from a piecewise exponential distribution as in Setting I, but the change point was assumed to be at t − 1, t, and t + 1 years.
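Generating current-status data from a piecewise exponential distribution can be done by inverse transform: draw a unit-exponential variate and invert the piecewise cumulative hazard. A sketch of one replicate of Setting I (the follow-up time, uniform survey times, and function names are our assumptions for illustration):

```python
import random

def draw_pw_exponential(lam11, lam12, t_c, rng):
    """Event time with hazard lam11 before t_c and lam12 after,
    via inverse transform of the piecewise cumulative hazard."""
    e = rng.expovariate(1.0)          # unit-exponential variate
    if e < lam11 * t_c:
        return e / lam11              # event occurs before the change point
    return t_c + (e - lam11 * t_c) / lam12

rng = random.Random(1)
lam11, lam12, t_c, follow_up = 0.15, 0.4, 2.0, 10.0

data = []
for _ in range(400):                  # n = 400 as in the simulation
    u = draw_pw_exponential(lam11, lam12, t_c, rng)
    t = rng.uniform(0.0, follow_up)   # survey (current-status) time
    data.append((t, int(u <= t)))     # keep only t and the status indicator
```

Only the survey time and the binary status survive into the dataset, which is exactly what makes the data current-status rather than exactly observed.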
15 Performance of Weibull Model for data generated from a piecewise exponential distribution with λ_11 = 0.15 and λ_12 = 0.4 (n = 400). [Table: for a 10-year follow-up with change points at 2 years and 5 years, the true CI and the piecewise exponential and Weibull estimates (with SEs) are reported at each year; the row values were not recoverable from the transcription.]
16 Performance of Piecewise Exponential with three assumed change points (a), (b), and (c) for data generated from Weibull distributions corresponding to Cases A and B, with 10-year follow-up and sample size n = 400. [Table: true CI, Weibull estimates, and piecewise exponential estimates (a), (b), (c) with SEs, by year, for Cases A and B; the row values were not recoverable from the transcription.]
17 Performance of Weibull Model and Piecewise Exponential Distribution for varying change points when the data is generated from a piecewise exponential distribution with λ_11 = 0.15 and λ_12 = 0.4 (n = 400). [Table: for a 10-year follow-up with change points at 2 years and 5 years, the true CI, the Weibull estimates, and the piecewise exponential estimates (a), (b), (c) with SEs are reported at each year; the row values were not recoverable from the transcription.]
18 Application to AAF data. [Table: cumulative incidence (with SE) by year under nonparametric, Weibull, and two exponential (Exp1, Exp2) models; the values were not recoverable from the transcription.]
19 Application to AAF data
20 Conclusions. An alternative to the piecewise exponential distribution for modeling CI for current status data. Proposed the use of the Weibull distribution, which overcomes the limitations inherent in the piecewise exponential approach: assuming the location of cut points, deciding on the number of cut points, and assuming a constant hazard within each time period. The simulations and applications suggest that the proposed approach is reasonable and provides a viable alternative for modeling current status data.
21 Future Work. 34 observations are missing AF and 6 are missing both AF and FS; an imputation approach could provide more efficient estimates of the CI. A manuscript evaluating the performance of the imputation is currently under review at Statistical Methods in Medical Research. AF is often evaluated as a continuous measure while FS is evaluated as a binary measure (normal/abnormal); the two are correlated, so joint modeling is appropriate and needed. We are working on a manuscript that compares Bayesian and frequentist approaches for evaluating treatment effects with mixed endpoints.
22 Thank You!