Cox Model With Intermittent and Error-Prone Covariate Observation Yury Gubman PhD thesis in Statistics Supervisors: Prof. David Zucker, Prof. Orly Manor
Introduction • Regression analysis of right-censored survival data commonly arises in many fields, especially in medical science. • The most popular survival data regression model is the Cox (1972) proportional hazards model, in which the hazard function for an individual with covariate vector X is modeled as λ(t | X) = λ0(t)·exp(β'X), where λ0(t) is an unspecified baseline hazard and β is the vector of regression coefficients.
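A minimal sketch of this relationship in Python (not from the thesis; the baseline hazard and the coefficient and covariate values below are arbitrary placeholders):

```python
import numpy as np

def cox_hazard(t, x, beta, baseline_hazard):
    """Proportional hazards model: lambda(t | X) = lambda0(t) * exp(beta' X)."""
    return baseline_hazard(t) * np.exp(np.dot(beta, x))

# Toy example: constant baseline hazard and a two-dimensional covariate vector.
rate = cox_hazard(t=1.0,
                  x=np.array([70.0, 1.0]),
                  beta=np.array([-0.04, 0.3]),
                  baseline_hazard=lambda t: 0.05)
```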
Introduction • In many cases, the covariate X is not measured exactly. Instead of X, we observe a surrogate measure W, which is subject to error. Measurement error in the covariates has three main effects: • it causes bias in parameter estimation for statistical models; • it leads to a loss of power, sometimes profound, for detecting covariate effects; • it masks the features of the data, making graphical model analysis difficult. • In addition, although in theory the covariate is a continuous process in time, in practice measurements are taken only over a discrete grid of timepoints (every 6 months, once a year, etc.). This discrepancy may also lead to bias.
Introduction • The classical measurement error model for individual i and measurement j can be written as Wij = Xi(τj) + εij, where the measurement error term εij is independent of Xi with zero mean given Xi. Independence across i is also assumed. • Let n be the number of individuals and J the number of observations (we assume that J is the same for all individuals), and suppose the measurements of the surrogate covariate are taken at timepoints τ1 < τ2 < … < τJ.
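An illustration of the classical error model (a sketch only; the error standard deviation and the toy covariate paths are assumptions, not values from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

n, J = 300, 12                     # individuals and measurements per individual
tau = np.arange(J) * 0.5           # observation grid tau_1, ..., tau_J (half-yearly)
sigma_e = 5.0                      # assumed measurement-error standard deviation

# Toy true covariate paths: a linear trend plus individual level shifts.
X = 72.0 - 5.0 * tau + rng.normal(0.0, 10.0, size=(n, 1))
eps = rng.normal(0.0, sigma_e, size=(n, J))   # zero mean, independent of X and across i, j
W = X + eps                                    # observed surrogate: W_ij = X_i(tau_j) + eps_ij
```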
Introduction • Tsiatis and Davidian (2004) suggested a so-called joint model, in which the models for the event-time distribution and the longitudinal data depend on a common set of latent random variables. • Andersen and Liestol (2003) proposed a simple approach based on the regression calibration idea. The bias due to observing the longitudinal covariate process over a discrete grid is handled by introducing additional variables into the standard Cox model, while measurement error is treated using an external procedure. • Most of the proposed approaches are limited to special cases and/or specific distributional assumptions.
Proposed Method • We propose an approach that is of intermediate complexity relative to the simple approach of Andersen and Liestol and the joint modeling approach. • We assume an additive model for the measurement error. The error term εij is independent across i and j, and independent of all other random variables in the model. We do not assume a specific parametric distribution for εij. • We assume a working parametric model F for the conditional distribution of Xi(t2) given Xi(t1), t2 > t1. • Note that we do not assume that the data are actually distributed according to F, but rather that F is a close enough approximation to yield reasonable estimators of the Cox coefficients.
Proposed Method • Assume that the moment generating function (MGF) of F exists and is well defined. • To cover a variety of cases, flexible distributions may be used, such as the Semi-Nonparametric (SNP) distribution of Gallant and Nychka (1987).
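As the simplest concrete case (a normal working model rather than the SNP distribution actually used in the thesis), the MGF is available in closed form; the sketch below checks it against a Monte Carlo average, with assumed parameter values:

```python
import numpy as np

def normal_mgf(s, mu, sigma2):
    """MGF of a N(mu, sigma2) working model: E[exp(s X)] = exp(mu*s + sigma2*s**2/2)."""
    return np.exp(mu * s + 0.5 * sigma2 * s ** 2)

beta = -0.04                  # the hazard approximation needs the MGF evaluated at s = beta
mu, sigma2 = 70.0, 25.0       # assumed parameters of F at some timepoint
draws = np.random.default_rng(1).normal(mu, np.sqrt(sigma2), size=200_000)
print(normal_mgf(beta, mu, sigma2), np.exp(beta * draws).mean())
```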
Proposed Method • To start with, assume that Wi(t) = Xi(t) (no measurement error). Define the observed data for individual i as (Ti, δi), where Ti is the event time and δi is the event indicator. • It follows that the hazard function may be represented as λi(t) = λ0(t)·E[exp(β'Xi(t)) | observed covariate history up to t, Yi(t) = 1], where Y(t) is the indicator of being at risk at time t.
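The intermediate step behind this representation is the standard induced-hazard argument: condition the Cox hazard on the observed history and average over the unobserved Xi(t). Writing W̄i(t) for the observed covariate history up to t (notation introduced here):

```latex
\lambda_i\bigl(t \mid \bar{W}_i(t)\bigr)
  = \lim_{h \downarrow 0} \frac{1}{h}\,
    P\bigl(t \le T_i < t + h \,\big|\, Y_i(t) = 1,\ \bar{W}_i(t)\bigr)
  = E\!\left[\lambda_0(t)\, e^{\beta^{\top} X_i(t)} \,\Big|\, Y_i(t) = 1,\ \bar{W}_i(t)\right]
  = \lambda_0(t)\, E\!\left[e^{\beta^{\top} X_i(t)} \,\Big|\, Y_i(t) = 1,\ \bar{W}_i(t)\right].
```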
Proposed Method • The conditional expectation above may be approximated by the moment generating function (MGF) of F. It follows that the approximated hazard is given by λi(t) ≈ λ0(t)·M_F(β; θ(t), Wi(s)), where θ(t) is the parameter vector of F estimated at t and s is the nearest timepoint before t at which Wi is available. • We need to evaluate θ(t) at every timepoint t. However, Wi is observed only at the grid timepoints τ1, …, τJ.
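A sketch of the resulting approximated relative risk for one individual, under the hypothetical normal working model from the earlier sketch (theta_at stands in for the fitted working model and is not a function defined in the thesis):

```python
import numpy as np

def approx_relative_risk(beta, t, obs_times, w_obs, theta_at):
    """Approximate E[exp(beta * X(t)) | W(s)] by the MGF of the working model F,
    with s the nearest observed timepoint before t."""
    s_idx = np.searchsorted(obs_times, t, side="right") - 1    # index of nearest s <= t
    mu, sigma2 = theta_at(t, obs_times[s_idx], w_obs[s_idx])   # parameters of F for X(t) given W(s)
    return np.exp(beta * mu + 0.5 * beta ** 2 * sigma2)        # normal MGF evaluated at beta
```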
Proposed Method • The discrete-grid problem is treated by introducing, for each central moment mk of F, a working model of the form mk(t2 | t1) = gk(θk; t1, t2, Wi(t1), Slopehist). • Here gk is some function (chosen for numerical reasons), and Slopehist is the slope of the historical covariate data (before t1). • The θk are estimated using all available data at pairs of observed timepoints τq, τp (τq < τp, conditioning on being at risk at τq), applying the OLS technique.
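A sketch of one possible OLS working model for the conditional first moment only (the regressors, the historical-slope definition, and the linear form of gk are illustrative assumptions; the thesis may use a different gk):

```python
import numpy as np

def fit_first_moment_working_model(W, tau, at_risk):
    """Pool all pairs tau_q < tau_p (with individual i still at risk at tau_q) and
    regress W_i(tau_p) on a constant, W_i(tau_q), the elapsed time, and a crude
    historical slope, estimating theta by ordinary least squares."""
    rows, resp = [], []
    n, J = W.shape
    for i in range(n):
        for q in range(J - 1):
            if not at_risk[i, q]:
                break
            slope_hist = (W[i, q] - W[i, 0]) / (tau[q] - tau[0]) if q > 0 else 0.0
            for p in range(q + 1, J):
                if not at_risk[i, p]:
                    break
                rows.append([1.0, W[i, q], tau[p] - tau[q], slope_hist])
                resp.append(W[i, p])
    theta, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(resp), rcond=None)
    return theta
```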
Proposed Method • Using the estimated coefficients of the working model from the previous slide, the central moments mk(t) can be evaluated at each required timepoint. • Given these estimated moments, the parameters of F can be backsolved, so that θ(t) may be calculated for every t. • Given the above, the MGF is defined at every point, and the hazard can be calculated for every t > s. • The estimator for β is obtained using the Cox partial likelihood, incorporating the proposed hazard function.
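Continuing the sketch above: evaluating the fitted working model at an arbitrary time t to recover the parameters of F. This again assumes a normal F, with the residual variance of the OLS fit standing in for the second central moment; resid_var is a placeholder, not a quantity defined in the thesis:

```python
import numpy as np

def theta_of_t(t, tau, w_i, theta, resid_var):
    """Predict the first moment of X_i(t) from the fitted working-model coefficients
    and return (mu, sigma2) of the normal working model F at time t."""
    q = np.searchsorted(tau, t, side="right") - 1              # last observed timepoint before t
    slope_hist = (w_i[q] - w_i[0]) / (tau[q] - tau[0]) if q > 0 else 0.0
    m1 = theta @ np.array([1.0, w_i[q], t - tau[q], slope_hist])
    return m1, resid_var
```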
Proposed Method • The Cox partial likelihood in this case is obtained by replacing the usual relative risk exp(β'Xi(t)) with its MGF-based approximation: L(β) = ∏_{i: δi=1} [ ri(β, Ti) / Σ_{j ∈ R(Ti)} rj(β, Ti) ], where ri(β, t) denotes the approximated relative risk of individual i at time t and R(t) is the risk set at time t. • The variance of the estimator of β is estimated using a weighted bootstrap approach.
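A sketch of the corresponding negative log partial likelihood, written so that random weights can be plugged in for the weighted bootstrap (rel_risk is the approximated relative risk from the earlier sketches; the perturbation scheme shown in the comment is one common choice, not necessarily the one used in the thesis):

```python
import numpy as np

def neg_log_partial_likelihood(beta, event_time, event_ind, rel_risk, weights=None):
    """Cox partial likelihood with exp(beta' X_i) replaced by the MGF-based
    approximated relative risk rel_risk(i, beta, t). With all-ones weights this
    gives the point estimate; with random positive weights it gives one
    weighted-bootstrap replicate."""
    n = len(event_time)
    w = np.ones(n) if weights is None else weights
    loglik = 0.0
    for i in np.flatnonzero(event_ind):
        t = event_time[i]
        risk_set = np.flatnonzero(event_time >= t)
        num = rel_risk(i, beta, t)
        den = sum(w[j] * rel_risk(j, beta, t) for j in risk_set)
        loglik += w[i] * (np.log(num) - np.log(den))
    return -loglik

# One weighted-bootstrap replicate could, for example, redraw the weights as
# rng.exponential(size=n) and re-minimise this function over beta.
```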
Simulation Study • Data simulation is based on the setting of Andersen and Liestol (2003), patterned after a clinical trial studying the effect of prednisone treatment versus placebo on survival with liver cirrhosis (Christensen, 1986). • The true covariate data are simulated from the model Xi(t) = f(t) + Ai + Ui(t), where f(t) denotes a common trend (a linear function; see the next slide), Ai represents initial variation between individuals, and Ui(t) is a stochastic process representing changes in the covariate over time. • We assume that Ai is normally distributed and that the measurement error is normal with zero mean and constant variance.
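A sketch of this covariate model with a Brownian-motion process for Ui(t) (the standard deviations of Ai, of the BM increments, and of the measurement error are assumptions chosen only to make the sketch run):

```python
import numpy as np

rng = np.random.default_rng(2)
n, J, dt = 300, 12, 0.5
grid = np.arange(J) * dt                              # half-yearly measurement grid

f_t = 72.0 - 5.0 * grid                               # common linear trend f(t)
A = rng.normal(0.0, 10.0, size=(n, 1))                # between-individual variation (SD assumed)
inc = rng.normal(0.0, 3.0 * np.sqrt(dt), size=(n, J)) # BM increments (scale assumed)
inc[:, 0] = 0.0                                       # the process starts at 0 at t = 0
U = np.cumsum(inc, axis=1)                            # Brownian-motion term U_i(t)

X = f_t + A + U                                       # true covariate X_i(t) = f(t) + A_i + U_i(t)
W = X + rng.normal(0.0, 5.0, size=(n, J))             # error-prone observations (error SD assumed)
```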
Simulation Study • Following Andersen and Liestol (2003), we take Ui(t) to be either a Brownian motion (BM) process or an Ornstein–Uhlenbeck (OU) process with correlation parameter 0.282. • The trend term f(t) was taken to be a linear function with an initial level of 72 and a decrease of 5 units per year. • As in the paper, the Cox regression parameter is β = -0.04. • The sample size is 300, with 12 observations per individual taken every half year (total trial length 6 years). • Failure times were simulated from a Weibull hazard. Each result is based on 500 simulation runs.
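Failure times with a time-varying covariate can be drawn by inverting the cumulative hazard on a fine grid; below is a sketch under a Weibull baseline (the shape and scale values and the piecewise-constant treatment of the covariate path are illustrative assumptions, not the settings used in the thesis):

```python
import numpy as np

def simulate_event_time(x_path, grid, beta, shape, scale, rng):
    """Draw one failure time from a Cox model with Weibull baseline hazard
    lambda0(t) = (shape/scale) * (t/scale)**(shape - 1) (shape >= 1 assumed) and a
    piecewise-constant covariate path x_path evaluated on `grid`."""
    dt = np.diff(grid, prepend=0.0)
    lam0 = (shape / scale) * (grid / scale) ** (shape - 1.0)
    cumhaz = np.cumsum(lam0 * np.exp(beta * x_path) * dt)
    target = rng.exponential()                       # equivalent to -log(U), U ~ Uniform(0, 1)
    idx = np.searchsorted(cumhaz, target)
    return grid[idx] if idx < len(grid) else np.inf  # np.inf: no event within the study horizon

# Example usage with a fine grid over the 6-year trial and a constant covariate path:
rng = np.random.default_rng(3)
fine_grid = np.linspace(0.0, 6.0, 601)
t_event = simulate_event_time(np.full(601, 70.0), fine_grid, beta=-0.04,
                              shape=1.5, scale=4.0, rng=rng)
```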
Results: Estimation of β when W(t) is measured without error
Results: Estimation of β when W(t) is measured with error
Results: Variance estimation by weighted bootstrap for the proposed method with SNP approximation
Conclusions • We propose a new semiparametric estimation approach for the Cox regression model when covariates are measured intermittently and with error. The intermittent-measurement issue is handled by modeling the parameters of the distribution of the covariate among individuals still at risk as a function of time; the relative risk is then computed using the MGF. • The accuracy of the proposed estimators depends critically on the form of the OLS working model for in-between times. Increasing the accuracy of the estimates by assuming a more flexible interpolation model is a topic for future research.
Conclusions • In a simulation study we found that in most cases the proposed method provides reasonable estimates of the Cox regression parameter and its standard deviation. • Because the SNP model covers a range of distributional shapes, the method can be applied in a range of settings. • The computational burden is moderate: less than one minute per run for the SNP-based procedure.