Prospective Accuracy Measures in Joint Models for Longitudinal and Time-to-Event Data
Related papers
Joint modeling of longitudinal and survival data
The joint modelling of longitudinal and survival data has received remarkable attention in the methodological literature over the past decade; however, the availability of software to implement the methods lags behind. The most common form of joint model assumes that the association between the survival and longitudinal processes is underpinned by shared random effects. As a result, computationally intensive numerical integration techniques such as adaptive Gauss-Hermite quadrature are required to evaluate the likelihood. We describe a new user-written command, stjm, which allows the user to jointly model a continuous longitudinal response and the time to an event of interest. We assume a linear mixed effects model for the longitudinal submodel, allowing flexibility through the use of fixed and/or random fractional polynomials of time. Four choices are available for the survival submodel: the exponential, Weibull, or Gompertz proportional hazards models, and the flexible parametric model (stpm2). Flexible parametric models are fitted on the log cumulative hazard scale, which has direct computational benefits as it avoids the use of numerical integration to evaluate the cumulative hazard. We describe the features of stjm through application to a dataset investigating the effect of serum bilirubin level on time to death from any cause in 312 patients with primary biliary cirrhosis.
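The random-effects integral that makes these models computationally expensive can be illustrated with a small Gauss-Hermite quadrature sketch (an illustrative stand-in for the quadrature step, not the stjm implementation; the example likelihood contribution is hypothetical):

```python
import numpy as np

def gh_expectation(f, sigma, n_nodes=15):
    """Approximate E[f(b)] for b ~ N(0, sigma^2) with Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables b = sqrt(2) * sigma * x turns the Gaussian integral
    # into the standard Hermite weighted sum; 1/sqrt(pi) normalizes.
    b = np.sqrt(2.0) * sigma * nodes
    return np.sum(weights * f(b)) / np.sqrt(np.pi)

# Example: the marginal likelihood contribution of one subject whose
# conditional survival likelihood, given random intercept b, is exp(-exp(b)*t).
t = 2.0
marg = gh_expectation(lambda b: np.exp(-np.exp(b) * t), sigma=0.5)
```

In a joint model the integrand would be the product of the longitudinal and survival conditional likelihoods, and "adaptive" quadrature would recenter the nodes at each subject's posterior mode.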
Extensions in the field of joint modeling of correlated data and dynamic predictions improve the development of prognosis research. The R package frailtypack provides estimation of various joint models for longitudinal data and survival events. In particular, it fits models for recurrent events and a terminal event (frailtyPenal), models for two survival outcomes for clustered data (frailtyPenal), models for two types of recurrent events and a terminal event (multivPenal), models for a longitudinal biomarker and a terminal event (longiPenal), and models for a longitudinal biomarker, recurrent events, and a terminal event (trivPenal). The estimators are obtained using standard and penalized maximum likelihood approaches; each model function supports goodness-of-fit analyses and provides plots of the baseline hazard functions. Finally, the package provides individual dynamic predictions of the terminal event and evaluation of predictive accuracy. This paper presents the theoretical models and estimation techniques, applies the methods to prediction, and illustrates the frailtypack functions in detail with examples.
Bayesian Approach for Joint Longitudinal and Time-to-Event Data with Survival Fraction
Many medical investigations generate both repeatedly-measured (longitudinal) biomarker and survival data. A complex issue arises when investigating the association between longitudinal and time-to-event data when there are cured patients in the population, which leads to a plateau in the survival function S(t) after sufficient follow-up. Thus, the usual Cox proportional hazards model [11] is not applicable, since the proportional hazards assumption is violated. An alternative is to consider survival models incorporating a cure fraction. In this paper, we present a new class of joint model for univariate longitudinal and survival data in the presence of a cure fraction. For the longitudinal model, a stochastic integrated Ornstein-Uhlenbeck process will be presented, and for the survival model a semiparametric survival function will be considered which accommodates both zero and non-zero cure fractions of the dynamic disease progression. Moreover, we consider a Bayesian approach, which is motivated by the complexity of the model. Posterior and prior specification needs to accommodate parameter constraints due to the non-negativity of the survival function. A simulation study is presented to evaluate the performance of the proposed joint model.
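The cure-fraction idea — a survival function that plateaus at a positive value rather than decaying to zero — can be sketched as a mixture model. The Weibull choice for the uncured group below is purely illustrative; the paper uses a semiparametric survival function:

```python
import numpy as np

def cure_survival(t, pi, scale=1.0, shape=1.0):
    """Mixture cure survival S(t) = pi + (1 - pi) * S0(t).

    pi is the cure fraction; pi = 0 recovers an ordinary survival model.
    S0 here is a Weibull survival function for the uncured (illustrative).
    """
    s0 = np.exp(-(t / scale) ** shape)
    return pi + (1.0 - pi) * s0

# The curve starts at 1 and plateaus at the cure fraction as t grows,
# which is exactly the behavior that violates proportional hazards.
s = cure_survival(np.array([0.0, 1.0, 50.0]), pi=0.3)
```

Because S(t) never drops below pi, the hazard of the mixture vanishes at late times, so no proportional-hazards specification can hold against a standard (zero cure fraction) model.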
A modified two-stage approach for joint modelling of longitudinal and time-to-event data
Journal of Statistical Computation and Simulation, 2018
Joint models for longitudinal and time-to-event data have been applied in many different fields of statistics and clinical studies. However, the main difficulty these models face is computational: the requirement for numerical integration becomes severe as the dimension of the random effects increases. In this paper, a modified two-stage approach is proposed to estimate the parameters in joint models. In particular, in the first stage, linear mixed-effects models and best linear unbiased predictors are applied to estimate parameters in the longitudinal submodel. In the second stage, an approximation of the full joint log-likelihood is proposed using the estimated values of these parameters from the longitudinal submodel. Survival parameters are estimated by maximizing this approximation of the full joint log-likelihood. Simulation studies show that the approach performs well, especially when the dimension of random effects increases. Finally, we implement this approach on AIDS data.
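A toy version of the two-stage idea can be sketched under strong simplifying assumptions (random intercept only, known variance components, exponential survival, no censoring; all names and parameter values are hypothetical, not the paper's setup):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Simulated data: random-intercept biomarker; exponential survival whose
# log-hazard is alpha * (subject-level random intercept).
n, m = 200, 5
sigma_b, sigma_e, alpha_true = 1.0, 0.5, 0.7
b = rng.normal(0.0, sigma_b, n)                     # true random intercepts
y = b[:, None] + rng.normal(0.0, sigma_e, (n, m))   # repeated measurements
T = rng.exponential(1.0 / np.exp(alpha_true * b))   # event times (rate = e^{ab})

# Stage 1: BLUPs of the random intercepts (shrunken subject means),
# treating the variance components as known for simplicity.
shrink = sigma_b**2 / (sigma_b**2 + sigma_e**2 / m)
b_hat = shrink * y.mean(axis=1)

# Stage 2: maximize the approximate survival log-likelihood in alpha,
# plugging in b_hat as if it were a fixed covariate.
def neg_loglik(alpha):
    log_h = alpha * b_hat
    return -np.sum(log_h - np.exp(log_h) * T)

alpha_hat = minimize_scalar(neg_loglik, bounds=(-5.0, 5.0), method="bounded").x
```

The plug-in step is what makes the approach fast, and also what the paper's modification targets: naive two-stage estimation ignores the uncertainty in b_hat, which can bias the survival parameters.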
arXiv (Cornell University), 2020
Two-part joint models for a longitudinal semicontinuous biomarker and a terminal event have recently been introduced based on frequentist estimation. The biomarker distribution is decomposed into a probability of a positive value and the expected value among positive values. Shared random effects can represent the association structure between the biomarker and the terminal event. The computational burden increases compared to standard joint models with a single regression model for the biomarker. In this context, the frequentist estimation implemented in the R package frailtypack can be challenging for complex models (i.e., a large number of parameters and high-dimensional random effects). As an alternative, we propose a Bayesian estimation of two-part joint models based on the Integrated Nested Laplace Approximation (INLA) algorithm to alleviate the computational burden and fit more complex models. Our simulation studies confirm that INLA provides accurate approximations of the posterior estimates and reduces the computation time and variability of estimates compared to frailtypack in the situations considered. We contrast the Bayesian and frequentist approaches in the analysis of two randomized cancer clinical trials (GERCOR and PRIME studies), where INLA shows reduced variability for the association between the biomarker and the risk of the event. Moreover, the Bayesian approach was able to characterize subgroups of patients associated with different responses to treatment in the PRIME study. Our study suggests that the Bayesian approach using the INLA algorithm enables fitting complex joint models that might be of interest in a wide range of clinical applications.
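The two-part decomposition can be illustrated numerically: the marginal mean of a semicontinuous outcome factorizes as P(Y > 0) times E[Y | Y > 0]. A minimal simulation sketch, with logistic-style binary and log-normal continuous parts chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Semicontinuous outcome: a point mass at zero plus a positive distribution.
n = 100_000
p_pos = 0.6                 # P(Y > 0), the "binary part"
mu, sd = 1.0, 0.5           # log-scale mean/sd among positives (continuous part)
positive = rng.random(n) < p_pos
y = np.where(positive, np.exp(rng.normal(mu, sd, n)), 0.0)

# Marginal mean factorizes into the two parts:
# E[Y] = P(Y > 0) * E[Y | Y > 0], with lognormal mean exp(mu + sd^2 / 2).
expected = p_pos * np.exp(mu + sd**2 / 2)
empirical = y.mean()
```

In the joint model each part gets its own regression with shared random effects, which is why the parameter count (and the fitting cost) grows relative to a single-model biomarker specification.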
A stochastic joint model for longitudinal and survival data with cure patients
International Journal of Tomography and Simulation, 2009
Many medical investigations generate both repeatedly-measured (longitudinal) biomarker and survival data. A complex issue arises when investigating the association between longitudinal and time-to-event data when there are cured patients in the population, which leads to a plateau in the survival function S(t) after sufficient follow-up. Thus, the usual Cox proportional hazards model Cox (1972) is not applicable, since the proportional hazards assumption is violated. An alternative is to consider survival models incorporating a cure fraction. In this paper we present a new class of joint model for univariate longitudinal and survival data in the presence of a cure fraction. For the longitudinal model, a stochastic integrated Ornstein-Uhlenbeck process will be presented. For the survival model a semiparametric survival function will be considered which accommodates both zero and non-zero cure fractions of the dynamic disease progression. Moreover, we consider a Bayesian approach which is motivated by the complexity of the model.
Biometrics, 2008
In clinical studies, longitudinal biomarkers are often used to monitor disease progression and failure time. Joint modeling of longitudinal and survival data has certain advantages and has emerged as an effective way to mutually enhance information. Typically, a parametric longitudinal model is assumed to facilitate the likelihood approach. However, the choice of a proper parametric model turns out to be more elusive than models for standard longitudinal studies in which no survival endpoint occurs. In this article, we propose a nonparametric multiplicative random effects model for the longitudinal process, which has many applications and leads to a flexible yet parsimonious nonparametric random effects model. A proportional hazards model is then used to link the biomarkers and event time. We use B-splines to represent the nonparametric longitudinal process, and select the number of knots and degrees based on a version of the Akaike information criterion (AIC). Unknown model parameters are estimated through maximizing the observed joint likelihood, which is iteratively maximized by the Monte Carlo Expectation Maximization (MCEM) algorithm. Due to the simplicity of the model structure, the proposed approach has good numerical stability and compares well with the competing parametric longitudinal approaches. The new approach is illustrated with primary biliary cirrhosis (PBC) data, aiming to capture nonlinear patterns of serum bilirubin time courses and their relationship with survival time of PBC patients.
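The knot-selection step can be sketched with an ordinary least-squares spline fit scored by a Gaussian AIC (an illustrative simplification: the paper selects knots within the joint likelihood fitted by MCEM, not a simple regression, and the trajectory below is synthetic):

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(2)

# Synthetic nonlinear trajectory, a stand-in for a biomarker time course.
t = np.sort(rng.uniform(0.0, 10.0, 120))
y = np.sin(t) + 0.5 * t + rng.normal(0.0, 0.3, t.size)

def spline_aic(t, y, n_interior, degree=3):
    """Least-squares spline with equally spaced interior knots; Gaussian AIC
    (up to an additive constant) with one parameter per basis coefficient."""
    interior = np.linspace(t[0], t[-1], n_interior + 2)[1:-1]
    spl = LSQUnivariateSpline(t, y, interior, k=degree)
    rss = spl.get_residual()                 # residual sum of squares
    n_params = n_interior + degree + 1       # number of B-spline coefficients
    return t.size * np.log(rss / t.size) + 2 * n_params

# Choose the number of interior knots by minimizing AIC.
best = min(range(1, 9), key=lambda k: spline_aic(t, y, k))
```

The same trade-off drives the paper's selection: more knots reduce the residual but are penalized through the effective number of basis coefficients.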
A now common goal in medical research is to investigate the inter-relationships between a repeatedly measured biomarker, measured with error, and the time to an event of interest. This form of question can be tackled with a joint longitudinal-survival model, with the most common approach combining a longitudinal mixed effects model with a proportional hazards survival model, where the models are linked through shared random effects. In this article, we look at incorporating delayed entry (left truncation), which has received relatively little attention. The extension to delayed entry requires a second set of numerical integration, beyond that required in a standard joint model. We therefore implement two sets of fully adaptive Gauss-Hermite quadrature with nested Gauss-Kronrod quadrature (to allow time-dependent association structures), conducted simultaneously, to evaluate the likelihood. We evaluate fully adaptive quadrature compared with previously proposed non-adaptive quadrature through a simulation study, showing substantial improvements, both in terms of minimising bias and reducing computation time. We further investigate, through simulation, the consequences of misspecifying the longitudinal trajectory and its impact on estimates of association. In our scenarios the current value association structure proved very robust, whereas the rate of change was highly sensitive, showing that assuming a simpler trend when the truth is more complex can lead to substantial bias. With emphasis on flexible parametric approaches, we generalise previous models by proposing the use of polynomials or splines to capture the longitudinal trend and restricted cubic splines to model the baseline log hazard function.
The methods are illustrated on a dataset of breast cancer patients, modelling mammographic density jointly with survival, where we show how to incorporate density measurements prior to the at-risk period, to make use of all the available information. User-friendly Stata software is provided.
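The quadrature required by a time-dependent association can be illustrated by evaluating a cumulative hazard numerically; scipy's quad uses adaptive Gauss-Kronrod rules, analogous in spirit to the nested quadrature described above (the trajectory m(t) and all parameter values below are hypothetical):

```python
import numpy as np
from scipy.integrate import quad

# Time-dependent hazard: Weibull baseline scaled by the current value of a
# subject-specific trajectory m(t). The cumulative hazard has no closed form,
# so S(t) = exp(-H(t)) is obtained by adaptive Gauss-Kronrod quadrature.
shape, scale, alpha = 1.5, 2.0, 0.3
m = lambda t: 1.0 + 0.2 * t - 0.05 * t**2        # hypothetical trajectory

def hazard(t):
    h0 = (shape / scale) * (t / scale) ** (shape - 1.0)   # Weibull baseline
    return h0 * np.exp(alpha * m(t))                      # current-value link

def survival(t):
    H, _abserr = quad(hazard, 0.0, t)    # cumulative hazard by quadrature
    return np.exp(-H)
```

In the full likelihood this inner integral sits inside the Gauss-Hermite integral over the random effects, which is why the two quadrature layers must run simultaneously.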
Joint modelling of repeated measurement and time-to-event data: an introductory tutorial
International journal of epidemiology, 2015
The term 'joint modelling' is used in the statistical literature to refer to methods for simultaneously analysing longitudinal measurement outcomes, also called repeated measurement data, and time-to-event outcomes, also called survival data. A typical example from nephrology is a study in which the data from each participant consist of repeated estimated glomerular filtration rate (eGFR) measurements and time to initiation of renal replacement therapy (RRT). Joint models typically combine linear mixed effects models for repeated measurements and Cox models for censored survival outcomes. Our aim in this paper is to present an introductory tutorial on joint modelling methods, with a case study in nephrology. We describe the development of the joint modelling framework and compare the results with those obtained by the more widely used approaches of conducting separate analyses of the repeated measurements and survival times based on a linear mixed effects model and a Cox model...