Michael Crowther | University of Leicester

Papers by Michael Crowther

Joint longitudinal hurdle and time-to-event models: an application related to viral load and duration of the first treatment regimen in patients with HIV initiating therapy

Shared parameter joint models provide a framework under which a longitudinal response and a time to event can be modelled simultaneously. A common assumption has been that the longitudinal response is normally distributed. In this paper, we instead propose a joint model that incorporates a two-part ‘hurdle’ model for the longitudinal response, motivated in part by longitudinal response data that is subject to a detection limit. The first part of the hurdle model estimates the probability that the longitudinal response is observed above the detection limit, whilst the second part of the hurdle model estimates the mean of the response conditional on having exceeded the detection limit. The time-to-event outcome is modelled using a parametric proportional hazards model, assuming a Weibull baseline hazard. We propose a novel association structure whereby the current hazard of the event is assumed to be associated with the current combined (expected) outcome from the two parts of the hurdle model. We estimate our joint model under a Bayesian framework and provide code for fitting the model using the Bayesian software Stan. We use our model to estimate the association between HIV RNA viral load, which is subject to a lower detection limit, and the hazard of stopping or modifying treatment in patients with HIV initiating antiretroviral therapy.
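The two-part hurdle decomposition described in the abstract can be sketched numerically. This is a hedged illustration, not the authors' Stan code: the distribution, sample size, and detection limit below are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate log10 viral loads; values below an assumed assay detection
# limit are censored (all parameters here are illustrative).
n = 10_000
true_log_vl = rng.normal(loc=2.0, scale=1.5, size=n)
detection_limit = 1.7  # roughly log10(50 copies/mL), an assumed limit

above = true_log_vl > detection_limit
observed = np.where(above, true_log_vl, np.nan)  # censored below the limit

# Part 1 of the hurdle: probability of exceeding the detection limit
p_above = above.mean()
# Part 2: mean of the response conditional on exceeding the limit
mu_above = np.nanmean(observed)
# Combined (expected) outcome, the quantity the current hazard is linked to
combined = p_above * mu_above
print(p_above, mu_above, combined)
```

The combined quantity `p_above * mu_above` mirrors the association structure in which the hazard depends on the expected outcome from both hurdle parts.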

Joint modelling of longitudinal and survival data: Incorporating delayed entry and an assessment of model misspecification

Using meta-analysis to inform the design of subsequent studies of diagnostic test accuracy

Research Synthesis Methods, 2012

An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test, particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial evidence to underpin reliable clinical decision-making. Very few investigators consider any sample size calculations when designing a new diagnostic accuracy study. However, it is important to consider the number of subjects in a new study in order to achieve a precise measure of accuracy. Sutton et al. have previously suggested that when designing a new therapeutic trial, it could be more beneficial to consider the power of the updated meta-analysis including the new trial rather than that of the new trial itself. The methodology involves simulating new studies for a range of sample sizes and estimating the power of the updated meta-analysis with each new study added. Plotting the power values against the range of sample sizes allows the clinician to make an informed decision about the sample size of a new trial. This paper extends this approach from the trial setting and applies it to diagnostic accuracy studies. Several meta-analytic models are considered, including bivariate random effects meta-analysis that models the correlation between sensitivity and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
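The simulate-then-update procedure can be sketched for a simple fixed-effect meta-analysis of a mean difference, standing in for the bivariate diagnostic model used in the paper; every number below is an assumed illustration, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical existing studies: effect estimates and standard errors
effects = np.array([0.30, 0.15, 0.45])
ses = np.array([0.30, 0.35, 0.25])

true_effect, sigma = 0.25, 1.0  # assumed truth used to simulate new studies

def updated_meta_power(n_new, n_sims=2000):
    """Power of the updated fixed-effect meta-analysis after adding one
    simulated two-arm study with n_new subjects per arm."""
    hits = 0
    for _ in range(n_sims):
        se_new = sigma * np.sqrt(2 / n_new)        # SE of the new study's mean difference
        est_new = rng.normal(true_effect, se_new)  # simulate its estimate
        w = 1 / np.append(ses, se_new) ** 2        # inverse-variance weights
        est = np.append(effects, est_new)
        pooled = np.sum(w * est) / np.sum(w)
        pooled_se = 1 / np.sqrt(np.sum(w))
        hits += abs(pooled / pooled_se) > 1.96     # pooled CI excludes zero?
    return hits / n_sims

# Plotting these against n_new gives the power curve described above
powers = {n: updated_meta_power(n) for n in (25, 100, 400)}
print(powers)
```

Larger candidate studies shift the pooled estimate's precision more, so the estimated power rises with the proposed sample size, which is exactly the curve a clinician would inspect.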

Setting benchmark revision rates for total hip replacement: analysis of registry evidence

The use of restricted cubic splines to approximate complex hazard functions in the analysis of time-to-event data: a simulation study

Journal of Statistical Computation and Simulation, 2013

Joint modeling of longitudinal and survival data

The joint modelling of longitudinal and survival data has received remarkable attention in the methodological literature over the past decade; however, the availability of software to implement the methods lags behind. The most common form of joint model assumes that the association between the survival and longitudinal processes is underpinned by shared random effects. As a result, computationally intensive numerical integration techniques such as adaptive Gauss-Hermite quadrature are required to evaluate the likelihood. We describe a new user-written command, stjm, which allows the user to jointly model a continuous longitudinal response and the time to an event of interest. We assume a linear mixed effects model for the longitudinal submodel, allowing flexibility through the use of fixed and/or random fractional polynomials of time. Four choices are available for the survival submodel: the exponential, Weibull or Gompertz proportional hazards models, and the flexible parametric model (stpm2). Flexible parametric models are fitted on the log cumulative hazard scale, which has direct computational benefits as it avoids the use of numerical integration to evaluate the cumulative hazard. We describe the features of stjm through application to a dataset investigating the effect of serum bilirubin level on time to death from any cause in 312 patients with primary biliary cirrhosis.

Simulating complex survival data

Simulation studies are essential for understanding and evaluating both current and new statistical models. When simulating survival times, an exponential or Weibull distribution is often assumed for the baseline hazard function, with survival times generated using the method of Bender et al. (2005). Assuming a constant or monotonic hazard can be too simplistic and can lack biological plausibility in many situations. We describe a new user-written command, survsim, which allows the user to simulate survival times from two-component parametric mixture models, allowing much more flexibility in the underlying hazard. Standard parametric distributions can also be used, including the exponential, Weibull and Gompertz. Furthermore, survival times can be simulated from the all-cause distribution of cause-specific hazards for competing risks, using the method of Beyersmann et al. (2009). A multinomial distribution is used to create the event indicator, whereby the probability of experiencing each event at a simulated time t is the cause-specific hazard divided by the all-cause hazard evaluated at time t. Baseline covariates can be included in all scenarios. We also describe the extension to incorporate non-proportional hazards in standard parametric and competing risks scenarios.
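The Bender et al. (2005) inversion that survsim builds on can be written directly for a Weibull baseline hazard. A minimal numpy sketch, with all parameter values illustrative:

```python
import numpy as np
from math import gamma as gamma_fn

rng = np.random.default_rng(0)

def sim_weibull(n, lam, gam, linpred=0.0):
    """Draw survival times from a Weibull proportional hazards model,
    S(t) = exp(-lam * exp(linpred) * t**gam), by inverting S(T) = U
    for U ~ Uniform(0, 1)."""
    u = rng.uniform(size=n)
    return (-np.log(u) / (lam * np.exp(linpred))) ** (1 / gam)

lam, gam = 0.1, 1.5
t = sim_weibull(100_000, lam, gam)

# Check against the analytic mean: E[T] = Gamma(1 + 1/gam) * lam**(-1/gam)
theo_mean = gamma_fn(1 + 1 / gam) * lam ** (-1 / gam)
print(round(t.mean(), 2), round(theo_mean, 2))
```

A two-component mixture survival function, as in the abstract, replaces `S(t)` with a weighted sum of two such terms, at the cost of losing this closed-form inverse.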

A general framework for parametric survival analysis

Statistics in Medicine, 2014

Parametric survival models are increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time and obtain absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach which substantially improves the estimation process compared with using numerical integration alone. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
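The property exploited above, that a restricted cubic spline is linear beyond the boundary knots (which is what makes the cumulative hazard analytically tractable out there), is easy to verify numerically. A minimal sketch of the standard truncated-power basis, not the authors' Stata code; knot locations are illustrative:

```python
import numpy as np

def rcs_basis(x, knots):
    """Restricted cubic spline basis: one linear term plus K-2 nonlinear
    terms, constructed so every column is linear beyond the boundary knots."""
    x = np.asarray(x, dtype=float)
    k = np.asarray(knots, dtype=float)
    pos3 = lambda z: np.clip(z, 0.0, None) ** 3  # truncated cubic (z)_+^3
    cols = [x]  # linear term
    for j in range(len(k) - 2):
        lam = (k[-1] - k[j]) / (k[-1] - k[-2])
        cols.append(pos3(x - k[j]) - lam * pos3(x - k[-2])
                    + (lam - 1) * pos3(x - k[-1]))
    return np.column_stack(cols)

knots = [1.0, 2.0, 3.5, 5.0]    # illustrative knot positions
xs = np.array([6.0, 7.0, 8.0])  # all beyond the last knot
B = rcs_basis(xs, knots)

# Zero second differences => each basis column is linear beyond the
# boundary knot, so the log hazard (and hence the cumulative hazard)
# extrapolates linearly there
second_diff = B[2] - 2 * B[1] + B[0]
print(np.allclose(second_diff, 0.0))  # True
```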

Simulating biologically plausible complex survival data

Statistics in Medicine, 2013

Simulation studies are conducted to assess the performance of current and novel statistical models in pre-defined scenarios. It is often desirable that chosen simulation scenarios accurately reflect a biologically plausible underlying distribution. This is particularly important in the framework of survival analysis, where simulated distributions are chosen for both the event time and the censoring time. This paper develops methods for using complex distributions when generating survival times to assess methods in practice. We describe a general algorithm involving numerical integration and root-finding techniques to generate survival times from a variety of complex parametric distributions, incorporating any combination of time-dependent effects, time-varying covariates, delayed entry, random effects and covariates measured with error. User-friendly Stata software is provided. Copyright © 2013 John Wiley & Sons, Ltd.
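The integrate-and-invert idea behind the algorithm can be sketched with numpy alone: build the cumulative hazard numerically on a grid, then solve H(T) = -log(U) by interpolation. A Weibull hazard is used here only so the result can be checked analytically; any positive hazard function would do, which is the point of the approach.

```python
import numpy as np
from math import gamma as gamma_fn

rng = np.random.default_rng(3)

def simulate_times(hazard, n, t_max=200.0, grid=20_000):
    """Generate survival times from an arbitrary hazard function by
    numerically integrating the cumulative hazard (trapezoidal rule)
    and inverting S(T) = U, i.e. solving H(T) = -log(U)."""
    t = np.linspace(0.0, t_max, grid)
    h = hazard(t)
    # cumulative hazard H(t) via the trapezoidal rule
    H = np.concatenate([[0.0],
                        np.cumsum((h[1:] + h[:-1]) / 2 * np.diff(t))])
    target = -np.log(rng.uniform(size=n))
    return np.interp(target, H, t)  # H is monotone, so this inverts it

lam, gam = 0.1, 1.5
times = simulate_times(lambda t: lam * gam * t ** (gam - 1), 50_000)

# Compare with the analytic Weibull mean as a sanity check
theo_mean = gamma_fn(1 + 1 / gam) * lam ** (-1 / gam)
print(round(times.mean(), 2), round(theo_mean, 2))
```

Time-dependent effects or time-varying covariates simply change the `hazard` function passed in; the integration and inversion steps are unchanged.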

Flexible parametric joint modelling of longitudinal and survival data

Statistics in Medicine

The joint modelling of longitudinal and survival data is a highly active area of biostatistical research. The submodel for the longitudinal biomarker usually takes the form of a linear mixed effects model. We describe a flexible parametric approach for the survival submodel that models the log baseline cumulative hazard using restricted cubic splines. This approach overcomes limitations of standard parametric choices for the survival submodel, which can lack the flexibility to effectively capture the shape of the underlying hazard function. Numerical integration techniques, such as Gauss–Hermite quadrature, are usually required to evaluate both the cumulative hazard and the overall joint likelihood; however, by using a flexible parametric model, the cumulative hazard has an analytically tractable form, providing considerable computational benefits. We conduct an extensive simulation study to assess the proposed model, comparing it with a B-spline formulation, illustrating insensitivity of parameter estimates to the baseline cumulative hazard function specification. Furthermore, we compare non-adaptive and fully adaptive quadrature, showing the superiority of adaptive quadrature in evaluating the joint likelihood. We also describe a useful technique to simulate survival times from complex baseline hazard functions and illustrate the methods using an example data set investigating the association between longitudinal prothrombin index and survival of patients with liver cirrhosis, showing greater flexibility and improved stability with fewer parameters under the proposed model compared with the B-spline approach. We provide user-friendly Stata software. Copyright © 2012 John Wiley & Sons, Ltd.

Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis

Statistics in Medicine, 2014

Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss–Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
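Gauss–Hermite quadrature, as used here to integrate normally distributed random effects out of the likelihood, can be illustrated with a single random effect; the closed-form check below uses the log-normal mean, and the node count and variance are illustrative choices.

```python
import numpy as np

# Gauss-Hermite nodes and weights approximate integrals of the form
# int exp(-x^2) f(x) dx, so for b ~ N(0, sigma^2) we substitute
# b = sqrt(2) * sigma * x and divide by sqrt(pi).
nodes, weights = np.polynomial.hermite.hermgauss(15)

def expect_over_random_effect(g, sigma):
    """Approximate E[g(b)] for b ~ N(0, sigma^2) by Gauss-Hermite quadrature."""
    b = np.sqrt(2.0) * sigma * nodes
    return np.sum(weights * g(b)) / np.sqrt(np.pi)

# Check against a closed form: E[exp(b)] = exp(sigma^2 / 2),
# the mean of a log-normal frailty term
sigma = 0.8
approx = expect_over_random_effect(np.exp, sigma)
exact = np.exp(sigma ** 2 / 2)
print(approx, exact)
```

Adaptive quadrature, as compared in the paper, additionally recentres and rescales the nodes around each cluster's posterior mode, which keeps the node count small even when the random effect distribution is far from the prior.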

stgenreg: A Stata package for general parametric survival analysis

Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group

Statistics in Medicine, 2014

Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes could potentially improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has recently been addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. Copyright © 2014 John Wiley & Sons, Ltd.

Adjusting for treatment switching in randomised controlled trials – A simulation study and a simplified two-stage method

Individual patient data meta-analysis of survival data using Poisson regression models

BMC Medical Research Methodology

Background

An Individual Patient Data (IPD) meta-analysis is often considered the gold standard for synthesising survival data from clinical trials. An IPD meta-analysis can be achieved by either a two-stage or a one-stage approach, depending on whether the trials are analysed separately or simultaneously. A range of one-stage hierarchical Cox models have been previously proposed, but these are known to be computationally intensive and are not currently available in all standard statistical software. We describe an alternative approach using Poisson-based Generalised Linear Models (GLMs).

Methods

We illustrate, through application and simulation, the Poisson approach both classically and in a Bayesian framework, in both two-stage and one-stage settings. We outline the benefits of our one-stage approach through extension to modelling treatment-covariate interactions and non-proportional hazards. Ten trials of hypertension treatment, with all-cause death as the outcome of interest, are used to apply and assess the approach.

Results

We show that the Poisson approach obtains estimates almost identical to those of the Cox model, is computationally efficient, and directly estimates the baseline hazard. Some downward bias is observed in classical estimates of the heterogeneity in the treatment effect, with improved performance from the Bayesian approach.

Conclusion

Our approach provides a highly flexible and computationally efficient framework, available in all standard statistical software, for investigating not only heterogeneity but also the presence of non-proportional hazards and treatment effect modifiers.
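The equivalence the Poisson approach rests on (a Poisson model for the event indicator with a log person-time offset reproduces the exponential, and with split follow-up the piecewise-exponential, hazard model) can be sketched for a single binary treatment covariate, where the Poisson MLE of the log hazard ratio has a closed form. All parameter values below are assumed for illustration, not taken from the hypertension trials.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a two-arm trial with exponential event times and uniform
# administrative censoring (illustrative parameters throughout)
n = 20_000
trt = rng.integers(0, 2, size=n)
rate = 0.2 * np.exp(-0.5 * trt)      # true log hazard ratio = -0.5
t_event = rng.exponential(1 / rate)
t_cens = rng.uniform(0, 10, size=n)
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(int)

# A Poisson GLM of `event` on treatment with offset log(time) has the
# closed-form estimate log((d1/y1) / (d0/y0)): events over person-time
d1, y1 = event[trt == 1].sum(), time[trt == 1].sum()
d0, y0 = event[trt == 0].sum(), time[trt == 0].sum()
log_hr = np.log((d1 / y1) / (d0 / y0))
print(round(log_hr, 2))  # close to the true value of -0.5
```

Splitting each subject's follow-up into intervals and adding interval-specific intercepts turns this into the piecewise-exponential model that approximates the Cox partial-likelihood estimates so closely.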

Adjusting Survival Time Estimates to Account for Treatment Switching in Randomized Controlled Trials – an Economic Evaluation Context: Methods, Limitations, and Recommendations

Medical Decision Making: An International Journal of the Society for Medical Decision Making, 2014

Adjusting for measurement error in baseline prognostic biomarkers included in a time-to-event analysis: a joint modelling approach

BMC Medical Research Methodology, 2013

Simulation Based Sample Size Calculation for Designing New Clinical Trials and Diagnostic Test Accuracy Studies to Update an Existing Meta-analysis

When is a further clinical trial justified?

What is the clinical effectiveness and cost-effectiveness of using drugs in treating obese patients in primary care? A systematic review

Health Technology Assessment, 2012

Research paper thumbnail of Joint longitudinal hurdle and time-to-event models: an application related to viral load and duration of the first treatment regimen in patients with HIV initiating therapy

Shared parameter joint models provide a framework under which a longitudinal response and a time ... more Shared parameter joint models provide a framework under which a longitudinal response and a time to event can be modelled simultaneously. A common assumption in shared parameter joint models has been to assume that the longitudinal response is normally distributed. In this paper, we instead propose a joint model that incorporates a two-part ‘hurdle’ model for the longitudinal response, motivated in part by longitudinal response data that is subject to a detection limit. The first part of the hurdle model estimates the probability that the longitudinal response is observed above the detection limit, whilst the second part of the hurdle model estimates the mean of the response conditional on having exceeded the detection limit. The time-to-event outcome is modelled using a parametric proportional hazards model, assuming a Weibull baseline hazard. We propose a novel association structure whereby the current hazard of the event is assumed to be associated with the current combined (expected) outcome from the two parts of the hurdle model. We estimate our joint model under a Bayesian framework and provide code for fitting the model using the Bayesian software Stan. We use our model to estimate the association between HIV RNA viral load, which is subject to a lower detection limit, and the hazard of stopping or modifying treatment in patients with HIV initiating antiretroviral therapy.

Research paper thumbnail of Joint modelling of longitudinal and survival data: Incorporating delayed entry and an assessment of model misspecification

Research paper thumbnail of Using meta-analysis to inform the design of subsequent studies of diagnostic test accuracy

Research Synthesis Methods, 2012

An individual diagnostic accuracy study rarely provides enough information to make conclusive rec... more An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial evidence to underpin reliable clinical decision-making. Very few investigators consider any sample size calculations when designing a new diagnostic accuracy study. However, it is important to consider the number of subjects in a new study in order to achieve a precise measure of accuracy.Sutton et al. have suggested previously that when designing a new therapeutic trial, it could be more beneficial to consider the power of the updated meta-analysis including the new trial rather than of the new trial itself. The methodology involves simulating new studies for a range of sample sizes and estimating the power of the updated meta-analysis with each new study added. Plotting the power values against the range of sample sizes allows the clinician to make an informed decision about the sample size of a new trial. This paper extends this approach from the trial setting and applies it to diagnostic accuracy studies. Several meta-analytic models are considered including bivariate random effects meta-analysis that models the correlation between sensitivity and specificity. Copyright © 2012 John Wiley & Sons, Ltd.

Research paper thumbnail of Setting benchmark revision rates for total hip replacement: analysis of registry evidence

Research paper thumbnail of The use of restricted cubic splines to approximate complex hazard functions in the analysis of time-to-event data: a simulation study

Journal of Statistical Computation and Simulation, 2013

Research paper thumbnail of Joint modeling of longitudinal and survival data

The joint modelling of longitudinal and survival data has received remarkable attention in the me... more The joint modelling of longitudinal and survival data has received remarkable attention in the methodological literature over the past decade; however, the availability of software to implement the methods lags behind. The most common form of joint model assumes that the association between the survival and longitudinal processes are underlined by shared random effects. As a result, computationally intensive numerical integration techniques such as adaptive Gauss-Hermite quadrature are required to evaluate the likelihood. We describe a new user written command, stjm, which allows the user to jointly model a continuous longitudinal response and the time to an event of interest. We assume a linear mixed effects model for the longitudinal submodel, allowing flexibility through the use of fixed and/or random fractional polynomials of time. Four choices are available for the survival submodel; namely the exponential, Weibull or Gompertz proportional hazard models, and the flexible parametric model (stpm2). Flexible parametric models are fitted on the log cumulative hazard scale which has direct computational benefits as it avoids the use of numerical integration to evaluate the cumulative hazard. We describe the features of stjm through application to a dataset investigating the effect of serum bilirubin level on time to death from any cause, in 312 patients with primary biliary cirrhosis.

Research paper thumbnail of Simulating complex survival data

Simulation studies are essential for understanding and evaluating both current and new statistica... more Simulation studies are essential for understanding and evaluating both current and new statistical models. When simulating survival times, often an exponential or Weibull distribution is assumed for the baseline hazard function, with survival times generated using the method of Bender et al. (2005). Assuming a constant or monotonic hazard can be considered too simplistic and lack biological plausibility in many situations. We describe a new user written command, survsim, which allows the user to simulate survival times from 2-component parametric mixture models, allowing much more flexibility in the underlying hazard. Standard parametric distributions can also be used including the exponential, Weibull and Gompertz. Furthermore, survival times can be simulated from the all-cause distribution of cause-specific hazards for competing risks, using the method of Beyersmann et al. (2009). A multinomial distribution is used to create the event indicator, whereby the probability of experiencing each event at a simulated time t is the cause-specific hazard divided by the all-cause hazard evaluated at time t. Baseline covariates can be included in all scenarios. We also describe the extension to incorporate non-proportional hazards in standard parametric and competing risks scenarios.

Research paper thumbnail of A general framework for parametric survival analysis

Statistics in Medicine, 2014

Parametric survival models are being increasingly used as an alternative to the Cox model in biom... more Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.

Research paper thumbnail of Simulating biologically plausible complex survival data

Statistics in Medicine, 2013

Simulation studies are conducted to assess the performance of current and novel statistical model... more Simulation studies are conducted to assess the performance of current and novel statistical models in pre-defined scenarios. It is often desirable that chosen simulation scenarios accurately reflect a biologically plausible underlying distribution. This is particularly important in the framework of survival analysis, where simulated distributions are chosen for both the event time and the censoring time. This paper develops methods for using complex distributions when generating survival times to assess methods in practice. We describe a general algorithm involving numerical integration and root-finding techniques to generate survival times from a variety of complex parametric distributions, incorporating any combination of time-dependent effects, time-varying covariates, delayed entry, random effects and covariates measured with error. User-friendly Stata software is provided. Copyright © 2013 John Wiley & Sons, Ltd.

Research paper thumbnail of Flexible parametric joint modelling of longitudinal and survival data

Statistics in Medicine

The joint modelling of longitudinal and survival data is a highly active area of biostatistical r... more The joint modelling of longitudinal and survival data is a highly active area of biostatistical research. The submodel for the longitudinal biomarker usually takes the form of a linear mixed effects model. We describe a flexible parametric approach for the survival submodel that models the log baseline cumulative hazard using restricted cubic splines. This approach overcomes limitations of standard parametric choices for the survival submodel, which can lack the flexibility to effectively capture the shape of the underlying hazard function. Numerical integration techniques, such as Gauss–Hermite quadrature, are usually required to evaluate both the cumulative hazard and the overall joint likelihood; however, by using a flexible parametric model, the cumulative hazard has an analytically tractable form, providing considerable computational benefits. We conduct an extensive simulation study to assess the proposed model, comparing it with a B-spline formulation, illustrating insensitivity of parameter estimates to the baseline cumulative hazard function specification. Furthermore, we compare non-adaptive and fully adaptive quadrature, showing the superiority of adaptive quadrature in evaluating the joint likelihood. We also describe a useful technique to simulate survival times from complex baseline hazard functions and illustrate the methods using an example data set investigating the association between longitudinal prothrombin index and survival of patients with liver cirrhosis, showing greater flexibility and improved stability with fewer parameters under the proposed model compared with the B-spline approach. We provide user-friendly Stata software. Copyright © 2012 John Wiley & Sons, Ltd.

Research paper thumbnail of Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis

Statistics in medicine, 2014

Multilevel mixed effects survival models are used in the analysis of clustered survival data, suc... more Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss–Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.

Research paper thumbnail of stgenreg: A Stata package for general parametric survival analysis

Research paper thumbnail of Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group

Statistics in Medicine, 2014

Explicitly modeling underlying relationships between a survival endpoint and processes that gener... more Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes potentially could improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has been recently addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. Copyright © 2014 John Wiley & Sons, Ltd.

Research paper thumbnail of Adjusting for treatment switching in randomised controlled trials – A simulation study and a simplified two-stage method

Research paper thumbnail of Individual patient data meta-analysis of survival data using Poisson regression models

BMC medical research methodology

Background

An Individual Patient Data (IPD) meta-analysis is often considered the gold standard for synthesising survival data from clinical trials. An IPD meta-analysis can be achieved by either a two-stage or a one-stage approach, depending on whether the trials are analysed separately or simultaneously. A range of one-stage hierarchical Cox models have been previously proposed, but these are known to be computationally intensive and are not currently available in all standard statistical software. We describe an alternative approach using Poisson-based Generalised Linear Models (GLMs).

Methods

We illustrate, through application and simulation, the Poisson approach both classically and in a Bayesian framework, in two-stage and one-stage approaches. We outline the benefits of our one-stage approach through extension to modelling treatment-covariate interactions and non-proportional hazards. Ten trials of hypertension treatment, with all-cause death the outcome of interest, are used to apply and assess the approach.

Results

We show that the Poisson approach yields estimates almost identical to those from the Cox model, is computationally efficient, and directly estimates the baseline hazard. Some downward bias is observed in classical estimates of the heterogeneity in the treatment effect, with improved performance from the Bayesian approach.

Conclusion

Our approach provides a highly flexible and computationally efficient framework, available in all standard statistical software, for investigating not only heterogeneity but also non-proportional hazards and treatment effect modifiers.
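The core equivalence behind the approach — a Poisson GLM with the log of follow-up time as an offset has the same likelihood kernel as a parametric PH survival model — can be sketched in a few lines. The single-trial exponential setup below is a minimal illustration, not the paper's multi-trial spline-based models, and all numeric values are assumptions for demonstration (the Newton–Raphson loop stands in for any standard GLM fitter):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = rng.binomial(1, 0.5, n).astype(float)        # treatment arm indicator
beta_true = 0.5                                  # true log hazard ratio
t = rng.exponential(1.0 / np.exp(-1.0 + beta_true * x))  # survival times
c = rng.exponential(2.0, n)                      # censoring times
time = np.minimum(t, c)
d = (t <= c).astype(float)                       # event indicator

# Poisson GLM with log follow-up as offset: log E[d] = log(time) + X @ beta.
# For an exponential PH model this reproduces the survival likelihood.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):                              # Newton-Raphson / IRLS
    mu = np.exp(np.log(time) + X @ beta)
    grad = X.T @ (d - mu)
    hess = (X * mu[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

print(beta)  # beta[1] estimates the log hazard ratio
```

Splitting follow-up into intervals and adding interval-specific intercepts (or splines) extends the same trick beyond the exponential model, which is how the flexible baseline hazard in the paper arises.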

Research paper thumbnail of Adjusting Survival Time Estimates to Account for Treatment Switching in Randomized Controlled Trials--an Economic Evaluation Context: Methods, Limitations, and Recommendations

Medical decision making : an international journal of the Society for Medical Decision Making, 2014

Research paper thumbnail of Adjusting for measurement error in baseline prognostic biomarkers included in a time-to-event analysis: a joint modelling approach

BMC Medical Research Methodology, 2013

Research paper thumbnail of Simulation Based Sample Size Calculation for Designing New Clinical Trials and Diagnostic Test Accuracy Studies to Update an Existing Meta-analysis

Research paper thumbnail of When is a further clinical trial justified?

Research paper thumbnail of What is the clinical effectiveness and cost-effectiveness of using drugs in treating obese patients in primary care? A systematic review

Health technology assessment (Winchester, England), Jan 1, 2012

Research paper thumbnail of Multilevel mixed effects parametric survival analysis

With the release of Stata 14 came the mestreg command to fit multilevel mixed effects parametric survival models, assuming normally distributed random effects, estimated with maximum likelihood utilising Gaussian quadrature. In this article, I present the user-written stmixed command, which serves as both an alternative and a complement to mestreg for fitting multilevel parametric survival models. The key extensions include incorporation of the flexible parametric Royston–Parmar survival model and the ability to fit multilevel relative survival models. The methods are illustrated with a commonly used dataset of patients with kidney disease suffering recurrent infections, and with a simulated example illustrating a simple approach to simulating clustered survival data using survsim (Crowther and Lambert 2012, 2013).
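survsim is a Stata package; as a language-neutral sketch of the simulation approach it implements for clustered data — inverting the Weibull survival function with a cluster-level random intercept added to the linear predictor — here is a minimal Python version. The function name and all parameter values are illustrative assumptions:

```python
import numpy as np

def sim_clustered_weibull(n_clusters, n_per, lam, gam, beta, sigma, seed=0):
    """Simulate clustered Weibull PH survival times with a normally
    distributed cluster-level random intercept, by inverting the
    survival function: T = (-log U / (lam * exp(eta)))**(1/gam)."""
    rng = np.random.default_rng(seed)
    b = rng.normal(0.0, sigma, n_clusters)        # cluster frailties
    cluster = np.repeat(np.arange(n_clusters), n_per)
    x = rng.binomial(1, 0.5, n_clusters * n_per)  # binary covariate
    eta = beta * x + b[cluster]                   # linear predictor
    u = rng.uniform(size=n_clusters * n_per)
    t = (-np.log(u) / (lam * np.exp(eta))) ** (1.0 / gam)
    return cluster, x, t

# 50 clusters of 20 subjects, hazard ratio exp(-0.5), frailty SD 0.5
cluster, x, t = sim_clustered_weibull(50, 20, lam=0.1, gam=1.5,
                                      beta=-0.5, sigma=0.5)
```

Censoring can then be imposed by drawing independent censoring times and taking the minimum, exactly as one would before fitting the simulated data back with stmixed or mestreg.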