Boosting Algorithms: Regularization, Prediction and Model Fitting

Open Access


Peter Bühlmann, Torsten Hothorn

Statist. Sci. 22(4): 477-505 (November 2007). DOI: 10.1214/07-STS242

Abstract

We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selection in high-dimensional covariate spaces, are discussed as well.
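As a brief orientation (a sketch in the spirit of the generic functional gradient descent algorithm discussed in the paper, not quoted from the abstract), one boosting iteration evaluates the negative gradient of the loss ρ at the current fit and takes a small step of length ν along the base procedure's fit to it:

```latex
\[
U_i \;=\; -\left.\frac{\partial}{\partial f}\,\rho(Y_i, f)\right|_{f = \hat f^{[m-1]}(X_i)},
\qquad i = 1,\dots,n,
\]
\[
\hat f^{[m]}(\cdot) \;=\; \hat f^{[m-1]}(\cdot) \;+\; \nu\,\hat g^{[m]}(\cdot),
\qquad 0 < \nu \le 1,
\]
```

where ĝ^[m] denotes the base procedure fitted to the pairs (X_i, U_i). Stopping the iteration at some m_stop, for instance via the information criteria mentioned above, provides the regularization emphasized in the abstract.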

The practical aspects of boosting procedures for fitting statistical models are illustrated by means of the dedicated open-source software package mboost. This package implements functions which can be used for model fitting, prediction and variable selection. It is flexible, allowing for the implementation of new boosting algorithms optimizing user-specified loss functions.
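For illustration, here is a minimal sketch of how such a fit might look with mboost. The data are simulated and the specific arguments are assumptions based on the package's documented interface (glmboost, boost_control, AIC, mstop), not excerpts from the paper:

```r
library("mboost")

## simulated toy data (hypothetical; only x1 and x2 carry signal)
set.seed(1)
n <- 100
x <- matrix(rnorm(n * 10), nrow = n,
            dimnames = list(NULL, paste("x", 1:10, sep = "")))
dat <- data.frame(y = 2 * x[, "x1"] - x[, "x2"] + rnorm(n), x)

## componentwise linear base learners with squared-error loss;
## step length nu and an initial number of boosting iterations mstop
fit <- glmboost(y ~ ., data = dat, family = Gaussian(),
                control = boost_control(mstop = 200, nu = 0.1))

## corrected AIC as a stopping criterion; truncate the model
## at the AIC-optimal number of boosting iterations
aic <- AIC(fit, method = "corrected")
fit <- fit[mstop(aic)]

coef(fit)                        # coefficients of the selected covariates
pred <- predict(fit, newdata = dat)
```

User-specified loss functions can, in the same spirit, be supplied through the package's Family() constructor by providing the loss and its negative gradient.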

Citation


Peter Bühlmann, Torsten Hothorn. "Boosting Algorithms: Regularization, Prediction and Model Fitting." Statist. Sci. 22(4): 477-505, November 2007. https://doi.org/10.1214/07-STS242

Information

Published: November 2007

First available in Project Euclid: 7 April 2008

Digital Object Identifier: 10.1214/07-STS242

Keywords: generalized additive models, generalized linear models, gradient boosting, software, survival analysis, variable selection

Rights: Copyright © 2007 Institute of Mathematical Statistics
