Parallelizing AdaBoost by weights dynamics
Computational Statistics & Data Analysis, 2007
Abstract
AdaBoost is one of the most popular classification methods. In contrast to other ensemble methods (e.g., Bagging), AdaBoost is inherently sequential. In many data-intensive real-world situations this may limit the practical applicability of the method. P-AdaBoost is a novel scheme for the parallelization of AdaBoost, which builds upon earlier results concerning the dynamics of AdaBoost weights. P-AdaBoost yields approximations to the standard AdaBoost models that can be easily and efficiently distributed over a network of computing nodes. Properties of P-AdaBoost as a stochastic minimizer of the AdaBoost cost functional are discussed. Experiments are reported on both synthetic and benchmark data sets.
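To see why standard AdaBoost is inherently sequential, note that each round's example weights depend on the previous round's weak hypothesis. The following is a minimal sketch of standard (sequential) AdaBoost with decision stumps, not of the paper's P-AdaBoost scheme; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def stump_learn(X, y, w):
    """Illustrative weak learner: brute-force 1-D decision stump
    minimizing the weighted 0/1 error (labels in {-1, +1})."""
    best_err, best_params = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] <= t, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best_params = err, (j, t, s)
    j, t, s = best_params
    return lambda Xq: s * np.where(Xq[:, j] <= t, 1, -1)

def adaboost(X, y, weak_learn, T):
    """Standard AdaBoost: the round-(t+1) weights are a function of the
    round-t weights, which is the sequential bottleneck P-AdaBoost targets."""
    n = len(y)
    w = np.full(n, 1.0 / n)               # uniform initial example weights
    models, alphas = [], []
    for _ in range(T):
        h = weak_learn(X, y, w)           # fit weak learner on weighted data
        pred = h(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)    # exponential reweighting
        w /= w.sum()                      # normalize: couples round t+1 to round t
        models.append(h)
        alphas.append(alpha)
    def ensemble(Xq):
        return np.sign(sum(a * h(Xq) for a, h in zip(alphas, models)))
    return ensemble
```

The update `w *= exp(-alpha * y * pred)` is precisely the weight dynamics the abstract refers to: because the weights at round t+1 cannot be computed before the weak hypothesis at round t is known, the rounds cannot be trivially run in parallel.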