Ensemble of Fast Learning Stochastic Gradient Boosting
Document Type
Article
Publication Date
7-29-2019
Publication Title
Communications in Statistics: Simulation and Computation
Abstract
Boosting is one of the most popular and powerful learning algorithms. However, due to its sequential nature in model fitting, the computational time of the boosting algorithm can be prohibitive for big data analysis. In this paper, we propose a parallel framework for the boosting algorithm, called Ensemble of Fast Learning Stochastic Gradient Boosting (EFLSGB). The proposed EFLSGB is well suited for parallel execution and can therefore substantially reduce the computational time. Analysis of simulated and real datasets demonstrates that EFLSGB achieves highly competitive prediction accuracy in comparison with gradient tree boosting.
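The abstract does not spell out the EFLSGB procedure, so the following is only a hedged sketch of the general idea it describes: fit several small, aggressively subsampled ("fast learning") stochastic gradient boosting models independently and in parallel, then average their predictions. The learner settings, ensemble size, and the use of scikit-learn and joblib here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch only: illustrates an ensemble of fast-learning
# stochastic gradient boosting models fitted in parallel and averaged.
# Parameter choices are assumptions, not the paper's specification.
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split


def fit_fast_sgb(X, y, seed):
    """Fit one small, subsampled SGB learner ("fast learning" member)."""
    model = GradientBoostingRegressor(
        n_estimators=50,    # few boosting iterations keeps each member fast
        learning_rate=0.1,
        subsample=0.5,      # row subsampling = stochastic gradient boosting
        max_depth=3,
        random_state=seed,
    )
    return model.fit(X, y)


def eflsgb_predict(models, X):
    """Ensemble prediction: average the member predictions."""
    return np.mean([m.predict(X) for m in models], axis=0)


if __name__ == "__main__":
    X, y = make_friedman1(n_samples=2000, noise=1.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Members are trained independently, so they can be fitted in parallel.
    models = Parallel(n_jobs=-1)(
        delayed(fit_fast_sgb)(X_tr, y_tr, seed) for seed in range(10)
    )
    mse = mean_squared_error(y_te, eflsgb_predict(models, X_te))
    print(f"Ensemble test MSE: {mse:.3f}")
```

Because the members do not depend on one another, the wall-clock training time scales down with the number of workers, which is the parallelism advantage the abstract attributes to EFLSGB over a single sequentially fitted boosting model.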
First Page
40
Last Page
52
PubMed ID
35682375
Volume
51
Issue
1
Recommended Citation
Li, Bin; Yu, Qingzhao; and Peng, Lu, "Ensemble of Fast Learning Stochastic Gradient Boosting" (2019). School of Public Health Faculty Publications. 272.
https://digitalscholar.lsuhsc.edu/soph_facpubs/272
DOI
10.1080/03610918.2019.1645170