Gradient Boosting
Gradient boosting is a popular technique for improving the accuracy of weak learners.
Boosting: sequentially add weak learners, each new one compensating for the shortcomings of the existing ensemble.
Gradient: use the gradient of the loss to identify the shortcomings of the existing weak learners.
Algorithm
- At iteration i: model is F_i(x), response is y, residual is r_i = y - F_i(x)
(1) let the new response be the residual: r_i = y - F_i(x)
(2) fit a regression tree h_i(x) to the data (x, r_i) - Boosting concept: compensate the shortcomings of the existing weak learners
(3) Gradient concept:
Loss function L(y, F(x)) = (y - F(x))^2 / 2,
Cost function J = Σ_j L(y_j, F(x_j)); treat F(x_j) as a parameter and take derivatives:
∂J/∂F(x_j) = F(x_j) - y_j = -r_i, which is the negative residual - the shortcoming. Hence update F by the negative gradient:
F_{i+1}(x) = F_i(x) - ∂J/∂F(x) = F_i(x) + h_i(x), where h_i(x) approximates the residual r_i
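The loop above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: it substitutes a hand-rolled depth-1 tree (a stump) for the regression tree in step (2), and the names fit_stump, stump_predict, gradient_boost, and the toy target y = x^2 are all assumptions made for the example.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r on a 1-D feature x."""
    best_sse, best = np.inf, None
    for t in np.unique(x)[1:]:                 # candidate split thresholds
        left = x < t
        lmean, rmean = r[left].mean(), r[~left].mean()
        sse = ((r[left] - lmean) ** 2).sum() + ((r[~left] - rmean) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (t, lmean, rmean)
    return best

def stump_predict(stump, x):
    t, lmean, rmean = stump
    return np.where(x < t, lmean, rmean)

def gradient_boost(x, y, n_iter=300, lr=0.1):
    F = np.full_like(y, y.mean())              # start from the constant model F_0
    stumps = []
    for _ in range(n_iter):
        r = y - F                              # residual = negative gradient of squared loss
        stump = fit_stump(x, r)                # step (2): weak learner fitted to the shortcoming
        F = F + lr * stump_predict(stump, x)   # step (3): F_{i+1} = F_i + lr * h_i
        stumps.append(stump)
    return y.mean(), stumps

# Toy check on y = x^2
x = np.linspace(-1.0, 1.0, 100)
y = x ** 2
base, stumps = gradient_boost(x, y)
pred = base + sum(0.1 * stump_predict(s, x) for s in stumps)
mse = np.mean((y - pred) ** 2)
```

The learning rate lr shrinks each gradient step; smaller values need more iterations but typically generalize better, which is why libraries expose it as a tuning knob.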