Gradient boosting is a popular technique for improving the performance of weak learners. The name combines two ideas:


  1. Boosting: sequentially add weak learners, each stage compensating for the shortcomings of the existing ensemble.

  2. Gradient: use the gradient of the loss to identify the shortcomings of the existing weak learners.


Algorithm

  • At iteration $i$: the model is $F_i(x)$, the response is $y$, and the residual is $r_i = y - F_i(x)$

(1) let $r_i = y - F_i(x)$

(2) fit a regression tree $h_i$ to the data $\{(x_j, r_{ij})\}$: this is the boosting concept, adding a weak learner that compensates for the shortcomings of the existing ones

(3) Gradient concept:

Loss function $L(y, F(x)) = \frac{1}{2}(y - F(x))^2$,

Cost function $J = \sum_j L(y_j, F(x_j))$; treat each $F(x_j)$ as a parameter and take derivatives:

$\frac{\partial J}{\partial F(x_j)} = F(x_j) - y_j = -r_{ij}$, which is the negative residual, i.e. the shortcoming; hence update $F$ by a step along the negative gradient:

$F_{i+1}(x_j) = F_i(x_j) - \frac{\partial J}{\partial F(x_j)} = F_i(x_j) + r_{ij}$
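The algorithm above can be sketched in a few lines. This is a minimal illustration, not a production implementation; it assumes squared loss (so the negative gradient is exactly the residual), uses scikit-learn's `DecisionTreeRegressor` as the weak learner, and adds a shrinkage factor (`learning_rate`) that the derivation above omits:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_iter=200, learning_rate=0.1, max_depth=2):
    """Gradient boosting for squared loss: each tree fits the current residuals."""
    f0 = float(y.mean())                       # initial model: constant prediction
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_iter):
        residual = y - pred                    # negative gradient of (1/2)(y - F)^2
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residual)                  # boosting step: weak learner on shortcomings
        pred += learning_rate * tree.predict(X)  # gradient step: F_{i+1} = F_i + lr * h_i
        trees.append(tree)
    return f0, trees

def predict(f0, trees, X, learning_rate=0.1):
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Fit a smooth target; the ensemble of shallow trees should drive training error down.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2
f0, trees = gradient_boost(X, y)
mse = np.mean((predict(f0, trees, X) - y) ** 2)
```

Each tree alone is a poor model (depth 2), but because every new tree is fit to what the current ensemble still gets wrong, the training error shrinks with each iteration.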
