7.5 Gradient Boosting (L07: Ensemble Methods)

Published: 27 October 2020
on channel: Sebastian Raschka
12,999 views · 221 likes

Sebastian's books: https://sebastianraschka.com/books/

In this video, we take the concept of boosting a step further and talk about gradient boosting. Whereas AdaBoost reweights the training examples to boost the trees in the next round, gradient boosting fits each new tree in the sequence to the pseudo-residuals, i.e., the negative gradients of the loss with respect to the current ensemble's predictions.
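As a rough illustration of that idea (not the code from the linked repository), the sketch below implements gradient boosting for regression under a squared-error loss, where the negative gradient reduces to the plain residual. It assumes scikit-learn's DecisionTreeRegressor as the base learner, and the dataset, tree depth, learning rate, and number of rounds are arbitrary choices for demonstration only:

```python
# Minimal gradient boosting sketch for regression with squared-error loss,
# where the negative gradient of the loss equals the plain residual.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

n_rounds, learning_rate = 100, 0.1

# Initial model: a constant prediction (the mean of the training targets)
f0 = y_train.mean()
pred_train = np.full(y_train.shape, f0)
trees = []

for _ in range(n_rounds):
    # Pseudo-residuals = negative gradient of 1/2 * (y - f)^2 w.r.t. f
    residuals = y_train - pred_train
    # Fit the next tree to the residuals of the current ensemble
    tree = DecisionTreeRegressor(max_depth=3)
    tree.fit(X_train, residuals)
    trees.append(tree)
    # Update the ensemble prediction with a shrunken step
    pred_train += learning_rate * tree.predict(X_train)

# Test-set prediction: initial constant plus the sum of shrunken tree outputs
pred_test = np.full(y_test.shape, f0)
for tree in trees:
    pred_test += learning_rate * tree.predict(X_test)

print("Test MSE:", np.mean((y_test - pred_test) ** 2))
```

With other losses (e.g., absolute error or logistic loss for classification), only the pseudo-residual computation changes; each tree is still fit to the negative gradient of the loss at the current predictions.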

XGBoost paper mentioned in the video: https://dl.acm.org/doi/pdf/10.1145/29...

Link to the code: https://github.com/rasbt/stat451-mach...

-------

This video is part of my Introduction to Machine Learning course.

Next video:    • 7.6 Random Forests (L07: Ensemble Met...  

The complete playlist:    • Intro to Machine Learning and Statist...  

A handy overview page with links to the materials: https://sebastianraschka.com/blog/202...

-------

If you want to be notified about future videos, please consider subscribing to my channel:    / sebastianraschka  
