13.3.2 Decision Trees & Random Forest Feature Importance (L13: Feature Selection)

Published: 22 December 2021
on channel: Sebastian Raschka
Views: 13,937
Likes: 241

Sebastian's books: https://sebastianraschka.com/books/

This video explains how decision tree training can be regarded as an embedded method for feature selection. Then, we also look at random forest feature importance and go over two different ways it is computed: (a) impurity-based and (b) permutation-based.
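
As a rough sketch of the two approaches (not taken from the linked slides or notebook), scikit-learn exposes impurity-based importances via a fitted forest's feature_importances_ attribute and permutation importances via sklearn.inspection.permutation_importance; the synthetic dataset below is only for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic dataset (illustrative only): 10 features, 3 of them informative
X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=3, random_state=123)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123)

forest = RandomForestClassifier(n_estimators=100, random_state=123)
forest.fit(X_train, y_train)

# (a) Impurity-based importance: average decrease in impurity a feature
#     achieves across all trees, computed on the training data during fitting
print("Impurity-based:", forest.feature_importances_)

# (b) Permutation-based importance: drop in test-set score when a feature
#     column is randomly shuffled, breaking its relationship with the label
result = permutation_importance(forest, X_test, y_test,
                                n_repeats=10, random_state=123)
print("Permutation-based:", result.importances_mean)
```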

Slide link: https://sebastianraschka.com/pdf/lect...

Code link: https://github.com/rasbt/stat451-mach...

-------

This video is part of my Introduction to Machine Learning course.

Next video:    • 13.4.1 Recursive Feature Elimination ...  

The complete playlist:    • Intro to Machine Learning and Statist...  

A handy overview page with links to the materials: https://sebastianraschka.com/blog/202...

-------

If you want to be notified about future videos, please consider subscribing to my channel:    / sebastianraschka  
