13.3.1 L1-regularized Logistic Regression as Embedded Feature Selection (L13: Feature Selection)

Published: 13 December 2021
on channel: Sebastian Raschka

Sebastian's books: https://sebastianraschka.com/books/

Without going into the nitty-gritty details of logistic regression, this lecture explains how and why we can consider an L1 penalty (a modification of the loss function) as an embedded feature selection method.
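The idea can be sketched in a few lines of scikit-learn code. This is a minimal illustration, not the lecture's own code (see the repository link below for that): the L1 penalty drives some weights exactly to zero, so the features with non-zero coefficients are the ones the model "selects". The dataset choice (Wine) and the regularization strength C=0.05 are assumptions for the sake of the example.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Load a small example dataset and standardize it, since regularization
# penalizes all weights equally and is sensitive to feature scale
X, y = load_wine(return_X_y=True)
X_std = StandardScaler().fit_transform(X)

# penalty="l1" requires a compatible solver such as "liblinear" or "saga";
# smaller C means stronger regularization and hence more zeroed-out weights
clf = LogisticRegression(penalty="l1", C=0.05, solver="liblinear")
clf.fit(X_std, y)

# A feature counts as "selected" if it has a non-zero weight
# for at least one of the classes
selected = np.flatnonzero(np.any(clf.coef_ != 0, axis=0))
print(f"{len(selected)} of {X.shape[1]} features selected:", selected)
```

Decreasing C shrinks more coefficients to exactly zero; sweeping C and tracking which coefficients survive gives a regularization path, which is one way to rank features by how long they stay in the model.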

Slides: https://sebastianraschka.com/pdf/lect...

Code: https://github.com/rasbt/stat451-mach...

Links to the logistic regression videos I referenced:
https://sebastianraschka.com/blog/202...

-------

This video is part of my Introduction to Machine Learning course.

Next video: 13.3.2 Decision Trees & Random Forest...

The complete playlist: Intro to Machine Learning and Statist...

A handy overview page with links to the materials: https://sebastianraschka.com/blog/202...

-------

If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka
