The gradient captures the partial derivatives of cost with respect to all of our machine learning model's parameters. To come to grips with it, Jon Krohn carries out a regression on individual data points and derives the partial derivatives of quadratic cost. He then gets into what it means to descend the gradient and derives the partial derivatives of mean squared error, enabling you to learn from batches of data instead of individual points. He finishes the lesson off with discussions of backpropagation and higher-order partial derivatives.
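For a concrete sense of what descending the gradient of mean squared error looks like, here is a minimal NumPy sketch of batch gradient descent on a simple linear model. The toy data, learning rate, and variable names (m, b, lr) are illustrative assumptions, not taken from the lesson itself.

```python
import numpy as np

# Toy data for a simple linear model y_hat = m*x + b (illustrative, not from the lesson)
x = np.array([0., 1., 2., 3., 4.])
y = np.array([1.8, 1.2, 4.2, 5.9, 8.1])

m, b = 0.0, 0.0   # initial parameter guesses
lr = 0.02         # learning rate (assumed)

for step in range(1000):
    y_hat = m * x + b
    error = y_hat - y
    # Mean squared error: C = (1/n) * sum((y_hat - y)^2)
    # Partial derivatives of C with respect to each parameter:
    dC_dm = 2.0 * np.mean(error * x)   # dC/dm
    dC_db = 2.0 * np.mean(error)       # dC/db
    # Descend the gradient: nudge each parameter opposite its partial derivative
    m -= lr * dC_dm
    b -= lr * dC_db

print(f"m ~ {m:.2f}, b ~ {b:.2f}")
```

Because the partial derivatives are averaged over the whole batch, each update uses all of the data points at once rather than a single point.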
This lesson is an excerpt from “Calculus for Machine Learning LiveLessons.” Purchase the entire video course at informit.com/youtube and save 50% with discount code YOUTUBE.
Also available through the O’Reilly Online Learning (Safari) subscription service.