In this video I discuss why the ReLU activation function became more popular in deep neural networks than activation functions like tanh or sigmoid: unlike those, its gradient does not saturate for large input values.
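A minimal Python sketch of the idea (not from the video, just an illustration): for large inputs the derivatives of sigmoid and tanh shrink toward zero, so the gradients saturate, while ReLU's derivative stays at 1 for any positive input.

import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)          # peaks at 0.25, vanishes for large |x|

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2  # peaks at 1.0, vanishes for large |x|

def relu_grad(x):
    return 1.0 if x > 0 else 0.0  # stays 1 for any positive input

for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  sigmoid'={sigmoid_grad(x):.6f}  "
          f"tanh'={tanh_grad(x):.6f}  relu'={relu_grad(x):.1f}")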
Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
ReLU variants explained: • ReLU Activation Function Variants Exp...
Why we need activation functions: • Why We Need Activation Functions In N...
References
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
ImageNet Classification with Deep Convolutional Neural Networks (ReLU becomes popular): https://proceedings.neurips.cc/paper/...
Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Intro
00:15 - Activation Functions
02:58 - Activation Functions Derivatives
05:36 - ReLU Became Popular
08:35 - Outro
Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic
Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)
If you'd also like to support the channel financially, donating the price of a coffee is always warmly welcome! (completely optional and voluntary)
► Patreon: / datamlistic
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a
#relu #activationfunction #saturatedgradients