ReLU Activation Function Variants Explained
http://youtube.com/watch?v=ScGmrFBmoVI
In this video we explain the various ReLU activation function variants, including: Leaky ReLU (LReLU), Parametric ReLU (PReLU), Gaussian Error Linear Unit (GELU), Sigmoid Linear Unit (SiLU), Softplus and Exponential Linear Unit (ELU).

Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Why ReLU is better than other activation functions: Why ReLU Is Better Than Other Activat...
Why activation functions are necessary in NN: Why We Need Activation Functions In N...

Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Intro
00:10 - Leaky ReLU (LReLU)
01:06 - Parametric ReLU (PReLU)
02:14 - Gaussian Error Linear Unit (GELU)
03:44 - Sigmoid Linear Unit (SiLU)
05:22 - Softplus
07:01 - Exponential Linear Unit (ELU)
08:21 - Discussion
08:52 - Outro

Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic

Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)
If you'd like to also support the channel financially, donating the price of a coffee is always warmly welcomed! (completely optional and voluntary)
► Patreon: / datamlistic
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Special thanks to Arka Mitra for the video idea!

#relu #leakyrelu #prelu #gelu #silu #softplus #elu
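As a companion to the video, the six variants listed above can be sketched in plain NumPy. This is an illustrative sketch, not the video's own code: the function and parameter names (e.g. `alpha`) are our choices, and the GELU shown uses the common tanh approximation rather than the exact Gaussian CDF.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small fixed slope alpha on the negative side
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Parametric ReLU: same form as Leaky ReLU, but alpha is a learned parameter
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # GELU, tanh approximation of x * Phi(x) (Phi = standard normal CDF)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def silu(x):
    # SiLU (a.k.a. Swish): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def softplus(x):
    # Softplus: smooth approximation of ReLU, log(1 + e^x)
    return np.log1p(np.exp(x))

def elu(x, alpha=1.0):
    # ELU: saturates smoothly toward -alpha for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

All six agree with plain ReLU for positive inputs; they differ only in how they treat negative inputs (a small linear leak, a learned leak, a smooth gate, or exponential saturation), which is what keeps gradients from dying there.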
