【SiLU】Swish activation function【Method】
1. What is the Swish activation function?
Swish is an activation function similar to ReLU. It can be described as a "smooth ReLU".
・ Swish formula

$$\mathrm{Swish}(x) = x \cdot \sigma(\beta x)$$

where $\sigma(z) = \frac{1}{1 + e^{-z}}$ is the sigmoid function and $\beta$ is a constant (or trainable) parameter. When $\beta = 1$, Swish reduces to $\mathrm{SiLU}(x) = x \cdot \sigma(x)$, the Sigmoid Linear Unit.
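As a concrete illustration, here is a minimal NumPy sketch of the formula above (the function names and the test values are my own, not from the original):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def swish(x, beta=1.0):
    # Swish(x) = x * sigmoid(beta * x); beta = 1 gives SiLU.
    return x * sigmoid(beta * x)

x = np.linspace(-5.0, 5.0, 11)
print(swish(x))             # SiLU (beta = 1)
print(swish(x, beta=10.0))  # large beta approaches ReLU
```

In practice, deep learning frameworks ship this built in, e.g. `torch.nn.SiLU` / `torch.nn.functional.silu` in PyTorch.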
Swish has achieved slightly better results than ReLU in several settings, an improvement often attributed to its smoothness.
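To make the smoothness point concrete: the derivative of Swish, $\sigma(\beta x) + \beta x \, \sigma(\beta x)(1 - \sigma(\beta x))$, is continuous everywhere, while ReLU's gradient jumps from 0 to 1 at the origin. A small sketch checking this numerically (names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def swish_grad(x, beta=1.0):
    # d/dx [x * sigmoid(beta x)]
    # = sigmoid(beta x) + beta * x * sigmoid(beta x) * (1 - sigmoid(beta x))
    s = sigmoid(beta * x)
    return s + beta * x * s * (1.0 - s)

x = np.array([-0.01, 0.0, 0.01])
print(swish_grad(x))              # ~[0.495, 0.5, 0.505]: varies smoothly through 0
print(np.where(x > 0, 1.0, 0.0))  # ReLU gradient jumps from 0 to 1 at the origin
```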