References

- SELU: Self-Normalizing Neural Networks
- GELU: Gaussian Error Linear Units (GELUs)
- ELU: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
- PReLU: Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
- Leaky ReLU: Rectifier Nonlinearities Improve Neural Network Acoustic Models
- Swish: Swish: a Self-Gated Activation Function
- Activation Functions Blog