activations

Functions

elu(…) : Apply the exponential linear unit to input. [Clevert et al., 2015].

exponential(…) : Apply the exponential activation to input.

linear(…) : Apply the linear activation to input.

relu(…) : Apply the rectified linear unit to input. [Nair & Hinton, 2010].

selu(…) : Apply the scaled exponential linear unit to input. [Klambauer et al., 2017].

sigmoid(…) : Apply the sigmoid function to input.

softmax(…) : Apply the softmax function to input.

tanh(…) : Apply the tanh function to input.
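The functions above all map values elementwise (except softmax, which normalizes over a vector). As a rough sketch of their mathematical definitions, here are minimal pure-Python reference implementations; the SELU constants are the fixed values from Klambauer et al., 2017, and the function names mirror the listing above but are otherwise illustrative, not the library's actual implementation:

```python
import math

def relu(x):
    # max(x, 0): zero for negative inputs, identity for positive
    return max(x, 0.0)

def elu(x, alpha=1.0):
    # x for x > 0, alpha * (exp(x) - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x):
    # scaled ELU with fixed alpha and scale (self-normalizing networks)
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def sigmoid(x):
    # squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # subtract the max for numerical stability; outputs sum to 1
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `softmax([1.0, 2.0, 3.0])` returns a probability distribution whose entries sum to 1, and `relu(-2.0)` returns `0.0`.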