activations

Functions

elu(…) : Apply the exponential linear unit activation to input. [Clevert et al., 2015].

exponential(…) : Apply the exponential activation to input.

get(…) : Return the activation callable for a given identifier (see the usage sketch after this list).

hard_sigmoid(…) : Apply the hard sigmoid function to input.

linear(…) : Apply the linear activation to input.

relu(…) : Apply the rectified linear unit to input. [Nair & Hinton, 2010].

selu(…) : Apply the scaled exponential linear unit to input. [Klambauer et al., 2017].

sigmoid(…) : Apply the sigmoid function to input.

softmax(…) : Apply the softmax function to input.

swish(…) : Apply the swish function to input. [Ramachandran et al., 2017].

tanh(…) : Apply the hyperbolic tangent (tanh) function to input.
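
A minimal usage sketch, assuming the TensorFlow-bundled Keras (tensorflow.keras); the tensor values and layer below are illustrative only. Activations can be applied directly to a tensor, retrieved by string identifier via get, or passed by name to a layer.

    import tensorflow as tf
    from tensorflow import keras

    x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

    # Apply an activation directly to a tensor.
    y = keras.activations.relu(x)             # -> [0., 0., 0., 1., 2.]

    # Retrieve the activation callable from its string identifier.
    softmax = keras.activations.get("softmax")
    probs = softmax(tf.reshape(x, (1, -1)))   # each row sums to 1.0

    # Most commonly, an activation is specified by name on a layer.
    layer = keras.layers.Dense(4, activation="selu")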