class GRU : Apply a multi-layer gated recurrent unit (GRU) RNN. [Cho et al., 2014].

class LSTM : Apply a multi-layer long short-term memory (LSTM) RNN. [Hochreiter & Schmidhuber, 1997].

class RNN : Apply a multi-layer Elman RNN. [Elman, 1990].
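The Elman recurrence underlying the RNN class is a tanh nonlinearity applied to a linear function of the current input and the previous hidden state. A minimal NumPy sketch of one recurrence step (names, shapes, and the driver loop are illustrative, not this API):

```python
import numpy as np

def elman_step(x_t, h_prev, W_ih, W_hh, b):
    # One Elman RNN step: h_t = tanh(W_ih @ x_t + W_hh @ h_prev + b)
    return np.tanh(W_ih @ x_t + W_hh @ h_prev + b)

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3
W_ih = rng.standard_normal((hidden_size, input_size))
W_hh = rng.standard_normal((hidden_size, hidden_size))
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)  # initial hidden state
for x_t in rng.standard_normal((5, input_size)):  # 5 time steps
    h = elman_step(x_t, h, W_ih, W_hh, b)
```

GRU and LSTM cells replace this single update with gated updates, but follow the same step-by-step recurrence over time.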


batch_norm(…) : Apply batch normalization. [Ioffe & Szegedy, 2015].
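For reference, the training-time batch normalization computation normalizes each channel by statistics taken over the batch and spatial axes, then applies a learned scale and shift. A NumPy sketch assuming NCHW layout (the layout and parameter shapes are assumptions, not this function's signature):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-channel statistics over batch and spatial axes (NCHW layout).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    # Normalize, then apply the learned affine transform.
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

The other normalization variants below (instance, layer, group) differ mainly in which axes the statistics are reduced over.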

bias_add(…) : Add the channel-wise bias to the input.

channel_norm(…) : Apply normalization to each channel of the input.

channel_shuffle(…) : Shuffle channels of the input across groups. [Zhang et al., 2017].

conv(…) : Apply the n-dimensional convolution.

conv_transpose(…) : Apply the n-dimensional transposed convolution.

conv1d(…) : Apply the 1d convolution.

conv1d_transpose(…) : Apply the 1d transposed convolution.

conv2d(…) : Apply the 2d convolution.

conv2d_transpose(…) : Apply the 2d transposed convolution.

conv3d(…) : Apply the 3d convolution.

conv3d_transpose(…) : Apply the 3d transposed convolution.

depthwise_conv2d(…) : Apply the 2d depthwise convolution.

depth_to_space(…) : Rearrange depth data into spatial blocks.

dropout(…) : Randomly set elements of the input to zero. [Srivastava et al., 2014].
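The standard "inverted" dropout formulation scales the surviving elements by 1/(1-rate) at training time, so the expected value of each element is unchanged and no rescaling is needed at inference. A minimal NumPy sketch (the `rate` and `rng` parameter names are illustrative):

```python
import numpy as np

def dropout(x, rate, rng):
    # Keep each element with probability (1 - rate); scale survivors
    # by 1 / (1 - rate) so the expectation of the output matches x.
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)
```

drop_block and drop_path apply the same idea at coarser granularity, zeroing contiguous blocks or whole examples instead of individual elements.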

drop_block(…) : Randomly set contiguous blocks of the input to zero. [Ghiasi et al., 2018].

drop_path(…) : Randomly set entire examples of the input to zero. [Larsson et al., 2016].

elu(…) : Apply the exponential linear unit. [Clevert et al., 2015].

gelu(…) : Apply the Gaussian error linear unit. [Hendrycks & Gimpel, 2016].

group_norm(…) : Apply group normalization. [Wu & He, 2018].

hardsigmoid(…) : Apply the hard sigmoid function.

hardswish(…) : Apply the hard swish function. [Howard et al., 2019].

instance_norm(…) : Apply instance normalization. [Ulyanov et al., 2016].

layer_norm(…) : Apply layer normalization. [Ba et al., 2016].

leaky_relu(…) : Apply the leaky rectified linear unit.

local_response_norm(…) : Apply local response normalization. [Krizhevsky et al., 2012].

log_softmax(…) : Compute the logarithm of the softmax.
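Computing the composite directly is more numerically stable than taking log(softmax(x)): subtracting the per-axis maximum keeps the exponentials from overflowing. A NumPy sketch of the usual max-shift trick:

```python
import numpy as np

def log_softmax(x, axis=-1):
    # log_softmax(x) = (x - m) - log(sum(exp(x - m))), with m = max(x),
    # which avoids overflow in exp for large inputs.
    shifted = x - x.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))
```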

lp_norm(…) : Apply Lp normalization.

moments(…) : Compute the mean and variance of the input along the given axis.

prelu(…) : Apply the parametric rectified linear unit. [He et al., 2015].

pool1d(…) : Apply the 1d pooling.

pool2d(…) : Apply the 2d pooling.

pool3d(…) : Apply the 3d pooling.

relu(…) : Apply the rectified linear unit. [Nair & Hinton, 2010].

relu6(…) : Apply the clipped-6 rectified linear unit. [Krizhevsky, 2010].

selu(…) : Apply the scaled exponential linear unit. [Klambauer et al., 2017].

silu(…) : Apply the sigmoid linear unit. [Hendrycks & Gimpel, 2016].

softmax(…) : Compute the softmax function.

space_to_depth(…) : Rearrange blocks of spatial data into depth.
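space_to_depth and depth_to_space are mutually inverse layout rearrangements: one folds each block_size x block_size spatial tile into the channel dimension, the other unfolds it back. Assuming NCHW layout, both can be sketched with plain reshapes and transposes (a sketch of the standard rearrangement; this library's exact channel ordering may differ):

```python
import numpy as np

def space_to_depth(x, block_size):
    # Fold each (b x b) spatial tile into channels: (N, C, H, W) -> (N, C*b*b, H//b, W//b).
    n, c, h, w = x.shape
    b = block_size
    x = x.reshape(n, c, h // b, b, w // b, b)
    x = x.transpose(0, 3, 5, 1, 2, 4)  # (N, b_h, b_w, C, H//b, W//b)
    return x.reshape(n, c * b * b, h // b, w // b)

def depth_to_space(x, block_size):
    # Inverse: unfold channels back into spatial tiles: (N, C, H, W) -> (N, C//(b*b), H*b, W*b).
    n, c, h, w = x.shape
    b = block_size
    x = x.reshape(n, b, b, c // (b * b), h, w)
    x = x.transpose(0, 3, 4, 1, 5, 2)  # (N, C//(b*b), H, b_h, W, b_w)
    return x.reshape(n, c // (b * b), h * b, w * b)
```

With matching block sizes, depth_to_space(space_to_depth(x, b), b) recovers x exactly.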

sync_batch_norm(…) : Apply batch normalization with statistics synchronized across devices. [Ioffe & Szegedy, 2015].