dragon.nn

Classes

class GRU : Apply a multi-layer gated recurrent unit (GRU) RNN. [Cho et al., 2014].

class LSTM : Apply a multi-layer long short-term memory (LSTM) RNN. [Hochreiter & Schmidhuber, 1997].

class RNN : Apply a multi-layer Elman RNN. [Elman, 1990].
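
The three recurrent classes differ only in their cell; the Elman cell is the simplest. Below is a minimal NumPy sketch of one Elman step with illustrative names (W_ih, W_hh, b_ih, b_hh are assumptions, not the dragon.nn parameter names):

    import numpy as np

    def elman_step(x, h, W_ih, W_hh, b_ih, b_hh):
        # h_t = tanh(x @ W_ih^T + b_ih + h_{t-1} @ W_hh^T + b_hh)
        return np.tanh(x @ W_ih.T + b_ih + h @ W_hh.T + b_hh)

    rng = np.random.default_rng(0)
    x = rng.standard_normal((2, 4))     # (batch, input_size)
    h = np.zeros((2, 8))                # (batch, hidden_size)
    W_ih = rng.standard_normal((8, 4))
    W_hh = rng.standard_normal((8, 8))
    h = elman_step(x, h, W_ih, W_hh, np.zeros(8), np.zeros(8))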

Functions

batch_norm(…) : Apply the batch normalization. [Ioffe & Szegedy, 2015].
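
What batch normalization computes at training time, sketched in NumPy for an NCHW tensor (gamma, beta, and eps are illustrative names, not the dragon.nn signature); at inference the saved running statistics replace the batch moments:

    import numpy as np

    def batch_norm_train(x, gamma, beta, eps=1e-5):
        # x: (N, C, H, W); gamma, beta: (1, C, 1, 1)
        mean = x.mean(axis=(0, 2, 3), keepdims=True)
        var = x.var(axis=(0, 2, 3), keepdims=True)
        return gamma * (x - mean) / np.sqrt(var + eps) + beta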

bias_add(…) : Add the bias across channels to the input.
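
In NCHW layout this is a broadcast add over N, H, and W; a NumPy sketch (the real op also handles other data formats):

    import numpy as np

    x = np.zeros((2, 3, 4, 4))           # (N, C, H, W)
    b = np.array([0.1, 0.2, 0.3])        # one bias per channel
    y = x + b.reshape(1, -1, 1, 1)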

conv2d(…) : Apply the 2d convolution.
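
A naive single-image sketch of the arithmetic (stride 1, no padding; real kernels add stride, padding, dilation, and groups):

    import numpy as np

    def conv2d_naive(x, w):
        # x: (C_in, H, W); w: (C_out, C_in, KH, KW)
        C_out, C_in, KH, KW = w.shape
        H, W = x.shape[1] - KH + 1, x.shape[2] - KW + 1
        y = np.zeros((C_out, H, W))
        for o in range(C_out):
            for i in range(H):
                for j in range(W):
                    y[o, i, j] = np.sum(x[:, i:i + KH, j:j + KW] * w[o])
        return y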

conv2d_transpose(…) : Apply the 2d deconvolution.
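
Deconvolution here means transposed convolution: each input element scatters a scaled copy of the kernel into the output. A naive NumPy sketch (no padding or output cropping):

    import numpy as np

    def conv2d_transpose_naive(x, w, stride=1):
        # x: (C_in, H, W); w: (C_in, C_out, KH, KW)
        C_in, C_out, KH, KW = w.shape
        H, W = x.shape[1], x.shape[2]
        y = np.zeros((C_out, (H - 1) * stride + KH, (W - 1) * stride + KW))
        for c in range(C_in):
            for i in range(H):
                for j in range(W):
                    y[:, i * stride:i * stride + KH,
                         j * stride:j * stride + KW] += x[c, i, j] * w[c]
        return y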

depthwise_conv2d(…) : Apply the 2d depthwise convolution.
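
Unlike conv2d, the depthwise variant convolves each input channel with its own single filter and never mixes channels; a NumPy sketch:

    import numpy as np

    def depthwise_conv2d_naive(x, w):
        # x: (C, H, W); w: (C, KH, KW)
        C, KH, KW = w.shape
        H, W = x.shape[1] - KH + 1, x.shape[2] - KW + 1
        y = np.zeros((C, H, W))
        for c in range(C):
            for i in range(H):
                for j in range(W):
                    y[c, i, j] = np.sum(x[c, i:i + KH, j:j + KW] * w[c])
        return y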

depth_to_space(…) : Rearrange depth data into spatial blocks.
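
With block size r this maps (N, C*r*r, H, W) to (N, C, H*r, W*r); a NumPy sketch via reshape and transpose (DCR element ordering assumed). Reversing the steps gives space_to_depth:

    import numpy as np

    def depth_to_space(x, r):
        N, C, H, W = x.shape
        x = x.reshape(N, r, r, C // (r * r), H, W)
        x = x.transpose(0, 3, 4, 1, 5, 2)     # N, C', H, r, W, r
        return x.reshape(N, C // (r * r), H * r, W * r)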

dropout(…) : Set the elements of the input to zero randomly. [Srivastava et al., 2014].
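
The usual inverted-dropout formulation: zero each element with probability p and rescale the survivors so the expected value is unchanged (NumPy sketch):

    import numpy as np

    def dropout_train(x, p=0.5, seed=None):
        mask = np.random.default_rng(seed).random(x.shape) >= p
        return x * mask / (1.0 - p)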

drop_block2d(…) : Set spatial blocks of the input to zero randomly. [Ghiasi et al., 2018].

drop_path(…) : Set whole examples of the input to zero randomly. [Larsson et al., 2016].
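
Where dropout masks single elements and drop_block2d masks contiguous spatial patches, drop_path masks whole examples: one Bernoulli draw per sample, broadcast over the remaining axes (NumPy sketch):

    import numpy as np

    def drop_path_train(x, p=0.2, seed=None):
        shape = (x.shape[0],) + (1,) * (x.ndim - 1)
        keep = np.random.default_rng(seed).random(shape) >= p
        return x * keep / (1.0 - p)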

elu(…) : Apply the exponential linear unit. [Clevert et al., 2015].
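
ELU is the identity for positive inputs and alpha * (exp(x) - 1) otherwise (NumPy sketch; the paper's default is alpha = 1):

    import numpy as np

    def elu(x, alpha=1.0):
        # minimum() guards the unused branch against overflow in exp()
        return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))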

fully_connected(…) : Compute the dense matrix multiplication along the given axes.
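
For 2-D input with the common (out_features, in_features) weight layout, the op reduces to a matmul plus optional bias; higher-rank inputs are flattened along the given axes first. A NumPy sketch of the 2-D case:

    import numpy as np

    def fully_connected(x, W, b=None):
        # x: (N, in_features); W: (out_features, in_features)
        y = x @ W.T
        return y if b is None else y + b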

group_norm(…) : Apply the group normalization. [Wu & He, 2018].

instance_norm(…) : Apply the instance normalization. [Ulyanov et al., 2016].

layer_norm(…) : Apply the layer normalization. [Ba et al., 2016].
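
The three preceding normalizations share one computation over different axes: group_norm normalizes channel groups per example, instance_norm is the groups = C case, and groups = 1 normalizes over all of (C, H, W) as layer_norm does for NCHW input. A NumPy sketch of the general form:

    import numpy as np

    def group_norm(x, gamma, beta, groups, eps=1e-5):
        # x: (N, C, H, W); gamma, beta: (1, C, 1, 1)
        N, C, H, W = x.shape
        g = x.reshape(N, groups, C // groups, H, W)
        mean = g.mean(axis=(2, 3, 4), keepdims=True)
        var = g.var(axis=(2, 3, 4), keepdims=True)
        g = (g - mean) / np.sqrt(var + eps)
        return g.reshape(N, C, H, W) * gamma + beta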

leaky_relu(…) : Apply the leaky rectified linear unit.

local_response_norm(…) : Apply the local response normalization. [Krizhevsky et al., 2012].
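
The AlexNet scheme divides each activation by a power of the summed squares of its channel neighbors. A NumPy sketch; note that frameworks disagree on whether alpha is pre-divided by the window size, so the constants here are one common convention, not necessarily dragon's:

    import numpy as np

    def local_response_norm(x, size=5, alpha=1e-4, beta=0.75, k=2.0):
        # x: (N, C, H, W); normalize each channel by its neighborhood
        N, C, H, W = x.shape
        sq = x ** 2
        y = np.empty_like(x)
        for c in range(C):
            lo, hi = max(0, c - size // 2), min(C, c + size // 2 + 1)
            denom = (k + alpha / size * sq[:, lo:hi].sum(axis=1)) ** beta
            y[:, c] = x[:, c] / denom
        return y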

log_softmax(…) : Apply the composition of logarithm and softmax.
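
Computing log(softmax(x)) directly is numerically fragile; the stable route subtracts the row max and uses log-sum-exp. A NumPy sketch that also covers the plain softmax listed below:

    import numpy as np

    def softmax(x, axis=-1):
        z = x - x.max(axis=axis, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=axis, keepdims=True)

    def log_softmax(x, axis=-1):
        z = x - x.max(axis=axis, keepdims=True)
        return z - np.log(np.exp(z).sum(axis=axis, keepdims=True))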

prelu(…) : Apply the parametric rectified linear unit. [He et al., 2015].

pool2d(…) : Apply the 2d pooling.
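
A naive max-pooling sketch for a square window with matching stride (the real op also offers average mode, padding, and explicit strides):

    import numpy as np

    def max_pool2d(x, k):
        # x: (C, H, W); window k, stride k
        C, H, W = x.shape
        x = x[:, :H // k * k, :W // k * k]
        x = x.reshape(C, H // k, k, W // k, k)
        return x.max(axis=(2, 4))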

relu(…) : Apply the rectified linear unit. [Nair & Hinton, 2010].

relu6(…) : Apply the clipped-6 rectified linear unit. [Krizhevsky, 2010].
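
The rectifier family in one place (NumPy sketch): relu clips below zero, relu6 also clips above six, leaky_relu keeps a small fixed negative slope, and prelu above is the same form with the slope learned per channel; the slope value here is illustrative:

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def relu6(x):
        return np.clip(x, 0.0, 6.0)

    def leaky_relu(x, slope=0.2):
        return np.where(x > 0, x, slope * x)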

selu(…) : Apply the scaled exponential linear unit. [Klambauer et al., 2017].
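
SELU is an ELU scaled by two fixed constants derived in the paper (rounded here to seven digits); a NumPy sketch:

    import numpy as np

    def selu(x, alpha=1.6732632, scale=1.0507010):
        neg = alpha * (np.exp(np.minimum(x, 0.0)) - 1.0)
        return scale * np.where(x > 0, x, neg)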

softmax(…) : Apply the softmax function.

space_to_depth(…) : Rearrange blocks of spatial data into depth.

sync_batch_norm(…) : Apply the batch normalization with synced statistics. [Ioffe & Szegedy, 2015].
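
Synced means the per-device batch moments are reduced into global ones before normalizing, so every replica sees the same statistics. A NumPy sketch that simulates two devices (the real op performs this reduction across a process group):

    import numpy as np

    shards = [np.random.default_rng(i).standard_normal((4, 3)) for i in range(2)]
    counts = np.array([s.shape[0] for s in shards], dtype=float)
    means = np.array([s.mean(axis=0) for s in shards])
    sqs = np.array([(s ** 2).mean(axis=0) for s in shards])

    w = counts[:, None] / counts.sum()
    mean = (w * means).sum(axis=0)              # global mean
    var = (w * sqs).sum(axis=0) - mean ** 2     # global variance
    # every shard is normalized with the shared statistics
    outs = [(s - mean) / np.sqrt(var + 1e-5) for s in shards]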