layers

Classes

class Activation : Layer to apply an activation function.

class Add : Layer to add a sequence of inputs.

class AveragePooling1D : 1D average pooling layer.

class AveragePooling2D : 2D average pooling layer.

class AveragePooling3D : 3D average pooling layer.
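
A minimal NumPy sketch of what the average pooling layers compute, shown in 1D with non-overlapping windows (window equals stride, no padding). This illustrates the operation only; the argument names are assumptions, not this library's API:

    import numpy as np

    def avg_pool_1d(x, pool_size=2):
        # x: (batch, length, channels); stride == pool_size, remainder dropped.
        batch, length, channels = x.shape
        out_len = length // pool_size
        windows = x[:, :out_len * pool_size, :].reshape(batch, out_len, pool_size, channels)
        return windows.mean(axis=2)

    x = np.arange(12, dtype=np.float32).reshape(1, 6, 2)
    print(avg_pool_1d(x).shape)  # (1, 3, 2)

The 2D and 3D variants (and the MaxPool* family below) differ only in how many axes the window spans and in the reduction applied.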

class BatchNormalization : Batch normalization layer. [Ioffe & Szegedy, 2015].
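
Batch normalization standardizes each feature using mean and variance computed across the batch, then applies a learned scale and shift. A NumPy sketch of the training-time transform (gamma, beta, and eps are assumed parameter names):

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # x: (batch, features); statistics are taken over the batch axis.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + eps)
        return gamma * x_hat + beta

    x = np.random.randn(8, 4)
    y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
    print(y.mean(axis=0).round(6))  # each feature is now ~zero-mean

At inference, running averages of the mean and variance collected during training are used instead of per-batch statistics.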

class Concatenate : Layer to concatenate a sequence of inputs.

class Conv1D : 1D convolution layer.

class Conv1DTranspose : 1D transposed convolution (deconvolution) layer.

class Conv2D : 2D convolution layer.

class Conv2DTranspose : 2D transposed convolution (deconvolution) layer.

class Conv3D : 3D convolution layer.

class Conv3DTranspose : 3D transposed convolution (deconvolution) layer.
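
Every Conv*D layer slides learned filters over its input, while the *Transpose variants invert the striding to increase resolution. A NumPy sketch of an unpadded, stride-1 1D convolution; the channels-last shape convention here is an assumption:

    import numpy as np

    def conv1d(x, w, b):
        # x: (length, in_ch); w: (kernel, in_ch, out_ch); b: (out_ch,)
        kernel = w.shape[0]
        out_len = x.shape[0] - kernel + 1
        out = np.empty((out_len, w.shape[2]), dtype=x.dtype)
        for i in range(out_len):
            # Each output step is a dot product of one window with every filter.
            out[i] = np.tensordot(x[i:i + kernel], w, axes=([0, 1], [0, 1])) + b
        return out

    x = np.random.randn(10, 3)
    w = np.random.randn(3, 3, 4)
    y = conv1d(x, w, np.zeros(4))
    print(y.shape)  # (8, 4): length shrinks by kernel - 1 without padding

DepthwiseConv2D follows the same pattern but applies one filter per input channel instead of mixing channels.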

class Dense : Fully-connected layer.
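
Dense applies an affine transform to the last axis: y = xW + b. The whole layer is one matrix product, as this NumPy sketch shows (weight shape conventions assumed):

    import numpy as np

    def dense(x, W, b):
        # x: (batch, in_features); W: (in_features, out_features); b: (out_features,)
        return x @ W + b

    x = np.random.randn(2, 5)
    y = dense(x, np.random.randn(5, 3), np.zeros(3))
    print(y.shape)  # (2, 3)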

class DepthwiseConv2D : 2D depthwise convolution layer. [Chollet, 2016].

class Dropout : Layer to apply the dropout function. [Srivastava et al., 2014].
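
During training, dropout zeroes each activation independently with probability rate and scales the survivors by 1/(1 - rate) so the expected value is unchanged; at inference it is the identity. A sketch of this "inverted dropout" formulation (parameter names are assumptions):

    import numpy as np

    def dropout(x, rate=0.5, training=True, seed=0):
        if not training or rate == 0.0:
            return x
        rng = np.random.default_rng(seed)
        keep = rng.random(x.shape) >= rate        # Bernoulli keep mask
        return np.where(keep, x / (1.0 - rate), 0.0)

    x = np.ones((2, 4))
    print(dropout(x, rate=0.5))  # surviving entries are scaled to 2.0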

class ELU : Layer to apply the exponential linear unit. [Clevert et al., 2015].

class Flatten : Layer to flatten the input, collapsing all axes except the batch axis.
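
In other words, Flatten maps (batch, d1, d2, ...) to (batch, d1*d2*...). The NumPy equivalent:

    import numpy as np

    x = np.zeros((32, 4, 4, 3))
    flat = x.reshape(x.shape[0], -1)  # keep the batch axis, merge the rest
    print(flat.shape)                 # (32, 48)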

class GlobalAveragePooling1D : 1D global average pooling layer.

class GlobalAveragePooling2D : 2D global average pooling layer.

class GlobalAveragePooling3D : 3D global average pooling layer.

class GlobalMaxPool1D : 1D global max pooling layer.

class GlobalMaxPool2D : 2D global max pooling layer.

class GlobalMaxPool3D : 3D global max pooling layer.
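
Global pooling reduces every spatial axis in a single step, leaving one value per channel; the average and max variants differ only in the reduction. In NumPy, with a channels-last layout (an assumption here):

    import numpy as np

    x = np.random.randn(8, 16, 16, 32)  # (batch, h, w, channels)
    gap = x.mean(axis=(1, 2))           # GlobalAveragePooling2D -> (8, 32)
    gmp = x.max(axis=(1, 2))            # GlobalMaxPool2D        -> (8, 32)
    print(gap.shape, gmp.shape)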

class Layer : The base class of layers.

class LayerNormalization : Layer normalization layer. [Ba et al., 2016].
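
Unlike batch normalization, layer normalization computes its statistics per example over the feature axis, so it is independent of batch size and behaves the same at training and inference. A NumPy sketch (parameter names assumed):

    import numpy as np

    def layer_norm(x, gamma, beta, eps=1e-5):
        # x: (batch, features); one mean/variance pair per example.
        mean = x.mean(axis=-1, keepdims=True)
        var = x.var(axis=-1, keepdims=True)
        return gamma * (x - mean) / np.sqrt(var + eps) + beta

    x = np.random.randn(4, 8)
    y = layer_norm(x, gamma=np.ones(8), beta=np.zeros(8))
    print(y.mean(axis=-1).round(6))  # each row is now ~zero-mean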

class LeakyReLU : Layer to apply the leaky rectified linear unit.

class Maximum : Layer to compute the maximum of a sequence of inputs.

class MaxPool1D : 1D max pooling layer.

class MaxPool2D : 2D max pooling layer.

class MaxPool3D : 3D max pooling layer.

class Minimum : Layer to compute the minimum of a sequence of inputs.

class Multiply : Layer to multiply a sequence of inputs.

class Permute : Layer to permute the dimensions of the input.

class ReLU : Layer to apply the rectified linear unit. [Nair & Hinton, 2010].

class Reshape : Layer to reshape the input to a given target shape.
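
Permute and Reshape only rearrange axes; no values change. Their NumPy analogues (the convention of leaving the batch axis untouched is an assumption):

    import numpy as np

    x = np.zeros((32, 10, 64))             # (batch, steps, features)
    permuted = np.transpose(x, (0, 2, 1))  # Permute -> (32, 64, 10)
    reshaped = x.reshape(32, 5, 2, 64)     # Reshape -> (32, 5, 2, 64)
    print(permuted.shape, reshaped.shape)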

class SELU : Layer to apply the scaled exponential linear unit. [Klambauer et al., 2017].

class Softmax : Layer to apply the softmax function.
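
The activation layers above (ReLU, LeakyReLU, ELU, SELU, Softmax) apply fixed elementwise functions; softmax alone normalizes across an axis so its outputs sum to one. Their definitions in NumPy (the LeakyReLU slope default is an assumption; libraries vary):

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def leaky_relu(x, alpha=0.3):
        return np.where(x >= 0, x, alpha * x)

    def elu(x, alpha=1.0):
        return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

    def selu(x):
        # Fixed-point constants from Klambauer et al., 2017.
        scale, alpha = 1.0507009873554805, 1.6732632423543772
        return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

    def softmax(x, axis=-1):
        z = x - x.max(axis=axis, keepdims=True)  # shift for numerical stability
        e = np.exp(z)
        return e / e.sum(axis=axis, keepdims=True)

    print(softmax(np.array([1.0, 2.0, 3.0])).sum())  # 1.0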

class Subtract : Layer to subtract two inputs.
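
The merge layers (Add, Maximum, Minimum, Multiply, Subtract) combine same-shaped inputs elementwise, while Concatenate joins them along an axis. In NumPy terms:

    import numpy as np

    a = np.ones((2, 3))
    b = np.full((2, 3), 2.0)

    added = a + b                          # Add
    diff = a - b                           # Subtract (first minus second)
    prod = a * b                           # Multiply
    mx = np.maximum(a, b)                  # Maximum
    mn = np.minimum(a, b)                  # Minimum
    cat = np.concatenate([a, b], axis=-1)  # Concatenate -> (2, 6)
    print(added.shape, cat.shape)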

class UpSampling1D : 1D upsampling layer.

class UpSampling2D : 2D upsampling layer.

class UpSampling3D : 3D upsampling layer.
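
Upsampling repeats entries along each spatial axis by an integer factor (nearest-neighbor). The NumPy equivalent for the 2D case (channels-last layout assumed):

    import numpy as np

    x = np.arange(4.0).reshape(1, 2, 2, 1)       # (batch, h, w, channels)
    up = x.repeat(2, axis=1).repeat(2, axis=2)   # UpSampling2D by a factor of 2
    print(up.shape)                              # (1, 4, 4, 1)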

class ZeroPadding1D : 1D zero padding layer.

class ZeroPadding2D : 2D zero padding layer.

class ZeroPadding3D : 3D zero padding layer.
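
Zero padding adds zeros around the borders of the spatial axes, typically to control the output size of a following convolution. The NumPy equivalent for the 2D case (channels-last layout assumed):

    import numpy as np

    x = np.ones((1, 4, 4, 3))                             # (batch, h, w, channels)
    padded = np.pad(x, ((0, 0), (1, 1), (1, 1), (0, 0)))  # one zero pixel per side
    print(padded.shape)                                   # (1, 6, 6, 3)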