layers

Classes

class Add : The layer to add a sequence of inputs.

class AveragePooling2D : The 2D average pooling layer.

class BatchNormalization : The batch normalization layer. [Ioffe & Szegedy, 2015].
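
To make the operation concrete, here is a minimal NumPy sketch of the training-time batch normalization computation (per-feature normalization followed by a learned scale and shift). The `batch_norm` helper is illustrative only, not part of this module; the actual layer also tracks running statistics for use at inference time.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then apply a learned scale and shift."""
    mean = x.mean(axis=0)                      # per-feature batch mean
    var = x.var(axis=0)                        # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)    # normalized activations
    return gamma * x_hat + beta                # learned scale (gamma) and shift (beta)

x = np.random.randn(8, 4)                      # batch of 8 samples, 4 features
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0), y.std(axis=0))           # approximately 0 and 1 per feature
```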

class Concatenate : The layer to concatenate a sequence of inputs.

class Conv2D : The 2D convolution layer.
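
As a rough illustration of what the convolution layer computes (not this module's API), the sketch below slides a bank of filters over a single input with valid padding. The `conv2d` helper and its layout assumptions (channels-last, one example, no padding) are hypothetical.

```python
import numpy as np

def conv2d(x, w, stride=1):
    """Naive valid-padding 2D convolution.

    x: (H, W, C_in) input; w: (kH, kW, C_in, C_out) filters.
    """
    kh, kw, c_in, c_out = w.shape
    h_out = (x.shape[0] - kh) // stride + 1
    w_out = (x.shape[1] - kw) // stride + 1
    y = np.zeros((h_out, w_out, c_out))
    for i in range(h_out):
        for j in range(w_out):
            patch = x[i * stride:i * stride + kh, j * stride:j * stride + kw, :]
            # Each output channel is the sum of an elementwise product
            # between the input patch and the corresponding filter.
            y[i, j] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2]))
    return y

x = np.random.randn(8, 8, 3)                   # one 8x8 input with 3 channels
w = np.random.randn(3, 3, 3, 16)               # 16 filters of size 3x3
print(conv2d(x, w).shape)                      # (6, 6, 16)
```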

class Conv2DTranspose : The 2D transposed convolution (deconvolution) layer.

class Dense : The fully-connected layer.
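
The fully-connected computation is an affine map optionally followed by an activation, as in this minimal NumPy sketch; the `dense` helper is illustrative, not this module's API.

```python
import numpy as np

def dense(x, w, b, activation=None):
    """Fully-connected layer: y = activation(x @ w + b)."""
    y = x @ w + b
    return activation(y) if activation is not None else y

x = np.random.randn(8, 32)                     # batch of 8, 32 input features
w = np.random.randn(32, 64) * 0.01             # weights: (in_features, out_features)
b = np.zeros(64)                               # bias vector
print(dense(x, w, b).shape)                    # (8, 64)
```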

class DepthwiseConv2D : The 2D depthwise convolution layer. [Chollet, 2016].

class Dropout : The dropout layer. [Srivastava et al., 2014].
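
A minimal sketch of inverted dropout as it is commonly implemented: zero a fraction of activations during training and rescale the survivors so the expected value is unchanged. The `dropout` helper below is illustrative; the actual layer may differ in details such as seeding.

```python
import numpy as np

def dropout(x, rate=0.5, training=True, seed=0):
    """Zero entries with probability `rate`, rescale the rest by 1 / (1 - rate)."""
    if not training or rate == 0.0:
        return x                               # identity at inference time
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) >= rate         # keep mask
    return x * mask / (1.0 - rate)

x = np.ones((2, 6))
print(dropout(x, rate=0.5))                    # roughly half the entries are 0, the rest are 2.0
```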

class ELU : The layer to apply the exponential linear unit. [Clevert et al., 2015].

class Flatten : The layer to reshape the input into a matrix.

class GlobalAveragePooling2D : The 2D global average pooling layer.

class GlobalMaxPool2D : The 2D global max pooling layer.

class Layer : The base class of layers.

class LeakyReLU : The layer to apply the leaky rectified linear unit.

class Maximum : The layer to compute the maximum of a sequence of inputs.

class MaxPool2D : The 2D max pooling layer.
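
For reference, a naive NumPy sketch of 2D max pooling over non-overlapping windows; the `max_pool2d` helper, channels-last layout, and 2x2 window are assumptions for illustration, not this module's API.

```python
import numpy as np

def max_pool2d(x, pool=2, stride=2):
    """Naive 2D max pooling; x: (H, W, C) -> (H_out, W_out, C)."""
    h_out = (x.shape[0] - pool) // stride + 1
    w_out = (x.shape[1] - pool) // stride + 1
    y = np.zeros((h_out, w_out, x.shape[2]))
    for i in range(h_out):
        for j in range(w_out):
            window = x[i * stride:i * stride + pool, j * stride:j * stride + pool, :]
            y[i, j] = window.max(axis=(0, 1))  # channel-wise maximum of the window
    return y

x = np.random.randn(8, 8, 3)
print(max_pool2d(x).shape)                     # (4, 4, 3)
```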

class Minimum : The layer to compute the minimum of a sequence of inputs.

class Multiply : The layer to multiply a sequence of inputs.

class Permute : The layer to permute the dimensions of the input.

class Reshape : The layer to change the shape of the input.

class ReLU : The layer to apply the rectified linear unit. [Nair & Hinton, 2010].

class SELU : The layer to apply the scaled exponential linear unit. [Klambauer et al., 2017].

class Softmax : The layer to apply the softmax function.
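
The softmax computation, shown as a numerically stable NumPy sketch (subtract the row maximum before exponentiating); the `softmax` helper is illustrative, not this module's API.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along `axis`."""
    z = x - x.max(axis=axis, keepdims=True)    # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True) # normalize to probabilities

logits = np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]])
print(softmax(logits))                         # each row sums to 1
```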

class Subtract : The layer to subtract two inputs.