RNN

class dragon.nn.RNN(
  input_size,
  hidden_size,
  nonlinearity='relu',
  num_layers=1,
  bidirectional=False,
  dropout=0,
  **kwargs
)[source]

Apply a multi-layer Elman RNN. [Elman, 1990].
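Each layer computes the Elman recurrence \(h_t = \sigma(W_{ih} x_t + b_{ih} + W_{hh} h_{t-1} + b_{hh})\), where \(\sigma\) is the selected nonlinearity.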

The data format of inputs should be \((T, N, C)\), i.e., (time steps, batch size, input size):

import dragon

t, n, c = 8, 2, 4
m = dragon.nn.RNN(c, 16)  # input_size matches the trailing dimension
x = dragon.ones([t, n, c], 'float32')
y = m(x)  # shape: (t, n, 16)

__init__

RNN.__init__(
  input_size,
  hidden_size,
  nonlinearity='relu',
  num_layers=1,
  bidirectional=False,
  dropout=0,
  **kwargs
)[source]

Create an RNN module.

Parameters:
  • input_size (int) The dimension of the input.
  • hidden_size (int) The dimension of the hidden state.
  • nonlinearity ({'tanh', 'relu'}, optional) The nonlinearity.
  • num_layers (int, optional, default=1) The number of recurrent layers.
  • bidirectional (bool, optional, default=False) Whether to create a bidirectional RNN (see the example after this list).
  • dropout (number, optional, default=0) The dropout ratio.
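
For example, these options combine as follows (a minimal sketch: dragon.ones is used here only to build a dummy input, and the sizes are arbitrary):

import dragon

t, n, c = 8, 2, 4
# A 2-layer bidirectional Elman RNN with tanh activation and dropout.
m = dragon.nn.RNN(
    input_size=c,
    hidden_size=16,
    nonlinearity='tanh',
    num_layers=2,
    bidirectional=True,
    dropout=0.5,
)
x = dragon.ones([t, n, c], 'float32')
y = m(x)

For a bidirectional RNN, the two directions are conventionally concatenated along the channel axis, so \(y\) has shape \((T, N, 2 \times 16)\) here.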