LSTM

class dragon.vm.torch.nn.LSTM(
  input_size,
  hidden_size,
  num_layers=1,
  bias=True,
  batch_first=False,
  dropout=0,
  bidirectional=False
)[source]

Apply a multi-layer long short-term memory (LSTM) RNN. [Hochreiter & Schmidhuber, 1997].

Examples:

import dragon.vm.torch as torch

m = torch.nn.LSTM(32, 64)
x = torch.ones(8, 16, 32)  # [T, N, C] with C equal to input_size
outputs, hidden = m(x)
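The recurrence applied at each time step can be sketched directly. Below is a minimal single-step LSTM cell in NumPy, assuming the standard gate equations and PyTorch-style weights with the four gates stacked along the first axis; the weight names are illustrative, not the module's actual attributes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W_ih, W_hh, b_ih, b_hh):
    # Input (i), forget (f), cell candidate (g), and output (o) gates,
    # stacked along the first axis of the weight matrices.
    gates = W_ih @ x + b_ih + W_hh @ h + b_hh
    i, f, g, o = np.split(gates, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c_new = f * c + i * g           # update the cell state
    h_new = o * np.tanh(c_new)      # emit the new hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
input_size, hidden_size = 32, 64
x = rng.standard_normal(input_size)
h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
W_ih = 0.1 * rng.standard_normal((4 * hidden_size, input_size))
W_hh = 0.1 * rng.standard_normal((4 * hidden_size, hidden_size))
b_ih = np.zeros(4 * hidden_size)
b_hh = np.zeros(4 * hidden_size)
h, c = lstm_step(x, h, c, W_ih, W_hh, b_ih, b_hh)
print(h.shape, c.shape)  # (64,) (64,)
```

A multi-layer LSTM repeats this step over the sequence, feeding each layer's hidden states as the next layer's inputs.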

__init__

LSTM.__init__(
  input_size,
  hidden_size,
  num_layers=1,
  bias=True,
  batch_first=False,
  dropout=0,
  bidirectional=False
)[source]

Create an LSTM module.

input_size : int
The dimension of input.
hidden_size : int
The dimension of hidden state.
num_layers : int, optional, default=1
The number of recurrent layers.
bias : bool, optional, default=True
True to use bias.
batch_first : bool, optional, default=False
True to use the order [N, T, C], otherwise [T, N, C].
dropout : number, optional, default=0
The dropout ratio applied to the outputs of each layer except the last.
bidirectional : bool, optional, default=False
Whether to create a bidirectional LSTM.
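The batch_first and bidirectional options together determine the output layout: a bidirectional LSTM concatenates the two directions, doubling the feature dimension. A small hypothetical helper (not part of the API) makes the expected output shape explicit:

```python
def lstm_output_shape(seq_len, batch, hidden_size,
                      batch_first=False, bidirectional=False):
    """Illustrative only: expected shape of the LSTM output tensor."""
    num_directions = 2 if bidirectional else 1
    feat = num_directions * hidden_size
    # batch_first swaps the sequence and batch axes.
    return (batch, seq_len, feat) if batch_first else (seq_len, batch, feat)

print(lstm_output_shape(8, 16, 64))                      # (8, 16, 64)
print(lstm_output_shape(8, 16, 64, batch_first=True))    # (16, 8, 64)
print(lstm_output_shape(8, 16, 64, bidirectional=True))  # (8, 16, 128)
```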