LSTM

class dragon.vm.torch.nn.LSTM(
  input_size,
  hidden_size,
  num_layers=1,
  bias=True,
  batch_first=False,
  dropout=0,
  bidirectional=False
)[source]

Apply a multi-layer long short-term memory (LSTM) RNN [Hochreiter & Schmidhuber, 1997].
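For reference, each layer applies the standard LSTM cell update to every element of the input sequence. The formulation below is the commonly used one (the notation and bias split are an assumption; this documentation does not spell the equations out):

\begin{aligned}
i_t &= \sigma(W_{ii} x_t + b_{ii} + W_{hi} h_{t-1} + b_{hi}) \\
f_t &= \sigma(W_{if} x_t + b_{if} + W_{hf} h_{t-1} + b_{hf}) \\
g_t &= \tanh(W_{ig} x_t + b_{ig} + W_{hg} h_{t-1} + b_{hg}) \\
o_t &= \sigma(W_{io} x_t + b_{io} + W_{ho} h_{t-1} + b_{ho}) \\
c_t &= f_t \odot c_{t-1} + i_t \odot g_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}

where \(h_t\) and \(c_t\) are the hidden and cell states at step \(t\), and \(i_t\), \(f_t\), \(g_t\), \(o_t\) are the input, forget, cell, and output gates.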

Examples:

from dragon.vm import torch

m = torch.nn.LSTM(32, 64)
x = torch.ones(8, 4, 32)  # [T, N, C]; C must equal input_size
outputs, hidden = m(x)

__init__

LSTM.__init__(
  input_size,
  hidden_size,
  num_layers=1,
  bias=True,
  batch_first=False,
  dropout=0,
  bidirectional=False
)[source]

Create an LSTM module.

input_size : int
    The dimension of input.
hidden_size : int
    The dimension of hidden state.
num_layers : int, optional, default=1
    The number of recurrent layers.
bias : bool, optional, default=True
    True to use bias.
batch_first : bool, optional, default=False
    True to use input order [N, T, C], otherwise [T, N, C].
dropout : number, optional, default=0
    The dropout ratio.
bidirectional : bool, optional, default=False
    Whether to create a bidirectional LSTM (see the example below).
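
A minimal sketch of how batch_first and bidirectional affect the expected shapes, assuming this module follows the PyTorch convention of concatenating the forward and backward directions along the channel axis (the commented shapes are illustrative, not taken from this documentation):

from dragon.vm import torch

# Two stacked bidirectional layers consuming batch-major input.
m = torch.nn.LSTM(
    input_size=32,
    hidden_size=64,
    num_layers=2,
    batch_first=True,
    bidirectional=True,
)
x = torch.ones(4, 8, 32)  # [N, T, C] because batch_first=True
outputs, hidden = m(x)    # outputs: [N, T, 2 * 64], both directions concatenated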