twodlearn.recurrent module
class twodlearn.recurrent.BaseCell(trainable=True, name=None, *args, **kwargs)
    Bases: twodlearn.core.layers.Layer
    build(input_shape)
        Creates the variables of the layer (optional, for subclass
        implementers).

        Implementers of Layer or Model subclasses can override this method
        if they need a state-creation step between layer instantiation and
        the layer call. It is typically used to create the weights of Layer
        subclasses.

        Parameters:
            input_shape -- Instance of TensorShape, or list of instances of
                TensorShape if the layer expects a list of inputs (one
                instance per input).
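    A minimal sketch of the build() contract described above, written
    against tf.keras.layers.Layer directly; BaseCell is assumed to follow
    the same convention, since the docstring mirrors the Keras one. The
    class and weight names below are illustrative:

        import tensorflow as tf

        class LinearCell(tf.keras.layers.Layer):
            """Illustrative Layer subclass; not part of twodlearn."""
            def __init__(self, units, **kwargs):
                super().__init__(**kwargs)
                self.units = units

            def build(self, input_shape):
                # Variables are created here, once the input shape is
                # known, between instantiation and the first call.
                self.kernel = self.add_weight(
                    name='kernel',
                    shape=(int(input_shape[-1]), self.units),
                    initializer='glorot_uniform')
                super().build(input_shape)

            def call(self, inputs):
                return tf.matmul(inputs, self.kernel)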
class twodlearn.recurrent.DenseCell(trainable=True, name=None, *args, **kwargs)
    Bases: twodlearn.recurrent.BaseCell

    Base RNN cell for which the inputs and states are represented using a
    dense tensor.
class twodlearn.recurrent.Lstm(n_inputs, n_outputs, n_hidden, name=None, **kargs)
    Bases: twodlearn.recurrent.Rnn

    class LstmSetup(**kargs)
        Bases: twodlearn.recurrent.RnnSetup

    class LstmStateAndOutput(hidden, y)
        Bases: twodlearn.core.common.TdlModel

    ModelOutput
        alias of Lstm.LstmSetup
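    A construction sketch using only the constructor arguments documented
    above; how the model is evaluated and unrolled goes through the
    setup/output classes and is not shown on this page:

        from twodlearn.recurrent import Lstm

        # Sizes are illustrative.
        lstm = Lstm(n_inputs=3,    # input dimension at each timestep
                    n_outputs=2,   # output dimension at each timestep
                    n_hidden=64)   # number of LSTM units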
class twodlearn.recurrent.Lstm2Lstm(n_inputs, n_outputs, n_hidden, afunction=<function tanh>, encoder_afunction=<function tanh>, name=None)
    Bases: twodlearn.core.common.TdlModel

    Uses an Lstm to convert a fixed-length sequence into the initial state
    for an LSTM sequential model.
    ModelOutput
        alias of Lstm2Lstm.Output
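    A construction sketch (sizes are illustrative; both activation
    functions default to tanh, as in the signature above):

        from twodlearn.recurrent import Lstm2Lstm

        # The encoder Lstm reads a fixed-length sequence, and its final
        # state initializes the sequential (decoder) LSTM.
        model = Lstm2Lstm(n_inputs=3, n_outputs=2, n_hidden=64)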
class twodlearn.recurrent.LstmCellOptimized(n_inputs, n_units, afunction=<function tanh>, name='LstmCell')
    Bases: twodlearn.core.common.TdlModel

    Single LSTM cell defined as in: “Generating Sequences with Recurrent
    Neural Networks”, Alex Graves, 2014.
    afunction
        Activation function for the cell.

    n_inputs
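    A NumPy sketch of one step of the LSTM formulation in Graves (2014),
    including the peephole terms used there; this illustrates the
    equations, not twodlearn's actual implementation:

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def lstm_step(x, h, c, W, b, p):
            """One LSTM step. W/b: gate weights/biases acting on [x, h];
            p: diagonal peephole weights. All names are illustrative."""
            z = np.concatenate([x, h])
            i = sigmoid(W['i'] @ z + p['i'] * c + b['i'])      # input gate
            f = sigmoid(W['f'] @ z + p['f'] * c + b['f'])      # forget gate
            g = np.tanh(W['g'] @ z + b['g'])                   # candidate state
            c_new = f * c + i * g                              # cell update
            o = sigmoid(W['o'] @ z + p['o'] * c_new + b['o'])  # output gate
            h_new = o * np.tanh(c_new)   # afunction defaults to tanh here
            return h_new, c_new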
class twodlearn.recurrent.Mlp2Lstm(n_inputs, n_outputs, window_size=1, name=None, **kargs)
    Bases: twodlearn.core.common.TdlModel

    Uses an MLP to convert a fixed-length sequence into the initial state
    for an LSTM sequential model.
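    A construction sketch (sizes are illustrative; the MLP is assumed to
    read a window of window_size past samples when producing the initial
    LSTM state):

        from twodlearn.recurrent import Mlp2Lstm

        model = Mlp2Lstm(n_inputs=3, n_outputs=2, window_size=5)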
class twodlearn.recurrent.MlpNarx(n_inputs, n_outputs, window_size, n_hidden, afunction=<function relu>, name='mlp_narx', **kargs)
    Bases: twodlearn.recurrent.Narx

    Narx that uses an Mlp as its cell.
    CellModel
        alias of twodlearn.feedforward.MlpNet

    ModelOutput
        alias of MlpNarx.Output
    class Output(model, x0=None, n_unrollings=1, batch_size=None, inputs=None, compute_loss=True, options=None, name=None)
        Bases: twodlearn.recurrent.NarxSetup

        Number of hidden layers
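    A sketch combining the constructor with the Output signature above.
    The list form of n_hidden and the way Output is instantiated here are
    assumptions about the API:

        from twodlearn.recurrent import MlpNarx

        narx = MlpNarx(n_inputs=3, n_outputs=2, window_size=5,
                       n_hidden=[32, 32])   # assumed: one size per layer

        # Unroll the model for 10 steps; arguments follow the Output
        # signature documented above.
        unrolled = MlpNarx.Output(narx, n_unrollings=10, batch_size=16,
                                  compute_loss=True)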
class twodlearn.recurrent.MultilayerLstmCell(n_inputs, n_hidden, n_outputs=None, output_layer=None, name=None, **kargs)
    Bases: twodlearn.core.common.TdlModel

    class MultilayerLstmCellSetup(**kargs)
        Bases: twodlearn.core.common.OutputModel
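    A construction sketch; the list form of n_hidden (one size per stacked
    LSTM layer) is an assumption:

        from twodlearn.recurrent import MultilayerLstmCell

        cell = MultilayerLstmCell(n_inputs=3, n_hidden=[64, 64],
                                  n_outputs=2)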
class twodlearn.recurrent.Narx(n_inputs, n_outputs, window_size=1, name='narx', **kargs)
    Bases: twodlearn.recurrent.Rnn
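    Narx stands for a nonlinear autoregressive model with exogenous
    inputs. A NumPy sketch of one common form of the recursion, where the
    cell maps a window of window_size past outputs plus the current input
    to the next output (an illustration of the concept, not of twodlearn
    internals):

        import numpy as np

        def narx_unroll(cell, u, y_init, window_size):
            """y[t] = cell(y[t-window_size:t], u[t])."""
            history = list(y_init)    # window_size previous outputs (1-D arrays)
            outputs = []
            for u_t in u:             # u: sequence of exogenous inputs
                y_t = cell(np.concatenate(history[-window_size:]), u_t)
                outputs.append(y_t)
                history.append(y_t)
            return np.stack(outputs)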
class twodlearn.recurrent.Rnn(n_inputs, n_outputs, n_states=None, name='rnn', **kargs)
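    The generic recursion an Rnn with n_states internal states computes,
    as a NumPy sketch (the transition f and output map g stand in for
    whatever cell a subclass uses):

        import numpy as np

        def rnn_unroll(f, g, x0, inputs):
            """x[t] = f(x[t-1], u[t]); y[t] = g(x[t])."""
            x, ys = x0, []
            for u in inputs:
                x = f(x, u)        # state transition, n_states wide
                ys.append(g(x))    # output map, n_outputs wide
            return np.stack(ys)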
class twodlearn.recurrent.SimpleRnn(cell, name='SimpleRnn', options=None, **kargs)
    Bases: twodlearn.core.common.TdlModel

    class RnnOutput(model, x0=None, inputs=None, n_unrollings=None, options=None, name=None)
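    A usage sketch based on the signatures above; cell and x0 are
    placeholders, and instantiating RnnOutput directly is an assumption
    about how the unrolled graph is built:

        from twodlearn.recurrent import SimpleRnn

        rnn = SimpleRnn(cell=cell)   # cell: any BaseCell-style cell

        # Unroll for 20 steps starting from the initial state x0.
        out = SimpleRnn.RnnOutput(rnn, x0=x0, n_unrollings=20)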
class twodlearn.recurrent.StateSpaceCell(trainable=True, name=None, *args, **kwargs)
    Bases: twodlearn.recurrent.BaseCell
class twodlearn.recurrent.StateSpaceDense(trainable=True, name=None, *args, **kwargs)
    Bases: twodlearn.recurrent.StateSpaceCell, twodlearn.recurrent.DenseCell
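    StateSpaceDense combines the state-space interface with DenseCell's
    dense representation of inputs and states. A NumPy sketch of a linear
    dense state-space step, x[t+1] = A x[t] + B u[t], y[t] = C x[t+1]
    (the linear form and the parameter names are illustrative only):

        import numpy as np

        rng = np.random.default_rng(0)
        n_states, n_inputs, n_outputs = 4, 3, 2

        A = rng.normal(size=(n_states, n_states)) * 0.1   # transition
        B = rng.normal(size=(n_states, n_inputs))         # input map
        C = rng.normal(size=(n_outputs, n_states))        # observation

        def step(x, u):
            x_next = A @ x + B @ u
            y = C @ x_next
            return x_next, y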