SimpleRNNCell class

keras.layers.SimpleRNNCell(
    units,
    activation="tanh",
    use_bias=True,
    kernel_initializer="glorot_uniform",
    recurrent_initializer="orthogonal",
    bias_initializer="zeros",
    kernel_regularizer=None,
    recurrent_regularizer=None,
    bias_regularizer=None,
    kernel_constraint=None,
    recurrent_constraint=None,
    bias_constraint=None,
    dropout=0.0,
    recurrent_dropout=0.0,
    seed=None,
    **kwargs
)
Cell class for SimpleRNN.
This class processes one step within the whole time sequence input, whereas keras.layers.SimpleRNN processes the whole sequence.
Arguments

- units: Positive integer, dimensionality of the output space.
- activation: Activation function to use. Default: hyperbolic tangent (tanh). If you pass None, no activation is applied (ie. "linear" activation: a(x) = x).
- use_bias: Boolean (default True), whether the layer should use a bias vector.
- kernel_initializer: Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: "glorot_uniform".
- recurrent_initializer: Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state. Default: "orthogonal".
- bias_initializer: Initializer for the bias vector. Default: "zeros".
- kernel_regularizer: Regularizer function applied to the kernel weights matrix. Default: None.
- recurrent_regularizer: Regularizer function applied to the recurrent_kernel weights matrix. Default: None.
- bias_regularizer: Regularizer function applied to the bias vector. Default: None.
- kernel_constraint: Constraint function applied to the kernel weights matrix. Default: None.
- recurrent_constraint: Constraint function applied to the recurrent_kernel weights matrix. Default: None.
- bias_constraint: Constraint function applied to the bias vector. Default: None.
- dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs. Default: 0.
- recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state. Default: 0.
- seed: Random seed for dropout.

Call arguments

- inputs: A 2D tensor, with shape (batch, features).
- states: A 2D tensor with shape (batch, units), which is the state from the previous time step.
- training: Python boolean indicating whether the layer should behave in training mode or in inference mode. Only relevant when dropout or recurrent_dropout is used.

Example
import numpy as np
import keras

inputs = np.random.random([32, 10, 8]).astype(np.float32)
rnn = keras.layers.RNN(keras.layers.SimpleRNNCell(4))
output = rnn(inputs) # The output has shape `(32, 4)`.
rnn = keras.layers.RNN(
keras.layers.SimpleRNNCell(4),
return_sequences=True,
return_state=True
)
# whole_sequence_output has shape `(32, 10, 4)`.
# final_state has shape `(32, 4)`.
whole_sequence_output, final_state = rnn(inputs)