tf.RNNEncoderBlock module
Base encoder block from which the RNNEncoder models are built.
- class tf.RNNEncoderBlock.RNNEncoderBlock(*args: Any, **kwargs: Any)
Bases:
modelzoo.common.layers.tf.BaseLayer.BaseLayer
The base RNN encoder block used to build the RNNEncoder models for language modeling and sentiment analysis. The init takes a nested params dictionary; each of the following sections represents one of the outer keys, which must have its own set of parameters.
- Parameters
hidden_size (int) – Number of units inside each RNN layer.
encoder_depth (int) – Number of RNN layers. Depth of the model.
dropout_rate (float) – Fraction of units to drop.
use_residuals (bool) – Whether skip connections are used.
residual_type (str) – Options are 'short' or 'long'. If 'long', adds the input to the end of the block. If 'short', adds it to the middle. The following is an example of short residuals for a certain layer ordering:

--*-->RNN---->Layer_Norm--(+)-->Dropout---->RNN
   \______________________/

The following is an example of long residuals for the same layers:

--*-->RNN---->Layer_Norm---->Dropout--(+)-->RNN
   \__________________________________/
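The difference between the two residual placements can be sketched with plain Python callables standing in for the layers (hypothetical stand-ins for illustration only, not the module's actual implementation; the dropout stand-in rescales its input so the two placements produce visibly different results):

```python
# Stand-in layers (hypothetical): real layers operate on tensors.
def rnn(x):        return x * 2.0   # stand-in for an RNN layer
def layer_norm(x): return x + 1.0   # stand-in for LayerNorm
def dropout(x):    return x * 0.5   # stand-in for Dropout (rescales)

def block_short(x):
    # 'short': the skip connection re-joins after LayerNorm, before Dropout
    y = layer_norm(rnn(x))
    y = y + x                       # (+) in the middle of the block
    return rnn(dropout(y))

def block_long(x):
    # 'long': the skip connection re-joins after Dropout, at the end
    y = dropout(layer_norm(rnn(x)))
    y = y + x                       # (+) at the end of the block
    return rnn(y)

print(block_short(1.0))  # 4.0
print(block_long(1.0))   # 5.0
```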
enable_layer_norm_before_dropout (bool) – Mutually exclusive with enable_layer_norm_after_dropout. Inserts a layer norm before the dropout in each RNNEncoderBlock layer.
enable_layer_norm_after_dropout (bool) – Mutually exclusive with enable_layer_norm_before_dropout. Inserts a layer norm after the dropout in each RNNEncoderBlock layer.
initializer (str) – See the TensorFlow documentation for SimpleRNNCell for more on this and the following parameters:
bias_initializer (str)
rnn_use_bias (bool)
rnn_activation (str)
rnn_recurrent_activation (str)
rnn_unit_forget_bias (bool)
rnn_cell (str) – Can be 'lstm', 'gru', or 'rnn'.
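A hypothetical flat parameter set covering the list above might look like the following. The key names are taken from the parameter list, but the exact nesting of the params dictionary is not shown on this page, so treat this as a sketch:

```python
# Hypothetical example values for the parameters documented above.
# Key names follow the list; the actual config nesting may differ.
encoder_block_params = {
    "hidden_size": 256,          # units inside each RNN layer
    "encoder_depth": 2,          # number of stacked RNN layers
    "dropout_rate": 0.1,         # fraction of units to drop
    "use_residuals": True,       # enable skip connections
    "residual_type": "short",    # 'short' or 'long'
    "enable_layer_norm_before_dropout": True,
    "enable_layer_norm_after_dropout": False,  # mutually exclusive with the flag above
    "initializer": "glorot_uniform",
    "bias_initializer": "zeros",
    "rnn_use_bias": True,
    "rnn_activation": "tanh",
    "rnn_recurrent_activation": "sigmoid",
    "rnn_unit_forget_bias": True,
    "rnn_cell": "lstm",          # 'lstm', 'gru', or 'rnn'
}

# The two layer-norm flags are mutually exclusive; a simple sanity check:
assert not (encoder_block_params["enable_layer_norm_before_dropout"]
            and encoder_block_params["enable_layer_norm_after_dropout"])
```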
- call(inputs, mask=None, is_training=True)
Applies the encoder block to inputs, typically the output of an embedding layer.
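The intended semantics of the mask and is_training arguments can be sketched with plain Python lists as stand-ins for tensors (a hypothetical illustration, not the real implementation; real dropout is stochastic and the block applies its RNN/norm layers as configured):

```python
# Hypothetical sketch of the call(...) argument semantics:
# - mask (if given) zeroes out padded positions
# - dropout is only active when is_training is True
def call(inputs, mask=None, is_training=True, dropout_rate=0.1):
    out = list(inputs)
    if mask is not None:
        out = [x if keep else 0.0 for x, keep in zip(out, mask)]
    if is_training:
        # Real dropout zeroes random units; here we just rescale
        # to show where the training/inference gate applies.
        out = [x * (1.0 - dropout_rate) for x in out]
    return out

print(call([1.0, 2.0], mask=[True, False], is_training=False))  # [1.0, 0.0]
```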