tf.layers.DenseLayer module
- class tf.layers.DenseLayer.DenseLayer(*args: Any, **kwargs: Any)
Bases: modelzoo.common.tf.layers.BaseLayer.BaseLayer

Wrapper around the Keras densely-connected layer. Provides support for "gelu" activation.

- Parameters
  units (int) – Number of units in the layer output.
  activation (Optional[Union[str, Callable]]) – If not None, an activation function applied after the dense layer. The activation function can be a callable, the string name of a TensorFlow built-in activation, or "gelu".
  use_bias (bool) – Whether to use a bias.
  kernel_initializer (str) – Kernel initializer. Defaults to "glorot_uniform".
  bias_initializer (str) – Bias initializer. Defaults to "zeros".
  kernel_regularizer (Optional[Callable]) – Kernel regularizer. Defaults to None.
  bias_regularizer (Optional[Callable]) – Bias regularizer. Defaults to None.
  activity_regularizer (Optional[Callable]) – Activity (output activation) regularizer. Defaults to None.
  kernel_constraint (Optional[Callable]) – Kernel constraint. Defaults to None.
  bias_constraint (Optional[Callable]) – Bias constraint. Defaults to None.
  boundary_casting (bool) – If True, outputs the values in half precision and casts the input values up to full precision.
  tf_summary (bool) – If True, saves the activations with summary_layer.
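A minimal usage sketch follows. It assumes the layer is imported from modelzoo.common.tf.layers.DenseLayer as shown in the class path above and that the constructor accepts the keyword arguments listed here; the shapes and values are illustrative, not taken from the source.

```python
import tensorflow as tf
from modelzoo.common.tf.layers.DenseLayer import DenseLayer

# Dense layer with 128 output units and "gelu" activation,
# keeping the documented defaults for the remaining arguments.
dense = DenseLayer(
    units=128,
    activation="gelu",
    use_bias=True,
    kernel_initializer="glorot_uniform",
    bias_initializer="zeros",
    boundary_casting=False,
    tf_summary=False,
)

# (batch_size, input_dim) -> (batch_size, units)
x = tf.random.uniform((32, 64))
y = dense(x)  # shape: (32, 128)
```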
- call(inputs, **kwargs)
Apply the densely-connected layer.
- Parameters
  inputs (Tensor) – An N-D tensor with shape: (batch_size, ..., input_dim).
- Returns
  An N-D tensor with shape: (batch_size, ..., units).
- Return type
  Tensor
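To illustrate the shape contract above, the sketch below applies the layer to a rank-3 input; only the last dimension changes from input_dim to units. The chosen dimensions are assumptions made for the example.

```python
import tensorflow as tf
from modelzoo.common.tf.layers.DenseLayer import DenseLayer

# The dense transform acts on the last axis, so leading dimensions
# such as (batch_size, seq_len) pass through unchanged.
layer = DenseLayer(units=16)
inputs = tf.random.uniform((8, 10, 4))  # (batch_size, seq_len, input_dim)
outputs = layer(inputs)                 # expected shape: (8, 10, 16)
```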