tf.ActivationLayer module
- class tf.ActivationLayer.ActivationLayer(*args: Any, **kwargs: Any)

  Bases: modelzoo.common.layers.tf.BaseLayer.BaseLayer
  Wrapper around the Keras activation layer. Also supports activation="GeLU", which is currently missing in keras.layers.ActivationLayer.

  - Parameters
    - activation (Union[str, Callable]) – The function to be applied. This can be a callable, the string name of a TensorFlow built-in activation, or one of "gelu" or "lrelu" (lrelu denotes LeakyReLU).
    - boundary_casting (bool) – If True, outputs the values in half precision and casts the input values up to full precision.
    - tf_summary (bool) – If True, saves the activations with the summary_layer.
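As an illustration of how the string-valued `activation` argument might be resolved into a callable, here is a minimal pure-Python sketch. The helper name `resolve_activation`, the lookup table, and the LeakyReLU slope are assumptions for illustration, not the actual modelzoo implementation:

```python
# Hypothetical sketch of resolving the `activation` argument; not the
# real modelzoo code. Callables pass through, known strings are looked up.
def lrelu(x: float, alpha: float = 0.3) -> float:
    # LeakyReLU with the Keras default slope of 0.3 (assumed here).
    return x if x >= 0.0 else alpha * x

EXTRA_ACTIVATIONS = {
    "lrelu": lrelu,
    # "gelu" would map to the layer's static gelu helper.
}

def resolve_activation(activation):
    """Return a callable: pass callables through, look up known strings."""
    if callable(activation):
        return activation
    name = str(activation).lower()
    if name in EXTRA_ACTIVATIONS:
        return EXTRA_ACTIVATIONS[name]
    raise ValueError(f"Unsupported activation: {activation!r}")
```

For example, `resolve_activation("lrelu")(-1.0)` yields `-0.3` under the assumed slope, while passing an existing callable returns it unchanged.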
- call(inputs, **kwargs)

  Apply the activation layer.

  - Parameters
    - inputs – Arbitrary tensor.
  - Returns
    - A tensor of the same shape as the input.
  - Return type
    - Tensor
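Since `call` applies the activation elementwise, the output shape always matches the input shape. A shape-preservation sketch over a plain Python list (a stand-in for a tensor; the function below is illustrative, not the layer's actual `call`):

```python
# Illustrative stand-in for call(): apply the activation elementwise,
# so the output has the same shape (here, length) as the input.
def apply_activation(inputs, activation):
    return [activation(x) for x in inputs]

# Example with a ReLU-like callable.
out = apply_activation([-1.0, 0.0, 2.0], lambda x: max(x, 0.0))
```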
- static gelu(x)
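The static gelu helper presumably computes the Gaussian Error Linear Unit. A minimal scalar sketch using the exact erf formulation (the layer may instead use the common tanh approximation; this is an assumption, not the modelzoo source):

```python
import math

def gelu(x: float) -> float:
    # GELU(x) = x * Phi(x), where Phi is the standard normal CDF:
    # 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

For large positive x, gelu(x) approaches x; for large negative x it approaches 0, giving a smooth alternative to ReLU.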