tf.keras.layers.ReLU

Rectified Linear Unit activation function.

Inherits From: Layer

Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.keras.layers.ReLU

tf.keras.layers.ReLU(
    max_value=None, negative_slope=0, threshold=0, **kwargs
)

With default values, it returns element-wise max(x, 0).

Otherwise, it follows:

f(x) = max_value for x >= max_value
f(x) = x for threshold <= x < max_value
f(x) = negative_slope * (x - threshold) otherwise
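For example, here is a minimal sketch of the piecewise behavior above (the input values are illustrative, not from this page):

import tensorflow as tf

# With default values: element-wise max(x, 0).
layer = tf.keras.layers.ReLU()
print(layer([-3.0, -1.0, 0.0, 2.0]).numpy())  # [0. 0. 0. 2.]

# All three arguments together exercise every branch of the formula:
# x >= max_value is capped, threshold <= x < max_value passes through,
# and x < threshold is scaled by negative_slope.
layer = tf.keras.layers.ReLU(max_value=2.0, negative_slope=0.5, threshold=1.0)
print(layer([-1.0, 0.5, 1.5, 3.0]).numpy())  # [-1.   -0.25  1.5   2.  ]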

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:

Same shape as the input.
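As a sketch of this shape contract (the (4,) feature shape is an arbitrary choice for illustration):

import tensorflow as tf

# ReLU as the first layer of a model: input_shape omits the samples axis.
model = tf.keras.Sequential([
    tf.keras.layers.ReLU(input_shape=(4,)),
])
model.summary()  # the layer's output shape is (None, 4), same as its input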

Arguments

max_value: Float >= 0. Maximum activation value.
negative_slope: Float >= 0. Negative slope coefficient.
threshold: Float. Threshold value for thresholded activation.
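Each argument in isolation (again a sketch; the inputs are illustrative):

import tensorflow as tf

# max_value caps activations from above.
print(tf.keras.layers.ReLU(max_value=1.0)([-3.0, -1.0, 0.0, 2.0]).numpy())
# [0. 0. 0. 1.]

# negative_slope gives a leaky-ReLU-style response below the threshold (0 by default).
print(tf.keras.layers.ReLU(negative_slope=0.5)([-3.0, -1.0, 0.0, 2.0]).numpy())
# [-1.5 -0.5  0.   2. ]

# threshold shifts the point below which activations are zeroed.
print(tf.keras.layers.ReLU(threshold=1.5)([-3.0, -1.0, 1.0, 2.0]).numpy())
# [0. 0. 0. 2.]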