Module: tf.keras.activations | TensorFlow v2.0.0
Built-in activation functions.
Functions
elu(...): Exponential linear unit.
exponential(...): Exponential activation function.
hard_sigmoid(...): Hard sigmoid activation function.
linear(...): Linear activation function.
relu(...): Rectified Linear Unit.
selu(...): Scaled Exponential Linear Unit (SELU).
sigmoid(...): Sigmoid activation function.
softmax(...): The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1.
softplus(...): Softplus activation function.
softsign(...): Softsign activation function.
tanh(...): Hyperbolic Tangent (tanh) activation function.
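The functions above can be applied directly to tensors or referenced by name when constructing layers. Below is a minimal usage sketch, assuming TensorFlow 2.0 with eager execution enabled (the default); the tensor values and layer configuration are illustrative only.

```python
import tensorflow as tf

# Element-wise activations applied directly to a tensor.
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
print(tf.keras.activations.relu(x).numpy())     # [0. 0. 0. 1. 2.]
print(tf.keras.activations.sigmoid(x).numpy())  # values in (0, 1)
print(tf.keras.activations.tanh(x).numpy())     # values in (-1, 1)

# softmax expects at least a 2-D input (batch dimension first);
# each row of the result sums to 1.
logits = tf.constant([[1.0, 2.0, 3.0]])
print(tf.keras.activations.softmax(logits).numpy())

# Activations are more commonly referenced by name inside a layer.
layer = tf.keras.layers.Dense(4, activation="relu")
```

Passing the string `"relu"` to a layer is equivalent to passing the function object `tf.keras.activations.relu`.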