tf.keras.activations.hard_silu | TensorFlow v2.16.1
tf.keras.activations.hard_silu
Hard SiLU activation function, also known as Hard Swish.
View aliases
Main aliases
tf.keras.activations.hard_swish
tf.keras.activations.hard_silu(x)
It is defined as:
- `0` if `x < -3`
- `x` if `x > 3`
- `x * (x + 3) / 6` if `-3 <= x <= 3`
It's a faster, piecewise linear approximation of the silu activation.
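As a rough sketch (not taken from this page), the piecewise definition above can be written with plain TensorFlow ops; the `relu6`-based form used here is assumed to be equivalent to the three cases listed:

```python
import tensorflow as tf

def hard_silu_manual(x):
    # relu6(x + 3) clips (x + 3) to [0, 6], which reproduces the three cases:
    # 0 for x < -3, x for x > 3, and x * (x + 3) / 6 in between.
    return x * tf.nn.relu6(x + 3.0) / 6.0

x = tf.constant([-5.0, -3.0, -1.0, 0.0, 1.0, 3.0, 5.0])
print(hard_silu_manual(x).numpy())
# Should match the built-in activation:
print(tf.keras.activations.hard_silu(x).numpy())
```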
| Argument | Description |
|---|---|
| `x` | Input tensor. |
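A minimal usage sketch, assuming a standard Keras workflow (the layer size and input shape below are illustrative):

```python
import tensorflow as tf

# The activation can be passed directly to a layer's `activation` argument.
layer = tf.keras.layers.Dense(8, activation=tf.keras.activations.hard_silu)
out = layer(tf.random.normal((2, 4)))
print(out.shape)  # (2, 8)
```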
Reference:

- Howard et al., 2019, "Searching for MobileNetV3" (https://arxiv.org/abs/1905.02244)