tf.keras.ops.silu | TensorFlow v2.16.1
tf.keras.ops.silu
Sigmoid Linear Unit (SiLU) activation function, also known as Swish.
View aliases
Main aliases
tf.keras.ops.nn.silu, tf.keras.ops.nn.swish, tf.keras.ops.swish
tf.keras.ops.silu(x)
The SiLU activation function multiplies the input by its sigmoid. It is defined as f(x) = x * sigmoid(x).
| Args | |
|---|---|
| x | Input tensor. |
| Returns |
|---|
| A tensor with the same shape as x. |
Example:
>>> x = keras.ops.convert_to_tensor([-6.0, 1.0, 0.0, 1.0, 6.0])
>>> keras.ops.sigmoid(x)
array([0.00247262, 0.7310586, 0.5, 0.7310586, 0.9975274], dtype=float32)
>>> keras.ops.silu(x)
array([-0.0148357, 0.7310586, 0.0, 0.7310586, 5.9851646], dtype=float32)
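As a sanity check, a minimal NumPy sketch of the same computation follows. This is an illustrative reference only, not the library's internal implementation; `silu_reference` is a hypothetical name introduced here, and the snippet assumes only NumPy is available.

import numpy as np

def silu_reference(x):
    # SiLU/Swish: multiply the input by its sigmoid, f(x) = x * sigmoid(x).
    return x * (1.0 / (1.0 + np.exp(-x)))

x = np.array([-6.0, 1.0, 0.0, 1.0, 6.0], dtype=np.float32)
print(silu_reference(x))
# approx. [-0.0148357, 0.7310586, 0.0, 0.7310586, 5.9851646]

Note that for large negative inputs (e.g. -6.0) the output approaches 0 from below, while for large positive inputs it approaches the input itself, which matches the keras.ops.silu values above.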