tf.compat.v1.math.softmax  |  TensorFlow v2.16.1

Computes softmax activations.

View aliases

Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.nn.softmax

tf.compat.v1.math.softmax(
    logits, axis=None, name=None, dim=None
)

Used in the notebooks

Used in the tutorials
Fitting Dirichlet Process Mixture Model Using Preconditioned Stochastic Gradient Langevin Dynamics
Classify Flowers with Transfer Learning

This function is used for multi-class predictions. The outputs produced by softmax sum to 1.

This function performs the equivalent of

softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis, keepdims=True)
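As a quick check, this expression can be evaluated directly and compared with the built-in op. The sketch below is illustrative only: it assumes eager execution (the TF2 default) and fixes axis=-1. The built-in op is also more numerically stable than this naive formula when logits are large.

import tensorflow as tf

logits = tf.constant([-1.0, 0.0, 1.0])

# Manual computation following the formula above (axis=-1 chosen for this sketch).
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=-1, keepdims=True)

# Built-in op; the two results agree up to floating-point rounding.
builtin = tf.nn.softmax(logits)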

Example usage:

softmax = tf.nn.softmax([-1, 0., 1.])
softmax
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([0.09003057, 0.24472848, 0.66524094], dtype=float32)>
sum(softmax)
<tf.Tensor: shape=(), dtype=float32, numpy=1.0>
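For batched inputs, axis selects the dimension that is normalized. A minimal sketch, assuming a hypothetical batch of two examples with three classes each:

import tensorflow as tf

# Hypothetical batch of logits: 2 examples, 3 classes each.
batch_logits = tf.constant([[2.0, 1.0, 0.1],
                            [0.0, 0.0, 0.0]])

# axis=-1 (the default) normalizes across the class dimension,
# so each row of the result sums to 1.
probs = tf.nn.softmax(batch_logits, axis=-1)
row_sums = tf.reduce_sum(probs, axis=-1)  # approximately [1.0, 1.0]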

Args
logits A non-empty Tensor. Must be one of the following types: half, float32, float64.
axis The dimension softmax would be performed on. The default is -1, which indicates the last dimension.
name A name for the operation (optional).
dim Deprecated alias for axis.
Returns
A Tensor. Has the same type and shape as logits.
Raises
InvalidArgumentError if logits is empty or axis is beyond the last dimension of logits.
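Because this is the compat.v1 endpoint, it can also be called from TF1-style graph code. A minimal sketch, assuming graph mode with a placeholder-fed session (the shapes and values are illustrative, not from the original page):

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Placeholder for a batch of 3-class logits (batch size left unspecified).
logits = tf.compat.v1.placeholder(tf.float32, shape=[None, 3])
probs = tf.compat.v1.math.softmax(logits, axis=-1)

with tf.compat.v1.Session() as sess:
    out = sess.run(probs, feed_dict={logits: [[-1.0, 0.0, 1.0]]})
    # out[0] is approximately [0.090, 0.245, 0.665]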