tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits  |  TensorFlow v2.16.1


Computes softmax cross entropy cost and gradients to backpropagate.

View aliases

Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.raw_ops.SparseSoftmaxCrossEntropyWithLogits

tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits(
    features, labels, name=None
)

Unlike SoftmaxCrossEntropyWithLogits, this operation does not accept a matrix of label probabilities, but rather a single label per row of features. This label is considered to have probability 1.0 for the given row.

Inputs are the logits, not probabilities.
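The computation described above can be sketched in plain Python (no TensorFlow required): for each row, take the softmax of the logits, the loss is the negative log-probability of the true class, and the gradient with respect to the logits is softmax minus the one-hot label. This is a minimal illustration of the math, not the op's actual fused-kernel implementation.

```python
import math

def sparse_softmax_xent_with_logits(features, labels):
    """Sketch of what the op computes.

    features: batch_size x num_classes list of logit rows.
    labels:   one int class index per row.
    Returns (loss, backprop), matching the op's two outputs.
    """
    loss, backprop = [], []
    for logits, label in zip(features, labels):
        m = max(logits)                       # subtract max for numerical stability
        exps = [math.exp(x - m) for x in logits]
        z = sum(exps)
        probs = [e / z for e in exps]
        loss.append(-math.log(probs[label]))  # -log p(true class)
        # gradient of the loss w.r.t. the logits: softmax - one_hot(label)
        backprop.append([p - (1.0 if i == label else 0.0)
                        for i, p in enumerate(probs)])
    return loss, backprop

loss, grad = sparse_softmax_xent_with_logits([[2.0, 1.0, 0.1]], [0])
```

Because the softmax probabilities sum to 1 and exactly one label entry is 1, each row of the gradient sums to zero.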

Args
features: A Tensor. Must be one of the following types: half, bfloat16, float32, float64. A batch_size x num_classes matrix of logits.
labels: A Tensor. Must be one of the following types: int32, int64. A batch_size vector with values in [0, num_classes); the label for each minibatch entry.
name: A name for the operation (optional).
Returns
A tuple of Tensor objects (loss, backprop).
loss: A Tensor of per-example losses, shape [batch_size]. Has the same type as features.
backprop: A Tensor of gradients with respect to features, shape [batch_size, num_classes]. Has the same type as features.
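A short usage sketch, assuming TensorFlow 2.x is installed: raw ops take keyword arguments only, and this one returns the per-example loss alongside the gradient with respect to the logits.

```python
import tensorflow as tf

# Batch of 2 examples, 3 classes; labels are class indices, not one-hot rows.
features = tf.constant([[2.0, 1.0, 0.1],
                        [0.5, 2.5, 0.2]])
labels = tf.constant([0, 1])  # int32, values in [0, num_classes)

loss, backprop = tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits(
    features=features, labels=labels)

print(loss.shape)      # per-example loss: [batch_size]
print(backprop.shape)  # gradient w.r.t. features: [batch_size, num_classes]
```

In most code the public wrapper tf.nn.sparse_softmax_cross_entropy_with_logits is preferred; it returns only the loss and lets autodiff handle the gradient.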


Last updated 2024-04-26 UTC.