tf.keras.losses.BinaryCrossentropy  |  TensorFlow v2.16.1

tf.keras.losses.BinaryCrossentropy


Computes the cross-entropy loss between true labels and predicted labels.

Inherits From: Loss

tf.keras.losses.BinaryCrossentropy(
    from_logits=False,
    label_smoothing=0.0,
    axis=-1,
    reduction='sum_over_batch_size',
    name='binary_crossentropy'
)

Used in the notebooks

Used in the guide:

- Distributed training with TensorFlow
- Estimators
- Migrate `tf.feature_column`s to Keras preprocessing layers

Used in the tutorials:

- Using Counterfactual Logit Pairing with Keras
- Load a pandas DataFrame
- Transfer learning and fine-tuning
- Basic text classification
- Parameter server training with ParameterServerStrategy
- CycleGAN

Use this cross-entropy loss for binary (0 or 1) classification applications. The loss function requires the following inputs:

y_true (true label): either 0 or 1.
y_pred (predicted value): the model's prediction, a single floating-point value which either represents a logit (a value in [-inf, inf] when from_logits=True) or a probability (a value in [0., 1.] when from_logits=False).

Args

from_logits: Whether to interpret y_pred as a tensor of logit values. By default, we assume that y_pred contains probabilities (i.e., values in [0, 1]). See the sketch after this list.
label_smoothing: Float in range [0, 1]. When 0, no smoothing occurs. When > 0, we compute the loss between the predicted labels and a smoothed version of the true labels, where the smoothing squeezes the labels towards 0.5. Larger values of label_smoothing correspond to heavier smoothing; the sketch below also demonstrates this.
axis: The axis along which to compute cross-entropy (the features axis). Defaults to -1.
reduction: Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size" or None.
name: Optional name for the loss instance.
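
To make from_logits and label_smoothing concrete, here is a minimal illustrative sketch (not part of the API reference). It shows that passing raw logits with from_logits=True matches passing sigmoid(logits) with the default from_logits=False, and how label smoothing rewrites the targets before the loss is computed:

import tensorflow as tf
import keras

y_true = [0., 1., 0., 0.]
logits = [-18.6, 0.51, 2.94, -12.8]

# from_logits=True applies the sigmoid internally, so these two calls
# produce (approximately) the same value, ~0.865:
bce_logits = keras.losses.BinaryCrossentropy(from_logits=True)
bce_probs = keras.losses.BinaryCrossentropy()  # from_logits=False (default)
print(float(bce_logits(y_true, logits)))
print(float(bce_probs(y_true, tf.sigmoid(logits))))

# label_smoothing squeezes the targets towards 0.5 before the loss is
# computed: y_smooth = y_true * (1 - label_smoothing) + 0.5 * label_smoothing
bce_smooth = keras.losses.BinaryCrossentropy(from_logits=True,
                                             label_smoothing=0.1)
print(float(bce_smooth(y_true, logits)))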

Examples:

Recommended Usage: (set from_logits=True)

With compile() API:

model.compile(
    loss=keras.losses.BinaryCrossentropy(from_logits=True),
    ...
)
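
For context, a minimal end-to-end sketch; the architecture and random data are illustrative assumptions, not part of this API. The final Dense layer has no sigmoid activation, which is why from_logits=True is the right setting:

import numpy as np
import keras

# Hypothetical tiny binary classifier (illustrative only).
model = keras.Sequential([
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),  # no sigmoid activation: outputs are logits
])
model.compile(
    optimizer="adam",
    loss=keras.losses.BinaryCrossentropy(from_logits=True),
)

x = np.random.random((32, 8)).astype("float32")
y = np.random.randint(0, 2, size=(32, 1)).astype("float32")
model.fit(x, y, epochs=1, verbose=0)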

As a standalone function:

# Example 1: (batch_size = 1, number of samples = 4)
y_true = [0, 1, 0, 0]
y_pred = [-18.6, 0.51, 2.94, -12.8]
bce = keras.losses.BinaryCrossentropy(from_logits=True)
bce(y_true, y_pred)
0.865

# Example 2: (batch_size = 2, number of samples = 4)
y_true = [[0, 1], [0, 0]]
y_pred = [[-18.6, 0.51], [2.94, -12.8]]

# Using default 'sum_over_batch_size' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True)
bce(y_true, y_pred)
0.865

# Using 'sample_weight' attribute
bce(y_true, y_pred, sample_weight=[0.8, 0.2])
0.243

# Using 'sum' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True, reduction="sum")
bce(y_true, y_pred)
1.730

# Using 'none' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True, reduction=None)
bce(y_true, y_pred)
array([0.235, 1.496], dtype=float32)

Default Usage: (set from_logits=False)

# Make the following updates to the above "Recommended Usage" section
# 1. Set `from_logits=False`
keras.losses.BinaryCrossentropy()  # OR ...(from_logits=False)
# 2. Update `y_pred` to use probabilities instead of logits
y_pred = [0.6, 0.3, 0.2, 0.8]  # OR [[0.6, 0.3], [0.2, 0.8]]
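
As a sanity check, per-sample binary cross-entropy is -(y * log(p) + (1 - y) * log(1 - p)), averaged over the samples under the default reduction. A short illustrative sketch using the probabilities above:

import numpy as np
import keras

y_true = np.array([0., 1., 0., 0.])
y_pred = np.array([0.6, 0.3, 0.2, 0.8])

# loss_i = -(y_i * log(p_i) + (1 - y_i) * log(1 - p_i))
manual = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

bce = keras.losses.BinaryCrossentropy()  # from_logits=False
# Both ~0.988; Keras clips y_pred with a tiny epsilon, so the values
# match to several decimal places.
print(manual, float(bce(y_true, y_pred)))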

Methods

call


call(
    y_true, y_pred
)

from_config


@classmethod
from_config(
    config
)

get_config


get_config()
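
get_config and from_config together support serialization: get_config returns the constructor arguments as a dict, and from_config rebuilds an equivalent instance from such a dict. A minimal round-trip sketch:

import keras

bce = keras.losses.BinaryCrossentropy(from_logits=True, label_smoothing=0.1)

config = bce.get_config()  # dict of constructor arguments
restored = keras.losses.BinaryCrossentropy.from_config(config)
assert restored.get_config() == config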

__call__


__call__(
    y_true, y_pred, sample_weight=None
)

Call self as a function.
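
When sample_weight is passed, each sample's loss (here, each batch row's) is scaled by its weight before reduction; with the default "sum_over_batch_size" reduction, the result is the weighted sum divided by the batch size. A sketch reproducing the 0.243 value from Example 2 above:

import keras

y_true = [[0., 1.], [0., 0.]]
y_pred = [[-18.6, 0.51], [2.94, -12.8]]
bce = keras.losses.BinaryCrossentropy(from_logits=True)

# Unreduced per-row losses are ~[0.235, 1.496] (see Example 2), so:
#   (0.235 * 0.8 + 1.496 * 0.2) / 2 ≈ 0.243
print(float(bce(y_true, y_pred, sample_weight=[0.8, 0.2])))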