tf.nn.batch_normalization  |  TensorFlow v2.16.1

tf.nn.batch_normalization


Batch normalization.

View aliases

Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.nn.batch_normalization

tf.nn.batch_normalization(
    x, mean, variance, offset, scale, variance_epsilon, name=None
)

Used in the notebooks

Used in the tutorials
Distributed training with DTensors

Normalizes a tensor by mean and variance, and applies (optionally) a scale \(\gamma\) to it, as well as an offset \(\beta\):

\(\frac{\gamma(x-\mu)}{\sigma}+\beta\)

mean, variance, offset and scale are all expected to be of one of two shapes:

- In all generality, they can have the same number of dimensions as the input x, with identical sizes as x for the dimensions that are not normalized over (the 'depth' dimension(s)), and dimension 1 for the others which are being normalized over. mean and variance in this case would typically be the outputs of tf.nn.moments(..., keepdims=True) during training, or running averages thereof during inference.
- In the common case where the 'depth' dimension is the last dimension in the input tensor x, they may be one-dimensional tensors of the same size as the 'depth' dimension. This is the case, for example, for the common [batch, depth] layout of fully-connected layers, and [batch, height, width, depth] for convolutions. mean and variance in this case would typically be the outputs of tf.nn.moments(..., keepdims=False) during training, or running averages thereof during inference.

See equation 11 in Algorithm 2 of the source: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift; S. Ioffe, C. Szegedy.
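A minimal sketch of the common case, where the 'depth' dimension is the last dimension of x. The shapes, the random input, and the variance_epsilon value of 1e-3 are illustrative assumptions, not part of the API:

    import tensorflow as tf

    # Illustrative [batch, height, width, depth] activations.
    x = tf.random.normal([8, 4, 4, 16])

    # Per-channel statistics: tf.nn.moments with keepdims=False yields
    # 1-D mean and variance tensors of size `depth` (16 here).
    mean, variance = tf.nn.moments(x, axes=[0, 1, 2])

    # Per-channel scale (gamma) and offset (beta).
    gamma = tf.ones([16])
    beta = tf.zeros([16])

    y = tf.nn.batch_normalization(
        x, mean, variance, offset=beta, scale=gamma, variance_epsilon=1e-3)
    print(y.shape)  # (8, 4, 4, 16)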

Args
x: Input Tensor of arbitrary dimensionality.
mean: A mean Tensor.
variance: A variance Tensor.
offset: An offset Tensor, often denoted \(\beta\) in equations, or None. If present, will be added to the normalized tensor.
scale: A scale Tensor, often denoted \(\gamma\) in equations, or None. If present, the scale is applied to the normalized tensor.
variance_epsilon: A small float number to avoid dividing by 0.
name: A name for this operation (optional).
Returns
The normalized, scaled, offset tensor.
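A short sketch checking the returned tensor against the formula above; the concrete values, epsilon, and the use of tf.debugging.assert_near are assumptions for illustration only:

    import tensorflow as tf

    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])    # [batch, depth]
    mean, variance = tf.nn.moments(x, axes=[0])  # per-depth statistics
    gamma = tf.constant([1.5, 0.5])              # scale
    beta = tf.constant([0.1, -0.1])              # offset
    eps = 1e-3

    y = tf.nn.batch_normalization(x, mean, variance, beta, gamma, eps)

    # Same computation written out: gamma * (x - mean) / sqrt(variance + eps) + beta.
    expected = gamma * (x - mean) * tf.math.rsqrt(variance + eps) + beta
    tf.debugging.assert_near(y, expected)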
References
Batch Normalization - Accelerating Deep Network Training by Reducing Internal Covariate Shift: Ioffe et al., 2015 (pdf)