torch.nn.utils.clip_grad_norm_ — PyTorch 2.7 documentation

torch.nn.utils.clip_grad_norm_(parameters, max_norm, norm_type=2.0, error_if_nonfinite=False, foreach=None)[source]

Clip the gradient norm of an iterable of parameters.

The norm is computed over the norms of the individual gradients of all parameters, as if the norms of the individual gradients were concatenated into a single vector. Gradients are modified in-place.
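The following is a conceptual sketch of the "norm of norms" described above, not the library's internal implementation; the helper name `total_grad_norm` is made up for illustration.

```python
import torch

def total_grad_norm(parameters, norm_type=2.0):
    # Norm of each parameter's gradient...
    grads = [p.grad for p in parameters if p.grad is not None]
    per_param_norms = torch.stack([torch.linalg.vector_norm(g, norm_type) for g in grads])
    # ...then the norm of that vector of per-parameter norms.
    return torch.linalg.vector_norm(per_param_norms, norm_type)
```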

This function is equivalent to torch.nn.utils.get_total_norm() followed by torch.nn.utils.clip_grads_with_norm_() with the total_norm returned by get_total_norm().
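A minimal sketch of that equivalence, assuming `get_total_norm` takes an iterable of gradient tensors and `clip_grads_with_norm_` takes the parameters, the max norm, and the precomputed total norm; the wrapper name `clip_like_clip_grad_norm_` is hypothetical.

```python
import torch
from torch.nn.utils import get_total_norm, clip_grads_with_norm_

def clip_like_clip_grad_norm_(parameters, max_norm, norm_type=2.0):
    parameters = list(parameters)
    grads = [p.grad for p in parameters if p.grad is not None]
    # Compute the total norm over all gradients, then scale them in-place.
    total_norm = get_total_norm(grads, norm_type=norm_type)
    clip_grads_with_norm_(parameters, max_norm, total_norm)
    return total_norm
```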

Parameters

* **parameters** (*Iterable[Tensor] or Tensor*) – an iterable of Tensors or a single Tensor that will have gradients normalized
* **max_norm** (*float*) – max norm of the gradients
* **norm_type** (*float*) – type of the used p-norm. Can be `'inf'` for infinity norm. Default: 2.0
* **error_if_nonfinite** (*bool*) – if True, an error is thrown if the total norm of the gradients from `parameters` is `nan`, `inf`, or `-inf`. Default: False
* **foreach** (*bool*) – use the faster foreach-based implementation. If `None`, use the foreach implementation for natively supported device types and fall back to the slow per-tensor implementation otherwise. Default: `None`

Returns

Total norm of the parameter gradients (viewed as a single vector).

Return type

Tensor
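
A minimal usage sketch showing where clipping fits in a training step; the model, optimizer, and data below are placeholders.

```python
import torch
import torch.nn as nn
from torch.nn.utils import clip_grad_norm_

# Hypothetical model and data, just to show where clipping happens.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
inputs, targets = torch.randn(8, 10), torch.randn(8, 1)

optimizer.zero_grad()
loss = nn.functional.mse_loss(model(inputs), targets)
loss.backward()

# Clip gradients in-place before the optimizer step; the returned value is
# the total norm computed before clipping and is often logged for monitoring.
total_norm = clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
```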