torch.nn.utils.clip_grads_with_norm_ — PyTorch 2.7 documentation

torch.nn.utils.clip_grads_with_norm_(parameters, max_norm, total_norm, foreach=None)[source]

Scale the gradients of an iterable of parameters given a pre-calculated total norm and desired max norm.

The gradients will be scaled by the following calculation:

grad = grad \times \frac{max\_norm}{total\_norm + 1e\text{-}6}

Gradients are modified in-place.

This function is equivalent to torch.nn.utils.clip_grad_norm_() with a pre-calculated total norm.
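A minimal usage sketch, assuming a small illustrative model and loss (the model, batch, and max_norm value below are assumptions for the example, not part of this API). The total norm is computed manually here as the 2-norm over all per-parameter gradient norms and then passed to clip_grads_with_norm_, which rescales the gradients in-place:

```python
import torch
from torch import nn

# Illustrative model and synthetic batch (assumptions for this sketch).
model = nn.Linear(10, 1)
loss = model(torch.randn(4, 10)).pow(2).mean()
loss.backward()

params = [p for p in model.parameters() if p.grad is not None]

# Pre-calculate the total 2-norm of all gradients.
total_norm = torch.linalg.vector_norm(
    torch.stack([torch.linalg.vector_norm(p.grad, 2) for p in params]), 2
)

# Scale gradients in-place using the pre-computed norm.
torch.nn.utils.clip_grads_with_norm_(params, max_norm=1.0, total_norm=total_norm)
```

Splitting the norm computation from the scaling step is useful when the total norm is needed elsewhere (for example, for logging), so it is computed only once.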

Parameters

parameters (Iterable[Tensor] or Tensor) – an iterable of Tensors or a single Tensor that will have gradients normalized
max_norm (float) – max norm of the gradients
total_norm (Tensor) – total norm of the gradients to use for clipping
foreach (bool) – use the faster foreach-based implementation. If None, use the foreach implementation for CUDA and CPU native tensors and silently fall back to the slow implementation for other device types. Default: None

Returns

None

Return type

None