torch.nn.utils.clip_grad_norm — PyTorch 2.7 documentation


torch.nn.utils.clip_grad_norm(parameters, max_norm, norm_type=2.0, error_if_nonfinite=False, foreach=None)[source]

Clip the gradient norm of an iterable of parameters.

Warning

This method is now deprecated in favor of torch.nn.utils.clip_grad_norm_().

Return type

Tensor
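
Since this function is deprecated, new code should call torch.nn.utils.clip_grad_norm_() instead. A minimal training-step sketch using the non-deprecated function is shown below; the model, optimizer, data, and max_norm value are illustrative assumptions, not part of this API's documentation:

```python
import torch
import torch.nn as nn

# Illustrative setup: any module with trainable parameters works.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
inputs = torch.randn(32, 10)
targets = torch.randn(32, 1)

loss = nn.functional.mse_loss(model(inputs), targets)
loss.backward()

# Clip gradients in place so their total 2-norm is at most max_norm.
# The returned Tensor is the total norm of the gradients before clipping.
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
optimizer.zero_grad()
```

Clipping is applied after backward() and before optimizer.step(), so the optimizer sees the rescaled gradients.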
