torch.nn.utils.clip_grad_value_ — PyTorch 2.7 documentation
torch.nn.utils.clip_grad_value_(parameters, clip_value, foreach=None)[source]
Clip the gradients of an iterable of parameters at the specified value.
Gradients are modified in-place.
Parameters
- parameters (Iterable[Tensor] or Tensor) – an iterable of Tensors or a single Tensor that will have gradients clipped
- clip_value (float) – maximum allowed value of the gradients. The gradients are clipped in the range [-clip_value, clip_value]
- foreach (bool) – use the faster foreach-based implementation. If None, use the foreach implementation for CUDA and CPU native tensors and silently fall back to the slow implementation for other device types. Default: None
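A minimal usage sketch: clip the gradients of a small model in-place after the backward pass. The linear model, input shapes, and clip_value of 0.5 are illustrative choices, not part of the API:

```python
import torch
import torch.nn as nn

# Toy model and a forward/backward pass to populate .grad fields.
model = nn.Linear(10, 1)
loss = model(torch.randn(4, 10)).sum()
loss.backward()

# Clamp every gradient element into [-0.5, 0.5] in-place.
torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=0.5)

# All gradient entries now lie within the clipping range.
for p in model.parameters():
    assert p.grad.abs().max() <= 0.5
```

Note that, unlike clip_grad_norm_, this clamps each gradient element independently, so the direction of the overall gradient vector may change.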