torch.autograd.functional.vjp — PyTorch 2.7 documentation

torch.autograd.functional.vjp(func, inputs, v=None, create_graph=False, strict=False)[source][source]

Compute the dot product between a vector v and the Jacobian of the given function at the point given by the inputs.
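Concretely, the returned vjp is vᵀJ (equivalently, Jᵀv reshaped like the inputs), which can be verified against a plain backward pass. A minimal sketch, using a hypothetical function `f` for illustration:

```python
import torch
from torch.autograd.functional import vjp

def f(x):
    # maps a (3, 2) input to a (3,) output
    return (x ** 2).sum(dim=1)

x = torch.randn(3, 2)
v = torch.ones(3)

# vjp returns (func_output, v^T @ J), with the product shaped like the input
out, grad = vjp(f, x, v)

# Same quantity via an ordinary backward pass through f
x2 = x.clone().requires_grad_(True)
(f(x2) * v).sum().backward()
assert torch.allclose(grad, x2.grad)
```

Since the Jacobian of `f` here is diagonal per row with entries `2 * x`, the vector-Jacobian product with `v = ones` is simply `2 * x`.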

Parameters

func (function) – a Python function that takes Tensor inputs and returns a tuple of Tensors or a Tensor.

inputs (tuple of Tensors or Tensor) – inputs to the function func.

v (tuple of Tensors or Tensor) – the vector for which the vector-Jacobian product is computed. Must be the same size as the output of func. This argument is optional when the output of func contains a single element and (if it is not provided) will be set as a Tensor containing a single 1.

create_graph (bool, optional) – if True, both the output and result will be computed in a differentiable way. Note that when strict is False, the result can not require gradients or be disconnected from the inputs. Defaults to False.

strict (bool, optional) – if True, an error will be raised when we detect that there exists an input such that all the outputs are independent of it. If False, we return a Tensor of zeros as the vjp for said inputs, which is the expected mathematical value. Defaults to False.

Returns

tuple with:

func_output (tuple of Tensors or Tensor): output of func(inputs)

vjp (tuple of Tensors or Tensor): result of the dot product with the same shape as the inputs.

Return type

output (tuple)

Example

>>> def exp_reducer(x):
...     return x.exp().sum(dim=1)
>>> inputs = torch.rand(4, 4)
>>> v = torch.ones(4)
>>> vjp(exp_reducer, inputs, v)
(tensor([5.7817, 7.2458, 5.7830, 6.7782]),
 tensor([[1.4458, 1.3962, 1.3042, 1.6354],
         [2.1288, 1.0652, 1.5483, 2.5035],
         [2.2046, 1.1292, 1.1432, 1.3059],
         [1.3225, 1.6652, 1.7753, 2.0152]]))

>>> vjp(exp_reducer, inputs, v, create_graph=True)
(tensor([5.7817, 7.2458, 5.7830, 6.7782], grad_fn=<SumBackward1>),
 tensor([[1.4458, 1.3962, 1.3042, 1.6354],
         [2.1288, 1.0652, 1.5483, 2.5035],
         [2.2046, 1.1292, 1.1432, 1.3059],
         [1.3225, 1.6652, 1.7753, 2.0152]], grad_fn=<MulBackward0>))

>>> def adder(x, y):
...     return 2 * x + 3 * y
>>> inputs = (torch.rand(2), torch.rand(2))
>>> v = torch.ones(2)
>>> vjp(adder, inputs, v)
(tensor([2.4225, 2.3340]),
 (tensor([2., 2.]), tensor([3., 3.])))
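When func returns a single-element tensor, v may be omitted entirely: it defaults to a tensor containing a single 1, and the vjp reduces to the ordinary gradient. A minimal sketch, with a hypothetical `scalar_func` for illustration:

```python
import torch
from torch.autograd.functional import vjp

def scalar_func(x):
    # single-element output, so v can be omitted
    return x.pow(2).sum()

x = torch.rand(3)
out, grad = vjp(scalar_func, x)
# grad is the ordinary gradient of the scalar output, i.e. 2 * x
```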