torch.optim.Optimizer.register_state_dict_pre_hook — PyTorch 2.7 documentation

Optimizer.register_state_dict_pre_hook(hook, prepend=False)[source][source]

Register a state dict pre-hook which will be called before state_dict() is called.

It should have the following signature:

hook(optimizer) -> None

The optimizer argument is the optimizer instance being used. The hook will be called with argument self before state_dict is called on self. The registered hook can be used to perform pre-processing before the state_dict call is made.

Parameters

hook (Callable) – the user-defined hook to be registered.
prepend (bool) – if True, the provided pre-hook is fired before all previously registered state_dict pre-hooks; otherwise it is fired after them. Default: False.

Returns

a handle that can be used to remove the added hook by calling handle.remove()

Return type

torch.utils.hooks.RemovableHandle