torch.optim.Optimizer.register_state_dict_pre_hook — PyTorch 2.7 documentation
Optimizer.register_state_dict_pre_hook(hook, prepend=False)
Register a state dict pre-hook which will be called before state_dict() is called.
It should have the following signature:
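```
hook(optimizer) -> None
```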
The `optimizer` argument is the optimizer instance being used. The hook will be called with argument `self` before calling `state_dict` on `self`. The registered hook can be used to perform pre-processing before the `state_dict` call is made.
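For example, a minimal sketch of registering a pre-hook (the model, optimizer, and hook name here are illustrative, not part of the API):

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def log_before_state_dict(optimizer):
    # Receives the optimizer instance and runs before state_dict() is built;
    # any pre-processing of optimizer.state or optimizer.param_groups goes here.
    print(f"about to call state_dict() on {type(optimizer).__name__}")

handle = optimizer.register_state_dict_pre_hook(log_before_state_dict)
sd = optimizer.state_dict()  # the hook prints first, then the dict is returned
```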
Parameters
- hook (Callable) – The user-defined hook to be registered.
- prepend (bool) – If True, the provided pre-hook will be fired before all the already registered pre-hooks on `state_dict`. Otherwise, the provided hook will be fired after all the already registered pre-hooks. (default: False)
Returns
a handle that can be used to remove the added hook by calling `handle.remove()`
Return type
torch.utils.hooks.RemovableHandle
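A short sketch of how `prepend` ordering and the returned handle behave (the hook names and tiny model are illustrative):

```python
import torch

optimizer = torch.optim.SGD(torch.nn.Linear(4, 2).parameters(), lr=0.1)

def hook_a(optimizer):
    print("hook_a fires second")

def hook_b(optimizer):
    print("hook_b fires first because it was registered with prepend=True")

h_a = optimizer.register_state_dict_pre_hook(hook_a)
h_b = optimizer.register_state_dict_pre_hook(hook_b, prepend=True)
optimizer.state_dict()  # prints hook_b's message, then hook_a's

# Each handle unregisters its own hook.
h_a.remove()
h_b.remove()
```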