torch.optim.Optimizer.register_load_state_dict_pre_hook — PyTorch 2.7 documentation
Optimizer.register_load_state_dict_pre_hook(hook, prepend=False)[source]
Register a load_state_dict pre-hook which will be called before load_state_dict() is called. It should have the following signature:
hook(optimizer, state_dict) -> state_dict or None
The optimizer argument is the optimizer instance being used and the state_dict argument is a shallow copy of the state_dict the user passed in to load_state_dict. The hook may modify the state_dict in place or optionally return a new one. If a state_dict is returned, it will be loaded into the optimizer in place of the original.
The hook will be called with arguments self and state_dict before calling load_state_dict on self. The registered hook can be used to perform pre-processing before the load_state_dict call is made.
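For example, a pre-hook can rewrite hyperparameters in a checkpointed state_dict before it is loaded. A minimal sketch; the model, the hook body, and the learning-rate cap are illustrative, not part of the API:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def cap_lr_hook(optimizer, state_dict):
    # state_dict is a shallow copy of what was passed to load_state_dict(),
    # so mutating its nested param_groups in place affects what gets loaded.
    for group in state_dict["param_groups"]:
        group["lr"] = min(group["lr"], 0.01)
    # Returning None means the (possibly modified) state_dict is used as-is;
    # returning a new dict would make load_state_dict() load that instead.
    return None

optimizer.register_load_state_dict_pre_hook(cap_lr_hook)

checkpoint = optimizer.state_dict()   # in practice, restored from disk
optimizer.load_state_dict(checkpoint) # hook fires first and caps the lr
print(optimizer.param_groups[0]["lr"])  # 0.01
```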
Parameters
- hook (Callable) – The user-defined hook to be registered.
- prepend (bool) – If True, the provided pre-hook will be fired before all the already registered pre-hooks on load_state_dict. Otherwise, the provided hook will be fired after all the already registered pre-hooks. (default: False)
Returns
a handle that can be used to remove the added hook by calling handle.remove()
Return type
torch.utils.hooks.RemovableHandle
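The returned handle detaches the hook, and prepend=True lets a later-registered hook fire first. A minimal sketch; the hook names and the toy optimizer are illustrative:

```python
import torch

optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(2))], lr=0.1)

def first_hook(optimizer, state_dict):
    print("first")

def second_hook(optimizer, state_dict):
    print("second")

optimizer.register_load_state_dict_pre_hook(first_hook)
# prepend=True fires this hook before the already-registered first_hook.
handle = optimizer.register_load_state_dict_pre_hook(second_hook, prepend=True)

optimizer.load_state_dict(optimizer.state_dict())  # prints "second", then "first"

handle.remove()  # detach second_hook via the returned handle
optimizer.load_state_dict(optimizer.state_dict())  # prints only "first"
```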