StepLR — PyTorch 2.7 documentation
class torch.optim.lr_scheduler.StepLR(optimizer, step_size, gamma=0.1, last_epoch=-1)
Decays the learning rate of each parameter group by gamma every step_size epochs.
Notice that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler. When last_epoch=-1, the initial learning rate is taken from the optimizer's lr.
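When nothing else modifies the optimizer's learning rate, the schedule has a simple closed form. The helper below is a hypothetical sketch (not part of torch) that computes the value StepLR would produce for a given 0-indexed epoch:

def steplr_value(initial_lr, gamma, step_size, epoch):
    # Decay by gamma once per completed step_size epochs.
    return initial_lr * gamma ** (epoch // step_size)

For the example below, steplr_value(0.05, 0.1, 30, 45) returns 0.005 (up to float rounding).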
Parameters
- optimizer (Optimizer) – Wrapped optimizer.
- step_size (int) – Period of learning rate decay.
- gamma (float) – Multiplicative factor of learning rate decay. Default: 0.1.
- last_epoch (int) – The index of last epoch. Default: -1.
Example
# Assuming optimizer uses lr = 0.05 for all groups
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 60
# lr = 0.0005   if 60 <= epoch < 90
# ...
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
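A self-contained version of this example, as a minimal sketch (the dummy parameter and the printed epochs are illustrative choices, not part of the original example):

import torch
from torch.optim.lr_scheduler import StepLR

param = torch.nn.Parameter(torch.zeros(1))  # dummy parameter for the optimizer
optimizer = torch.optim.SGD([param], lr=0.05)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    if epoch in (0, 29, 30, 60):
        # lr in effect during this epoch (up to float rounding):
        print(epoch, scheduler.get_last_lr())  # [0.05], [0.05], [0.005], [0.0005]
    optimizer.step()   # in real training: zero_grad(), backward(), then step()
    scheduler.step()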
get_last_lr()
Return the last learning rate computed by the current scheduler.
Return type
list[float]
get_lr()
Compute the learning rate of each parameter group.
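get_last_lr() returns one value per parameter group. A minimal sketch (the two parameters, the per-group lr override, and the short schedule are made up for illustration):

import torch
from torch.optim.lr_scheduler import StepLR

w1 = torch.nn.Parameter(torch.zeros(1))
w2 = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD(
    [{"params": [w1]},               # uses the default lr below
     {"params": [w2], "lr": 0.01}],  # per-group override
    lr=0.05,
)
scheduler = StepLR(optimizer, step_size=2, gamma=0.1)

for epoch in range(4):
    optimizer.step()
    scheduler.step()

print(scheduler.get_last_lr())  # one entry per group: ~[0.0005, 0.0001]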
load_state_dict(state_dict)
Load the scheduler’s state.
Parameters
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
state_dict()
Return the state of the scheduler as a dict.
It contains an entry for every variable in self.__dict__ which is not the optimizer.
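A minimal checkpointing sketch; the file name checkpoint.pt is a hypothetical choice. state_dict() captures the schedule, and load_state_dict() restores it so training resumes at the right learning rate:

import torch
from torch.optim.lr_scheduler import StepLR

param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.05)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

# Save both states together.
torch.save(
    {"optimizer": optimizer.state_dict(),
     "scheduler": scheduler.state_dict()},
    "checkpoint.pt",
)

# Later: rebuild the optimizer and scheduler the same way, then restore.
checkpoint = torch.load("checkpoint.pt")
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])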
step(epoch=None)
Perform a step.
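Since PyTorch 1.1.0, scheduler.step() should be called after optimizer.step(); the reverse order shifts the schedule by one epoch. A runnable sketch of the expected ordering (the model, toy batches, and loss are illustrative choices):

import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
batches = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(5)]  # toy data

for epoch in range(100):
    for inputs, targets in batches:
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
        loss.backward()
        optimizer.step()   # update the weights first...
    scheduler.step()       # ...then advance the schedule, once per epoch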