PolynomialLR — PyTorch 2.7 documentation
class torch.optim.lr_scheduler.PolynomialLR(optimizer, total_iters=5, power=1.0, last_epoch=-1)[source]
Decays the learning rate of each parameter group using a polynomial function over the given total_iters.
When last_epoch=-1, the schedule starts from the lr set in the optimizer.
Parameters
- optimizer (Optimizer) – Wrapped optimizer.
- total_iters (int) – The number of steps over which the scheduler decays the learning rate. Default: 5.
- power (float) – The power of the polynomial. Default: 1.0.
- last_epoch (int) – The index of the last epoch. Default: -1.
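The resulting schedule has a simple closed form, sketched below for illustration (`polynomial_lr` is a hypothetical helper, not part of the library, and `base_lr` stands for a parameter group's initial lr; the scheduler itself applies the decay step by step rather than via this function):

```python
def polynomial_lr(base_lr, step, total_iters, power):
    # Clamp so the rate stays at its final value once total_iters
    # is reached (0.0 for any positive power).
    t = min(step, total_iters)
    return base_lr * (1 - t / total_iters) ** power
```

With total_iters=4 and power=1.0 this reproduces the linear ramp shown in the example below.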
Example
```python
# Assuming optimizer uses lr = 0.001 for all groups
# lr = 0.001     if epoch == 0
# lr = 0.00075   if epoch == 1
# lr = 0.00050   if epoch == 2
# lr = 0.00025   if epoch == 3
# lr = 0.0       if epoch >= 4
scheduler = PolynomialLR(optimizer, total_iters=4, power=1.0)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
```
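For power values other than 1.0 the decay curve is nonlinear. A minimal runnable sketch (the single dummy parameter and SGD optimizer are only there to give the scheduler something to drive):

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import PolynomialLR

# Dummy parameter so the optimizer has something to manage.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = SGD([param], lr=0.001)
scheduler = PolynomialLR(optimizer, total_iters=4, power=2.0)

for epoch in range(6):
    print(epoch, scheduler.get_last_lr())
    optimizer.step()      # step the optimizer before the scheduler
    scheduler.step()
```

This prints approximately 0.001, 0.0005625, 0.00025, 0.0000625, and then 0.0 for every epoch from 4 on: the ramp fractions of the example above, squared.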
get_last_lr()[source]
Return last computed learning rate by current scheduler.
Return type
list[float]
get_lr()[source]
Compute the learning rate.
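In training loops, get_last_lr() is the usual way to read the current rate; get_lr() is invoked internally by step(). A brief sketch, continuing from the example above:

```python
scheduler.step()
for group_lr in scheduler.get_last_lr():  # one float per parameter group
    print(f"lr is now {group_lr:.6f}")
```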
load_state_dict(state_dict)[source]
Load the scheduler’s state.
Parameters
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
state_dict()[source]
Return the state of the scheduler as a dict.
It contains an entry for every variable in self.__dict__ which is not the optimizer.
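These two methods let a scheduler be checkpointed and restored alongside its optimizer. A minimal sketch, reusing the optimizer and scheduler from the example above (the file name is illustrative):

```python
import torch

# Save scheduler state alongside the optimizer it drives.
torch.save(
    {
        "optimizer": optimizer.state_dict(),
        "scheduler": scheduler.state_dict(),
    },
    "checkpoint.pt",  # illustrative path
)

# Later: rebuild the optimizer and scheduler, then restore their states.
checkpoint = torch.load("checkpoint.pt")
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])
```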
step(epoch=None)[source]
Perform a step.