CosineAnnealingLR — PyTorch 2.7 documentation
class torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max, eta_min=0.0, last_epoch=-1)
Set the learning rate of each parameter group using a cosine annealing schedule.
Here η_max is set to the initial lr and T_cur is the number of epochs since the last restart in SGDR:

\begin{aligned}
\eta_t & = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right), & T_{cur} \neq (2k+1)T_{max}; \\
\eta_{t+1} & = \eta_t + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 - \cos\left(\frac{1}{T_{max}}\pi\right)\right), & T_{cur} = (2k+1)T_{max}.
\end{aligned}
When last_epoch=-1, the optimizer's initial lr is used as the starting learning rate. Notice that, because the schedule is defined recursively, the learning rate can be simultaneously modified outside this scheduler by other operators. If the learning rate is set solely by this scheduler, the learning rate at each step becomes:
\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right)
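As a sanity check, a minimal sketch (the single parameter and the hyperparameter values below are illustrative) comparing the scheduler's computed learning rate against this closed-form expression:

```python
import math

import torch

# Compare the recursively computed learning rate with the closed-form
# expression above. Parameter and hyperparameters are placeholders.
param = torch.nn.Parameter(torch.zeros(1))
eta_max, eta_min, T_max = 0.1, 0.001, 10
optimizer = torch.optim.SGD([param], lr=eta_max)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=T_max, eta_min=eta_min
)

for t in range(T_max + 1):
    closed_form = eta_min + 0.5 * (eta_max - eta_min) * (
        1 + math.cos(math.pi * t / T_max)
    )
    assert math.isclose(scheduler.get_last_lr()[0], closed_form, rel_tol=1e-6)
    optimizer.step()
    scheduler.step()
```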
It has been proposed in SGDR: Stochastic Gradient Descent with Warm Restarts. Note that this only implements the cosine annealing part of SGDR, and not the restarts.
Parameters
- optimizer (Optimizer) – Wrapped optimizer.
- T_max (int) – Maximum number of iterations.
- eta_min (float) – Minimum learning rate. Default: 0.
- last_epoch (int) – The index of last epoch. Default: -1.
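A minimal usage sketch (the model, optimizer settings, and training loop below are illustrative placeholders):

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative model and optimizer; replace with your own.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = CosineAnnealingLR(optimizer, T_max=100, eta_min=1e-4)

for epoch in range(100):
    # ... run the forward/backward passes for this epoch ...
    optimizer.step()   # update parameters first
    scheduler.step()   # then anneal the learning rate
```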
get_last_lr()
Return the last learning rate computed by the current scheduler.
Return type
list[float]
get_lr()
Retrieve the learning rate of each parameter group.
load_state_dict(state_dict)
Load the scheduler’s state.
Parameters
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
state_dict()
Return the state of the scheduler as a dict.
It contains an entry for every variable in self.__dict__ which is not the optimizer.
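A minimal checkpointing sketch using state_dict() and load_state_dict() (the file path and hyperparameters are illustrative):

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative checkpoint round-trip; replace model and path with your own.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = CosineAnnealingLR(optimizer, T_max=100)

torch.save(
    {"optimizer": optimizer.state_dict(), "scheduler": scheduler.state_dict()},
    "checkpoint.pt",
)

# Later, after recreating the optimizer and scheduler with the same arguments:
checkpoint = torch.load("checkpoint.pt")
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])
```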
step(epoch=None)
Perform a step.