LearningRateFinder — PyTorch Lightning 2.5.1.post0 documentation

class lightning.pytorch.callbacks.LearningRateFinder(min_lr=1e-08, max_lr=1, num_training_steps=100, mode='exponential', early_stop_threshold=4.0, update_attr=True, attr_name='')[source]

Bases: Callback

The LearningRateFinder callback runs a range test over candidate initial learning rates, reducing the guesswork involved in picking a good starting learning rate.
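
A minimal default-usage sketch is shown below; the DemoModel and the synthetic data are illustrative placeholders, not part of the API:

import torch
from torch.utils.data import DataLoader, TensorDataset

from lightning.pytorch import LightningModule, Trainer
from lightning.pytorch.callbacks import LearningRateFinder


class DemoModel(LightningModule):
    # Illustrative module: it exposes a `learning_rate` attribute that is
    # used in configure_optimizers and can be auto-detected by the finder.
    def __init__(self, learning_rate=1e-3):
        super().__init__()
        self.learning_rate = learning_rate
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.learning_rate)


dataset = TensorDataset(torch.randn(3200, 32), torch.randint(0, 2, (3200,)))
train_loader = DataLoader(dataset, batch_size=32)

# The callback runs the range test once when fit starts; because update_attr
# defaults to True, the suggested value is written back to model.learning_rate
# before regular training continues.
model = DemoModel()
trainer = Trainer(callbacks=[LearningRateFinder(min_lr=1e-6, max_lr=1e-1)], max_epochs=2)
trainer.fit(model, train_loader)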

Parameters:

min_lr (float) – minimum learning rate to investigate
max_lr (float) – maximum learning rate to investigate
num_training_steps (int) – number of learning rates to test
mode (str) – search strategy for updating the learning rate after each batch: 'exponential' increases the learning rate exponentially, 'linear' increases it linearly
early_stop_threshold (float) – threshold for stopping the search. If the loss at any point is larger than early_stop_threshold * best_loss, the search is stopped. To disable, set to None
update_attr (bool) – whether to update the learning rate attribute with the suggested value
attr_name (str) – name of the attribute which stores the learning rate. The names 'learning_rate' or 'lr' are detected automatically; otherwise, set the name here

Example:

The following customizes the LearningRateFinder callback to re-run the search at specific epochs, which is useful when fine-tuning models.

from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import LearningRateFinder


class FineTuneLearningRateFinder(LearningRateFinder):
    def __init__(self, milestones, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.milestones = milestones

    def on_fit_start(self, *args, **kwargs):
        # Skip the automatic search at the start of fit;
        # the search is triggered per epoch below instead.
        return

    def on_train_epoch_start(self, trainer, pl_module):
        if trainer.current_epoch in self.milestones or trainer.current_epoch == 0:
            self.lr_find(trainer, pl_module)


trainer = Trainer(callbacks=[FineTuneLearningRateFinder(milestones=(5, 10))])
trainer.fit(...)
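
Overriding on_fit_start as a no-op disables the single search the callback would otherwise run when fit begins, so in this example the range test runs only at epoch 0 and at the listed milestone epochs.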

Raises:

MisconfigurationException – If learning rate/lr in model or model.hparams isn’t overridden, or if you are using more than one optimizer.
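
A sketch of satisfying this attribute lookup when the learning rate is stored under a non-standard name; the CustomAttrModel below is illustrative, not part of the library:

import torch

from lightning.pytorch import LightningModule, Trainer
from lightning.pytorch.callbacks import LearningRateFinder


class CustomAttrModel(LightningModule):
    def __init__(self, my_lr=1e-3):
        super().__init__()
        # The learning rate lives under `my_lr`, which is not auto-detected.
        self.my_lr = my_lr
        self.layer = torch.nn.Linear(32, 2)

    def configure_optimizers(self):
        # A single optimizer built from the custom attribute; returning more
        # than one optimizer raises MisconfigurationException.
        return torch.optim.SGD(self.parameters(), lr=self.my_lr)


# Point the callback at the custom attribute so the suggested value
# can be written back when update_attr=True.
trainer = Trainer(callbacks=[LearningRateFinder(update_attr=True, attr_name="my_lr")])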

on_fit_start(trainer, pl_module)[source]

Called when fit begins.

Return type:

None