LearningRateMonitor — PyTorch Lightning 2.5.1.post0 documentation

class lightning.pytorch.callbacks.LearningRateMonitor(logging_interval=None, log_momentum=False, log_weight_decay=False)[source]

Bases: Callback

Automatically monitors and logs the learning rate of learning rate schedulers during training.

Parameters:

logging_interval – set to "epoch" or "step" to log the learning rate of all optimizers at the same interval; set to None to log at individual intervals according to the interval key of each scheduler. Defaults to None.

log_momentum – if True, also log the momentum values of the optimizer, when the optimizer has a momentum or betas attribute. Defaults to False.

log_weight_decay – if True, also log the weight decay values of the optimizer. Defaults to False.

Raises:

MisconfigurationException – If logging_interval is none of "step", "epoch", or None.
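
To illustrate the failure mode, a minimal sketch (this assumes the value is validated at construction time):

    from lightning.pytorch.callbacks import LearningRateMonitor
    from lightning.pytorch.utilities.exceptions import MisconfigurationException

    try:
        LearningRateMonitor(logging_interval="batch")  # not "step", "epoch", or None
    except MisconfigurationException as err:
        print(err)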

Example:

    from lightning.pytorch import Trainer
    from lightning.pytorch.callbacks import LearningRateMonitor

    lr_monitor = LearningRateMonitor(logging_interval="step")
    trainer = Trainer(callbacks=[lr_monitor])
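
For context, a minimal end-to-end sketch is shown below; the model, data, and hyperparameters are illustrative, not part of this API. With logging_interval="step" and a scheduler whose interval is "step", the monitor records the learning rate every batch; with the default Trainer logger the value typically appears under the key lr-Adam:

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    import lightning.pytorch as pl
    from lightning.pytorch.callbacks import LearningRateMonitor

    class TinyModel(pl.LightningModule):  # hypothetical model, for illustration only
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(4, 1)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return torch.nn.functional.mse_loss(self.layer(x), y)

        def configure_optimizers(self):
            optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
            scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
            # "interval": "step" advances the scheduler every batch, matching
            # the monitor's logging_interval below
            return {
                "optimizer": optimizer,
                "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
            }

    dataset = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))
    trainer = pl.Trainer(
        max_epochs=2,
        log_every_n_steps=1,
        callbacks=[LearningRateMonitor(logging_interval="step")],
    )
    trainer.fit(TinyModel(), DataLoader(dataset, batch_size=8))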

Logging names are automatically determined based on the optimizer class name. In the case of multiple optimizers of the same type, they will be named Adam, Adam-1, etc. If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2, etc. To control naming, pass a name keyword in the construction of the learning rate scheduler. A name keyword can also be used for parameter groups in the construction of the optimizer.

Example:

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(...)
        lr_scheduler = {
            "scheduler": torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
            "name": "my_logging_name",
        }
        return [optimizer], [lr_scheduler]

Example:

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(
            [{"params": [p for p in self.parameters()], "name": "my_parameter_group_name"}],
            lr=0.1,
        )
        lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, ...)
        return [optimizer], [lr_scheduler]
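
As a further illustration of the automatic naming scheme, consider a module with two optimizers of the same class (the generator and discriminator attributes below are hypothetical). Without explicit name keys, the monitor disambiguates them by suffix, so the logged keys would typically be lr-Adam and lr-Adam-1:

    def configure_optimizers(self):
        # Two optimizers of the same class; note that in Lightning 2.x, multiple
        # optimizers require manual optimization (self.automatic_optimization = False)
        gen_opt = torch.optim.Adam(self.generator.parameters(), lr=2e-4)       # named Adam
        disc_opt = torch.optim.Adam(self.discriminator.parameters(), lr=1e-4)  # named Adam-1
        return gen_opt, disc_opt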

on_train_batch_start(trainer, *args, **kwargs)[source]

Called when the train batch begins.

Return type:

None

on_train_epoch_start(trainer, *args, **kwargs)[source]

Called when the train epoch begins.

Return type:

None

on_train_start(trainer, *args, **kwargs)[source]

Called before training. Determines unique names for all lr schedulers in the case of multiple schedulers of the same type, or in the case of multiple parameter groups.

Raises:

MisconfigurationException – If Trainer has no logger.

Return type:

None
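
To illustrate the raise described above, a minimal sketch (assuming fit() is what triggers on_train_start; model definition omitted):

    from lightning.pytorch import Trainer
    from lightning.pytorch.callbacks import LearningRateMonitor

    # With logging disabled the callback has nowhere to record learning rates,
    # so training is expected to fail at start-up rather than silently no-op.
    trainer = Trainer(logger=False, callbacks=[LearningRateMonitor()])
    # trainer.fit(model)  # raises MisconfigurationException when training starts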