Track and Visualize Experiments — PyTorch Lightning 2.5.1.post0 documentation

MLflow Logger

The MLflow logger in PyTorch Lightning now includes a checkpoint_path_prefix parameter. It lets you prefix the artifact path under which checkpoints are logged to MLflow.

Example usage:

    import lightning as L
    from lightning.pytorch.loggers import MLFlowLogger

    mlf_logger = MLFlowLogger(
        experiment_name="lightning_logs",
        tracking_uri="file:./ml-runs",
        checkpoint_path_prefix="my_prefix",
    )
    trainer = L.Trainer(logger=mlf_logger)
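To illustrate what the prefix means in practice, the sketch below joins a prefix with a checkpoint file name the way artifact paths are composed in MLflow (POSIX-style separators). The artifact_path helper and the exact resulting layout are assumptions for illustration, not Lightning's actual implementation.

```python
from pathlib import PurePosixPath

def artifact_path(prefix: str, filename: str) -> str:
    # Hypothetical helper: MLflow artifact paths use POSIX-style
    # separators, so the prefix simply becomes a leading path segment.
    return str(PurePosixPath(prefix) / filename)

print(artifact_path("my_prefix", "epoch=2-step=500.ckpt"))
# → my_prefix/epoch=2-step=500.ckpt
```

With the configuration above, checkpoint artifacts would appear under the "my_prefix" directory in the run's artifact store instead of at the top level.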

Your LightningModule definition

    class LitModel(L.LightningModule):
        def training_step(self, batch, batch_idx):
            # example
            self.logger.experiment.whatever_ml_flow_supports(...)

        def any_lightning_module_function_or_hook(self):
            self.logger.experiment.whatever_ml_flow_supports(...)
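Here self.logger.experiment exposes the underlying MLflow client (an mlflow.client.MlflowClient), whose methods take a run id explicitly, e.g. log_param(run_id, key, value); the logger's run id is available as self.logger.run_id. The sketch below uses hypothetical stand-in classes so it runs without MLflow installed; the stand-ins are assumptions for illustration only, while the call shape mirrors the real client API.

```python
class FakeMlflowClient:
    # Hypothetical stand-in for mlflow.client.MlflowClient,
    # used only so this sketch runs without MLflow installed.
    def __init__(self):
        self.params = {}

    def log_param(self, run_id, key, value):
        # Real signature: MlflowClient.log_param(run_id, key, value)
        self.params[(run_id, key)] = value


class FakeLogger:
    # Stand-in for MLFlowLogger: exposes .experiment and .run_id.
    def __init__(self):
        self.experiment = FakeMlflowClient()
        self.run_id = "run-123"


logger = FakeLogger()
# Inside a LightningModule hook this line would read:
#   self.logger.experiment.log_param(self.logger.run_id, "lr", 3e-4)
logger.experiment.log_param(logger.run_id, "lr", 3e-4)
```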

Train your model

    model = LitModel()
    trainer.fit(model)