XLACheckpointIO — PyTorch Lightning 2.5.1.post0 documentation

class lightning.pytorch.plugins.io.XLACheckpointIO(*args, **kwargs)

Bases: TorchCheckpointIO

CheckpointIO that utilizes xm.save to save checkpoints for TPU training strategies.

save_checkpoint(checkpoint, path, storage_options=None)

Save model/training states as a checkpoint file through state-dump and file-write.

Parameters:

checkpoint – dict containing the model and trainer state to dump.

path – write-target path for the checkpoint file.

storage_options – not supported by this plugin; must be None.
Raises:

TypeError – If the storage_options argument is passed in, since this plugin does not support it.

Return type:

None
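
The storage_options guard described above can be sketched as a standalone stub. This is a hypothetical mimic for illustration only; the real XLACheckpointIO delegates the actual write to xm.save, which requires torch_xla and a TPU runtime:

```python
from typing import Any, Dict, Optional


def save_checkpoint(
    checkpoint: Dict[str, Any],
    path: str,
    storage_options: Optional[Any] = None,
) -> None:
    """Hypothetical stand-in for XLACheckpointIO.save_checkpoint."""
    # Mirror the documented behavior: storage_options is not supported,
    # so passing anything other than None raises a TypeError.
    if storage_options is not None:
        raise TypeError(
            "'storage_options' is not supported for this checkpoint plugin."
        )
    # On a real TPU setup, XLACheckpointIO would call xm.save(checkpoint, path)
    # here, which moves XLA tensors to the host before writing to disk.
```

In normal use you would not call this method directly; you would pass the plugin to the Trainer (e.g. `Trainer(plugins=[XLACheckpointIO()])`) and let Lightning invoke it during checkpointing.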