XLACheckpointIO — PyTorch Lightning 2.5.1.post0 documentation
class lightning.pytorch.plugins.io.XLACheckpointIO(*args, **kwargs)[source]¶
Bases: TorchCheckpointIO
CheckpointIO that utilizes xm.save to save checkpoints for TPU training strategies.
save_checkpoint(checkpoint, path, storage_options=None)[source]¶
Save model/training states as a checkpoint file through state-dump and file-write.
Parameters:
- checkpoint (dict[str, Any]) – dict containing model and trainer state
- path (Union[str, Path]) – write-target path
- storage_options (Optional[Any]) – not used in XLACheckpointIO.save_checkpoint
Raises:
TypeError – If the storage_options argument is passed in
Return type:
None
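The contract above (write the checkpoint dict to the target path; reject any `storage_options`) can be sketched in plain Python. This is a hypothetical stand-in, not the real plugin: the actual class delegates to `xm.save` from `torch_xla`, which is omitted here so the sketch runs anywhere.

```python
from pathlib import Path
from typing import Any, Optional, Union


class SketchXLACheckpointIO:
    """Hypothetical stand-in illustrating the save_checkpoint contract.

    The real XLACheckpointIO calls xm.save(checkpoint, path) on TPU;
    here the state-dump is simulated with a plain file write.
    """

    def save_checkpoint(
        self,
        checkpoint: dict[str, Any],
        path: Union[str, Path],
        storage_options: Optional[Any] = None,
    ) -> None:
        # storage_options is not supported by this plugin, per the docs above.
        if storage_options is not None:
            raise TypeError(
                f"`storage_options` is not supported for "
                f"`{type(self).__name__}.save_checkpoint`"
            )
        # Simulated state-dump and file-write (real code: xm.save).
        Path(path).write_bytes(repr(checkpoint).encode())
```

Usage mirrors the documented behavior: a call with `storage_options` raises `TypeError`, while a plain call writes the checkpoint file to the given path.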