TorchCheckpointIO — PyTorch Lightning 2.5.1.post0 documentation
class lightning.pytorch.plugins.io.TorchCheckpointIO[source]
Bases: CheckpointIO
CheckpointIO that utilizes torch.save() and torch.load() to save and load checkpoints respectively, common for most use cases.
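TorchCheckpointIO is the checkpoint plugin the Trainer uses by default, so passing it explicitly mostly matters when you want to customize or subclass checkpoint I/O. The following is a minimal sketch of wiring it into a Trainer; the model, data, and training call are omitted.

```python
from lightning.pytorch import Trainer
from lightning.pytorch.plugins.io import TorchCheckpointIO

# Pass the plugin explicitly; the Trainer already uses TorchCheckpointIO
# by default, so this is mainly a template for swapping in a subclass.
trainer = Trainer(plugins=[TorchCheckpointIO()])
```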
load_checkpoint(path, map_location=<function TorchCheckpointIO.<lambda>>)[source]
Loads checkpoint using torch.load(), with additional handling for fsspec remote loading of files.
Parameters:
- path (Union[str, Path]) – Path to checkpoint
- map_location (Optional[Callable]) – a function, torch.device, string or a dict specifying how to remap storage locations
Return type: dict[str, Any]
Returns: The loaded checkpoint.
Raises:
FileNotFoundError – If path is not found by the fsspec filesystem
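A minimal sketch of calling load_checkpoint directly, assuming "example.ckpt" is a placeholder for a checkpoint previously written with save_checkpoint() or torch.save():

```python
import torch
from lightning.pytorch.plugins.io import TorchCheckpointIO

checkpoint_io = TorchCheckpointIO()

# map_location is forwarded to torch.load(); here every storage is
# remapped to the CPU regardless of the device it was saved on.
checkpoint = checkpoint_io.load_checkpoint("example.ckpt", map_location=torch.device("cpu"))
print(checkpoint.keys())
```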
remove_checkpoint(path)[source]
Remove checkpoint file from the filesystem.
Parameters:
path (Union[str, Path]) – Path to checkpoint
Return type: None
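For completeness, a short sketch of removing a checkpoint, with "stale.ckpt" standing in for any checkpoint file that is no longer needed:

```python
from lightning.pytorch.plugins.io import TorchCheckpointIO

checkpoint_io = TorchCheckpointIO()
checkpoint_io.remove_checkpoint("stale.ckpt")  # deletes the file via the underlying filesystem
```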
save_checkpoint(checkpoint, path, storage_options=None)[source]
Save model/training states as a checkpoint file through state-dump and file-write.
Parameters:
- checkpoint (dict[str, Any]) – dict containing model and trainer state
- path (Union[str, Path]) – write-target path
- storage_options (Optional[Any]) – not used in TorchCheckpointIO.save_checkpoint
Raises:
TypeError – If storage_options arg is passed in
Return type: None
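A hedged round-trip sketch: the checkpoint dict below is a small stand-in for the state the Trainer normally assembles, and "model.ckpt" is a placeholder path.

```python
import torch
from lightning.pytorch.plugins.io import TorchCheckpointIO

checkpoint_io = TorchCheckpointIO()

# A stand-in for the dict the Trainer normally assembles; the keys shown
# here are illustrative, not the full set Lightning writes.
checkpoint = {"state_dict": {"weight": torch.zeros(2, 2)}, "epoch": 0}

checkpoint_io.save_checkpoint(checkpoint, "model.ckpt")

# Passing storage_options raises TypeError, since this plugin does not use it:
# checkpoint_io.save_checkpoint(checkpoint, "model.ckpt", storage_options={"acl": "private"})

restored = checkpoint_io.load_checkpoint("model.ckpt", map_location=torch.device("cpu"))
assert restored["epoch"] == 0
```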