initialize_dist


composer.utils.dist.initialize_dist(device, timeout=300.0)[source]#

Initialize the default PyTorch distributed process group.

This function assumes that the standard PyTorch distributed environment variables are set.

If none of these environment variables are set, this function will assume a single-rank configuration and initialize the default process group using a torch.distributed.HashStore store.
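The single-rank fallback described above can be sketched as follows. The exact environment-variable names Composer inspects are an assumption here, modeled on the standard `torch.distributed` variables (`RANK`, `WORLD_SIZE`, `MASTER_ADDR`, `MASTER_PORT`):

```python
import os

# Assumed names, based on torch.distributed's env:// initialization;
# the real implementation may check a different or larger set.
DIST_ENV_VARS = ("RANK", "WORLD_SIZE", "MASTER_ADDR", "MASTER_PORT")

def is_single_rank(env=None):
    """Return True when no distributed env vars are present, i.e. the
    process should fall back to a single-rank configuration."""
    if env is None:
        env = os.environ
    return not any(var in env for var in DIST_ENV_VARS)

# In the single-rank case, initialize_dist would then create the default
# process group backed by an in-memory store (torch.distributed.HashStore)
# instead of reading rank/world-size information from the environment.
```

This mirrors the documented behavior only at a high level: when a launcher (e.g. `torchrun`) has exported the distributed variables, they are used to join the process group; otherwise the function degrades gracefully to a world size of 1.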

Parameters