mmengine.dist.broadcast — mmengine 0.10.7 documentation

mmengine.dist.broadcast(data, src=0, group=None)

Broadcast the data from the `src` process to the whole group.

`data` must have the same number of elements in all processes participating in the collective.

Note

Calling `broadcast` in a non-distributed environment does nothing.

Parameters:

* **data** (Tensor) – Data to be sent if `src` is the rank of the current process, and data used to save the received data otherwise.
* **src** (int) – Source rank. Defaults to 0.
* **group** (ProcessGroup, optional) – The process group to work on. If None, the default process group will be used. Defaults to None.

Return type:

None

Examples

```python
>>> import torch
>>> import mmengine.dist as dist

>>> # non-distributed environment
>>> data = torch.arange(2, dtype=torch.int64)
>>> data
tensor([0, 1])
>>> dist.broadcast(data)
>>> data
tensor([0, 1])

>>> # distributed environment
>>> # We have one process group with 2 ranks.
>>> data = torch.arange(2, dtype=torch.int64) + 1 + 2 * rank
>>> data
tensor([1, 2])  # Rank 0
tensor([3, 4])  # Rank 1
>>> dist.broadcast(data)
>>> data
tensor([1, 2])  # Rank 0
tensor([1, 2])  # Rank 1
```
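The distributed doctest above cannot be executed as-is, since `rank` only exists inside an initialized process group. Below is a minimal, self-contained sketch (not part of the mmengine docs) of one way to launch it: two CPU processes spawned with `torch.multiprocessing.spawn` using the gloo backend. The address and port are arbitrary local values, and `mmengine.dist` picks up the default process group created by `torch.distributed.init_process_group`.

```python
import os

import torch
import torch.distributed as torch_dist
import torch.multiprocessing as mp

import mmengine.dist as dist


def worker(rank: int, world_size: int) -> None:
    # Arbitrary local rendezvous settings (assumes port 29500 is free).
    os.environ['MASTER_ADDR'] = '127.0.0.1'
    os.environ['MASTER_PORT'] = '29500'
    torch_dist.init_process_group('gloo', rank=rank, world_size=world_size)

    # Rank 0 holds [1, 2], rank 1 holds [3, 4].
    data = torch.arange(2, dtype=torch.int64) + 1 + 2 * rank
    dist.broadcast(data)  # in-place; src defaults to rank 0

    # Both ranks print [1, 2]: rank 1's [3, 4] was overwritten.
    print(f'rank {rank}: {data.tolist()}')

    torch_dist.destroy_process_group()


if __name__ == '__main__':
    mp.spawn(worker, args=(2,), nprocs=2)
```

Running the script prints `rank 0: [1, 2]` and `rank 1: [1, 2]` (in either order), matching the doctest output.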