mmengine.dist.broadcast — mmengine 0.10.7 documentation
mmengine.dist.broadcast(data, src=0, group=None)
Broadcast the data from the src process to the whole group. data must have the same number of elements in all processes participating in the collective.
Note

Calling broadcast in a non-distributed environment does nothing.
Parameters:
- data (Tensor) – Data to be sent if src is the rank of the current process, and data to be used to save received data otherwise.
- src (int) – Source rank. Defaults to 0.
- group (ProcessGroup, optional) – The process group to work on. If None, the default process group will be used. Defaults to None.
Return type:
None
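The group argument restricts the collective to a subset of ranks. A minimal sketch, assuming a distributed environment that is already initialized with at least two ranks; the sub-group built with torch.distributed.new_group is purely illustrative:

```python
import torch
import torch.distributed as torch_dist

import mmengine.dist as dist

# Assumption: the default process group is already initialized,
# e.g. via torch_dist.init_process_group, with at least 2 ranks.
subgroup = torch_dist.new_group(ranks=[0, 1])  # must be called on every rank

# Each rank starts with a tensor filled with its own rank id.
data = torch.full((2,), dist.get_rank(), dtype=torch.int64)
# Broadcast rank 0's tensor among the ranks in `subgroup` only;
# afterwards every participating rank holds tensor([0, 0]).
dist.broadcast(data, src=0, group=subgroup)
```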
Examples
```python
>>> import torch
>>> import mmengine.dist as dist

>>> # non-distributed environment
>>> data = torch.arange(2, dtype=torch.int64)
>>> data
tensor([0, 1])
>>> dist.broadcast(data)
>>> data
tensor([0, 1])

>>> # distributed environment
>>> # We have 2 process groups, 2 ranks.
>>> data = torch.arange(2, dtype=torch.int64) + 1 + 2 * rank
>>> data
tensor([1, 2])  # Rank 0
tensor([3, 4])  # Rank 1
>>> dist.broadcast(data)
>>> data
tensor([1, 2])  # Rank 0
tensor([1, 2])  # Rank 1
```
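The distributed example above assumes the process group has already been set up. A runnable end-to-end sketch of a two-rank launch follows; the gloo backend, loopback address, and port 29500 are illustrative assumptions, not requirements of broadcast:

```python
import os

import torch
import torch.distributed as torch_dist
import torch.multiprocessing as mp

import mmengine.dist as dist


def worker(rank, world_size):
    # Initialize the default process group so broadcast is collective.
    os.environ['MASTER_ADDR'] = '127.0.0.1'  # assumption: local run
    os.environ['MASTER_PORT'] = '29500'      # assumption: any free port
    torch_dist.init_process_group(
        backend='gloo', rank=rank, world_size=world_size)

    # Rank 0 holds tensor([1, 2]); rank 1 holds tensor([3, 4]).
    data = torch.arange(2, dtype=torch.int64) + 1 + 2 * rank
    dist.broadcast(data)  # src defaults to 0
    print(f'rank {rank}: {data.tolist()}')  # both ranks print [1, 2]

    torch_dist.destroy_process_group()


if __name__ == '__main__':
    mp.spawn(worker, args=(2,), nprocs=2)
```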