Description
Your current environment
0.8.0 & 0.8.1
🐛 Describe the bug
#14899 introduces an error for normal DP.
As https://github.com/vllm-project/vllm/blob/main/vllm/distributed/parallel_state.py#L904C35-L904C45 shows, "config.parallel_config.world_size" means tp_size * pp_size as defined in config.py, while "world_size" is the number of all visible devices reported by torch.distributed. As a result, any run with a regular DP size greater than 1 unexpectedly enters this branch, which prevents the DP group from being established. For example, with tp_size=2, pp_size=1 and a DP size of 2, "config.parallel_config.world_size" is 2 while torch.distributed reports 4, so the sizes can never match.
I suggest replacing "config.parallel_config.world_size" with "config.parallel_config.world_size_cross_dp", which better matches the design of external_dp:
if config.parallel_config.world_size_cross_dp != world_size:
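For reference, a minimal sketch of what the check could look like with that change. The attribute name "world_size_cross_dp" follows this report and may not match the actual attribute in vLLM, and "check_world_size" is a hypothetical helper, not the real code in parallel_state.py:

```python
# Rough sketch only: names follow the issue text. "world_size_cross_dp" is the
# attribute proposed here and may not exist under that exact name in vLLM;
# "check_world_size" is a hypothetical helper, not the real function in
# parallel_state.py.
import torch.distributed as dist


def check_world_size(config) -> None:
    # torch.distributed reports the size of the whole launch, i.e. all visible
    # devices across TP, PP and DP.
    world_size = dist.get_world_size()

    # config.parallel_config.world_size is only tp_size * pp_size, so for a
    # plain DP run (dp_size > 1) it can never equal the torch.distributed
    # world size, and the external-DP branch is taken by mistake.
    # Comparing against a DP-aware size keeps that branch for the cases it was
    # designed for and lets regular DP set up its process groups normally.
    if config.parallel_config.world_size_cross_dp != world_size:
        # external-DP handling / error path would stay here, unchanged
        raise RuntimeError(
            "world size mismatch: expected "
            f"{config.parallel_config.world_size_cross_dp}, got {world_size}"
        )
```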
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.