Conversation

@MengqingCao (Collaborator) commented on Jun 16, 2025

What this PR does / why we need it?

  1. Fix the rank set in the DP scenario. The new PoC version of torch-npu supports setting `ASCEND_RT_VISIBLE_DEVICES` dynamically, so we can use the rank set in `DPEngineCoreProc` directly instead of calculating the local rank across DP ranks by hand in the patched `_init_data_parallel` (a minimal sketch follows this list).

Closes: #1170

  2. Bump the torch-npu version to 2.5.1.post1.dev20250528.

Closes: #1242
Closes: #1232
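
For illustration, here is a minimal sketch of the rank-set change. This is not the actual vllm-ascend code: the function names, the `tp_size`/`npus_per_node` parameters, and the device-offset math are assumptions used only to show why a dynamically settable `ASCEND_RT_VISIBLE_DEVICES` lets each DP engine process reuse the local rank it already receives.

```python
# Illustrative sketch only -- not the code changed by this PR.
import os


def set_visible_npus_old(dp_rank: int, tp_size: int, npus_per_node: int) -> None:
    """Before: visible devices had to be pinned up front, so the patched
    _init_data_parallel derived a node-local device offset by hand."""
    dp_per_node = npus_per_node // tp_size
    local_dp_rank = dp_rank % dp_per_node          # hand-rolled local rank
    first = local_dp_rank * tp_size
    os.environ["ASCEND_RT_VISIBLE_DEVICES"] = ",".join(
        str(first + i) for i in range(tp_size))


def set_visible_npus_new(local_dp_rank: int, tp_size: int) -> None:
    """After: torch-npu 2.5.1.post1.dev20250528 accepts this variable being
    set dynamically, so the local rank already handed to DPEngineCoreProc
    can be used directly and the patched helper can be removed."""
    first = local_dp_rank * tp_size
    os.environ["ASCEND_RT_VISIBLE_DEVICES"] = ",".join(
        str(first + i) for i in range(tp_size))
```

On the dependency side, item 2 amounts to pinning `torch-npu==2.5.1.post1.dev20250528` in the project requirements; the exact requirements layout is not reproduced here.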

How was this patch tested?

CI passed with the newly added test.

@MengqingCao (Collaborator, Author)

This should be merged after #884

@wangxiyuan (Collaborator)

I like that this PR removes the patch code. Let's merge it ASAP.

@MengqingCao force-pushed the dpfix branch 2 times, most recently from 6f48563 to a92e5fb on June 16, 2025 08:36

This pull request has conflicts; please resolve them before we can evaluate the pull request.

@github-actions bot added the `documentation` (Improvements or additions to documentation) and `ci/build` labels on Jun 16, 2025
MengqingCao and others added 3 commits on June 16, 2025 13:05
@Yikun changed the title from "[DP][V1] Fix rank set in DP scenario" to "[DP][V1] Bump torch-npu version to 2.5.1.post1.dev20250528 and fix rank set in DP" on Jun 16, 2025
@MengqingCao changed the title from "[DP][V1] Bump torch-npu version to 2.5.1.post1.dev20250528 and fix rank set in DP" to "[DP][V1] Fix rank set in DP scenario & Bump torch-npu version to 2.5.1.post1.dev20250528" on Jun 16, 2025
@Yikun added the `ready` (read for review) label on Jun 16, 2025
@Yikun merged commit 96fa7ff into vllm-project:main on Jun 16, 2025
24 checks passed
@Yikun added the `long-term-test` (enable long term test for PR), `accuracy-test` (enable all accuracy test for PR), `pd-test` (enable pd test for PR), and `ready-for-test` (start test by label for PR) labels on Jun 16, 2025
ganyi1996ppo pushed a commit that referenced this pull request Jun 17, 2025
… to 2.5.1.post1.dev20250528 (#1247)

### What this PR does / why we need it?
Cherry-pick from #1235

1. Fix the rank set in the DP scenario. The new PoC version of torch-npu
supports setting `ASCEND_RT_VISIBLE_DEVICES` dynamically, so we can use the
rank set in `DPEngineCoreProc` directly instead of calculating the local
rank across DP ranks by hand in the patched `_init_data_parallel`

Closes: #1170

2. Bump torch-npu version to 2.5.1.post1.dev20250528

Closes: #1242
Closes: #1232

### How was this patch tested?
CI passed with the newly added test.

---------

Signed-off-by: Icey <1790571317@qq.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
Co-authored-by: Icey <1790571317@qq.com>
Yikun added a commit to Yikun/vllm-ascend that referenced this pull request Jun 21, 2025
@MengqingCao deleted the dpfix branch on June 28, 2025 01:43
shiyuan680 pushed a commit to raindaywhu/vllm-ascend that referenced this pull request Jul 7, 2025
…1.post1.dev20250528 (vllm-project#1235)

1. Fix the rank set in the DP scenario. The new PoC version of torch-npu
supports setting `ASCEND_RT_VISIBLE_DEVICES` dynamically, so we can use the
rank set in `DPEngineCoreProc` directly instead of calculating the local
rank across DP ranks by hand in the patched `_init_data_parallel`

Closes: vllm-project#1170

2. Bump torch-npu version to 2.5.1.post1.dev20250528

Closes: vllm-project#1242
Closes: vllm-project#1232

CI passed with the newly added test.

---------

Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Icey <1790571317@qq.com>
Co-authored-by: Icey <1790571317@qq.com>

Labels

`accuracy-test` (enable all accuracy test for PR), `ci/build`, `documentation` (Improvements or additions to documentation), `long-term-test` (enable long term test for PR), `module:tests`, `pd-test` (enable pd test for PR), `ready` (read for review), `ready-for-test` (start test by label for PR)
