
Conversation

@22dimensions (Collaborator) commented Jul 31, 2025

What this PR does / why we need it?

cherry-pick #1501 from 0.9.1-dev to main

Currently, Ray is not compatible with ACL Graph, so we need to fall back to eager mode when using the Ray backend.
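The fallback described above is a small guard in `vllm_ascend/platform.py`. A minimal sketch of the idea follows; the function and parameter names here are illustrative, not the actual code from the patch:

```python
import logging

logger = logging.getLogger("vllm_ascend.platform")

def resolve_graph_mode(distributed_backend: str, use_aclgraph: bool) -> bool:
    """Return whether ACL Graph capture should stay enabled.

    Ray cannot currently drive ACL Graph capture, so when the Ray
    backend is selected the platform warns and falls back to eager
    mode instead of failing later during graph capture.
    """
    if distributed_backend == "ray" and use_aclgraph:
        logger.warning(
            "ACL Graph is currently incompatible with the Ray backend; "
            "falling back to eager mode.")
        return False  # disable graph capture, run eagerly
    return use_aclgraph  # all other backends keep the requested mode
```

The key design choice is to warn and degrade gracefully rather than raise, so existing Ray deployments keep working without configuration changes.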

Co-authored-by: Yizhou Liu <liu_yizhou@outlook.com>

Does this PR introduce any user-facing change?

How was this patch tested?


👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write a clear commit message and fill in the PR description to help reviewers and future developers understand the change.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

@Yikun (Collaborator) commented Jul 31, 2025

Please rename the title and commit message to be more meaningful.

@22dimensions changed the title from "[Misc] cherry-pick vllm-project#1501 from v0.9.1-dev to main" to "[Misc] Add warning for incompatible Ray backend with ACL Graph mode" Jul 31, 2025
@Yikun added the ready (ready for review) label Jul 31, 2025
@22dimensions (Collaborator, Author) replied:

> Please rename the title and commit message to be more meaningful.

done

@Yikun (Collaborator) left a comment


LGTM if CI passed

Signed-off-by: 22dimensions <waitingwind@foxmail.com>

codecov bot commented Jul 31, 2025

Codecov Report

❌ Patch coverage is 33.33333% with 2 lines in your changes missing coverage. Please review.
✅ Project coverage is 75.03%. Comparing base (72eceff) to head (754d42b).
⚠️ Report is 14 commits behind head on main.

Files with missing lines    Patch %    Lines
vllm_ascend/platform.py     33.33%     2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2132      +/-   ##
==========================================
+ Coverage   74.41%   75.03%   +0.62%     
==========================================
  Files         100      103       +3     
  Lines       11208    11358     +150     
==========================================
+ Hits         8340     8523     +183     
+ Misses       2868     2835      -33     
Flag         Coverage Δ
unittests    75.03% <33.33%> (+0.62%) ⬆️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

@wangxiyuan merged commit 9e65da9 into vllm-project:main Aug 1, 2025 (25 checks passed)
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request Sep 26, 2025
…llm-project#2132)

### What this PR does / why we need it?

cherry-pick vllm-project#1501 from 0.9.1-dev to main

Currently, Ray is not compatible with ACL Graph, so we need to fall back
to eager mode when using the Ray backend.

Co-authored-by: Yizhou Liu <liu_yizhou@outlook.com>

- vLLM version: v0.10.0
- vLLM main:
vllm-project/vllm@2836dd7

Signed-off-by: 22dimensions <waitingwind@foxmail.com>