Conversation
@Yikun Yikun commented Sep 20, 2025

What this PR does / why we need it?

Bump main to vllm-project/vllm@c60e613

Does this PR introduce any user-facing change?

No

How was this patch tested?

CI passed

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message by filling in the PR description to help reviewers and future developers understand.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

@Yikun Yikun changed the title Bump main to https://github.com/vllm-project/vllm/commit/c60e6137f0bf… Bump main to 0920 Sep 20, 2025
@Yikun Yikun changed the title Bump main to 0920 [CI] Upgrade vLLM to 20250920 (c60e613) and address config break Sep 20, 2025
Copy link

codecov bot commented Sep 20, 2025

Codecov Report

❌ Patch coverage is 42.85714% with 8 lines in your changes missing coverage. Please review.
✅ Project coverage is 71.93%. Comparing base (1bbb20e) to head (230d695).
⚠️ Report is 80 commits behind head on main.

Files with missing lines          Patch %   Lines
vllm_ascend/sample/sampler.py     42.85%    8 Missing ⚠️

❌ Your patch status has failed because the patch coverage (42.85%) is below the target coverage (80.00%). You can increase the patch coverage or adjust the target coverage.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3067      +/-   ##
==========================================
- Coverage   74.76%   71.93%   -2.84%     
==========================================
  Files         150      168      +18     
  Lines       20891    23554    +2663     
==========================================
+ Hits        15620    16944    +1324     
- Misses       5271     6610    +1339     
Flag        Coverage Δ
unittests   71.93% <42.85%> (-2.84%) ⬇️

Flags with carried forward coverage won't be shown.


Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
@Yikun Yikun added the ready (read for review) and ready-for-test (start test by label for PR) labels Sep 20, 2025
@Yikun Yikun (Collaborator, Author) commented Sep 20, 2025

```
UnboundLocalError: cannot access local variable 'raw_logprobs' where it is not associated with a value
```

https://github.com/vllm-project/vllm-ascend/actions/runs/17881166432/job/50849001999?pr=3067
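The traceback above has the classic shape of an `UnboundLocalError`: a local variable is assigned only on some branches and then read unconditionally. A minimal sketch of the bug and the usual fix (the function and mode names here are illustrative, not the actual vLLM sampler code):

```python
def buggy(mode: str):
    if mode == "raw_logprobs":
        raw_logprobs = [0.0]
    # When mode takes any other value, raw_logprobs was never assigned,
    # so the next line raises UnboundLocalError.
    return raw_logprobs

def fixed(mode: str):
    raw_logprobs = None  # initialize before the branches
    if mode == "raw_logprobs":
        raw_logprobs = [0.0]
    return raw_logprobs

try:
    buggy("processed_logprobs")
except UnboundLocalError as e:
    print("buggy:", e)
print("fixed:", fixed("processed_logprobs"))  # fixed: None
```

Because Python decides a name is local at compile time (any assignment anywhere in the function makes it local), the read fails at runtime rather than falling back to an outer scope.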

Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
@Yikun Yikun marked this pull request as ready for review September 21, 2025 00:16
@Yikun Yikun requested a review from wangxiyuan September 21, 2025 00:26
```python
            logits_to_return = logits
        elif self.logprobs_mode == LogprobsMode.PROCESSED_LOGPROBS:
            logits_to_return = logits.log_softmax(dim=-1, dtype=torch.float32)
        if vllm_version_is("0.10.2"):
```
@Yikun Yikun commented Sep 21, 2025
```pycon
>>> from enum import Enum
>>> class Color(Enum):
...     RED = "red"
...     GREEN = "green"
...     BLUE = "blue"
...
>>> Color.RED
<Color.RED: 'red'>
>>> Color.RED == "red"
False
```

This is the note on why we need this: a plain `Enum` member never compares equal to its raw string value, so the mode must be compared against the enum member itself rather than a string.
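One way to make such comparisons behave (a sketch with illustrative names, not the actual vLLM `LogprobsMode` definition) is to mix in `str`, or to compare against `.value`:

```python
from enum import Enum

class PlainMode(Enum):
    # Plain Enum: members never compare equal to their raw string values.
    PROCESSED_LOGPROBS = "processed_logprobs"

class StrMode(str, Enum):
    # str mixin: members ARE strings, so they compare equal to raw values.
    PROCESSED_LOGPROBS = "processed_logprobs"

print(PlainMode.PROCESSED_LOGPROBS == "processed_logprobs")        # False
print(PlainMode.PROCESSED_LOGPROBS.value == "processed_logprobs")  # True
print(StrMode.PROCESSED_LOGPROBS == "processed_logprobs")          # True
```

This is why code that previously compared a config string now has to compare the enum member directly (or its `.value`) after the upstream change.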

@wangxiyuan wangxiyuan merged commit b8b68b3 into vllm-project:main Sep 21, 2025
21 of 22 checks passed
Labels
module:tests, ready (read for review), ready-for-test (start test by label for PR), vllm-break

Projects
None yet

2 participants