[Misc] code clean duplicate set_current_vllm_config in _set_vllm_config #22566
Conversation
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default; only a small subset of checks runs automatically. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging. To run CI, PR reviewers can add the ready label to the PR. 🚀
Code Review
This pull request correctly removes a redundant and ineffective call to set_current_vllm_config. The change is a good code cleanup.
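To illustrate why the duplicated call is a no-op, here is a minimal sketch of the "current config" context-manager pattern that set_current_vllm_config follows. The names and internals below are illustrative assumptions, not vLLM's actual implementation.

```python
import contextlib

# Process-wide slot holding the currently active config (illustrative).
_current_config = None

@contextlib.contextmanager
def set_current_config(config):
    """Install `config` as the current config for the duration of the
    `with` block, restoring the previous value on exit."""
    global _current_config
    previous = _current_config
    _current_config = config
    try:
        yield config
    finally:
        _current_config = previous

def get_current_config():
    return _current_config

# Nesting the manager with the same config is redundant: the inner call
# installs a value that is already current, so it has no observable
# effect — which is the kind of duplicate call this PR removes.
with set_current_config({"model": "demo"}):
    with set_current_config({"model": "demo"}):  # redundant inner call
        assert get_current_config() == {"model": "demo"}
assert get_current_config() is None
```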
While reviewing this function, I noticed a potential resource leak. The temporary file created on line 37 with tempfile.mkstemp() is not being cleaned up after use. This could lead to an accumulation of temporary files, potentially causing issues in CI environments.

I recommend addressing this in a follow-up pull request by using a context manager, for example tempfile.NamedTemporaryFile, to ensure the temporary file is automatically deleted. This is just a suggestion for future improvement and doesn't block this PR.
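A minimal stdlib-only illustration of the leak the reviewer describes and the suggested fix. mkstemp() returns an open OS-level file descriptor and a path, and neither is cleaned up automatically; NamedTemporaryFile used as a context manager closes the file and, with the default delete=True, removes it on exit.

```python
import os
import tempfile

# The leak pattern: mkstemp() hands back a raw fd and a path.
# Both must be released explicitly, which is easy to forget.
fd, path = tempfile.mkstemp(suffix=".bin")
os.close(fd)      # must close the descriptor ourselves
os.remove(path)   # skipping this leaves the file behind

# The suggested fix: the context manager closes and deletes the
# file automatically when the `with` block exits.
with tempfile.NamedTemporaryFile(suffix=".bin") as tmp:
    tmp.write(b"payload")
    tmp.flush()
    saved_name = tmp.name
    assert os.path.exists(saved_name)
assert not os.path.exists(saved_name)  # removed automatically on exit
```

Note that on Windows the file cannot be opened a second time while the NamedTemporaryFile handle is still open, which is worth keeping in mind if the temporary file's path is passed to another process.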
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Force-pushed from 65f09de to 2078a06.
The CI failures seem to be flakes, plus an entrypoint error that also fails in other PRs. Can you help force-merge this? @DarkLight1337
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com>
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com> Signed-off-by: jingyu <jingyu@omniml.ai>
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com> Signed-off-by: Avery Yingyi Huang <yingyihuang2000@outlook.com>
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com> Signed-off-by: Paul Pak <paulpak58@gmail.com>
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com> Signed-off-by: Boyuan Feng <boyuan@meta.com>
…ig (vllm-project#22566) Signed-off-by: Andy Xie <andy.xning@gmail.com> Signed-off-by: Diego-Castan <diego.castan@ibm.com>
Essential Elements of an Effective PR Description Checklist
(If applicable) Documentation update in supported_models.md and examples for a new model.

Purpose

set_current_vllm_config is duplicated.

Test Plan
NA
Test Result
NA
(Optional) Documentation Update
NA