Replace npu_incre_flash_attention with npu_fused_infer_attention_score #5330
Triggered via pull_request (synchronize) by panchao-hub on September 12, 2025 14:07
PR #2831
Status: Success
Total duration: 16s
Artifacts: –
Workflow: format_pr_body.yaml (on: pull_request_target)
Job: update vLLM version (10s)
Annotations: 1 warning
update vLLM version: The `python-version` input is not set. The version of Python currently in `PATH` will be used.
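The warning above is the one `actions/setup-python` emits when no `python-version` input is supplied. A minimal sketch of how the job could pin the version to silence it, assuming the workflow uses `actions/setup-python` (the actual step layout of format_pr_body.yaml is not shown in this run log, so the step names below are illustrative):

```yaml
# Hypothetical excerpt of format_pr_body.yaml; real step contents are assumptions.
jobs:
  update-vllm-version:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          # Pinning python-version avoids falling back to whatever Python is
          # on PATH and removes the "input is not set" warning.
          python-version: '3.11'
```

Pinning also makes the job reproducible across runner-image updates, since the Python on `PATH` can change when GitHub refreshes its images.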