
Replace npu_incre_flash_attention with npu_fused_infer_attention_score #5330

Triggered via pull request September 12, 2025 14:07
@panchao-hub synchronize #2831
Status: Success
Total duration: 16s
Artifacts

format_pr_body.yaml

on: pull_request_target
update vLLM version (10s)

Annotations

1 warning (from the update vLLM version job):
The `python-version` input is not set. The version of Python currently in `PATH` will be used.
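
This warning comes from actions/setup-python: when the `python-version` input is omitted, the action falls back to whatever Python is already on `PATH` on the runner. A minimal sketch of how a job like "update vLLM version" could pin a version to silence the warning; the step layout and version number below are assumptions for illustration, not the actual contents of format_pr_body.yaml:

```yaml
# Hypothetical excerpt; the real format_pr_body.yaml may differ.
# Pinning python-version avoids the "input is not set" warning.
on: pull_request_target

jobs:
  update-vllm-version:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'  # assumed version; set explicitly instead of relying on PATH
```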