
[Perf] Add new npu_fused_infer_attention_score op to improve performance in splitfuse cases and resolve long-seq mask problems #13558

Triggered via pull request September 16, 2025 12:31
@qyqc731
synchronize #2962
Status: Success
Total duration: 9s
Artifacts

labeler.yml

on: pull_request_target
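
For context, labeler.yml is the repository's pull-request labeling workflow, triggered here on the pull_request_target event. Below is a minimal sketch of what such a workflow typically looks like, assuming the standard actions/labeler action; the actual file in the repository may differ.

```yaml
# Hypothetical sketch of a labeler workflow; not the repository's actual labeler.yml.
name: labeler
on: pull_request_target

jobs:
  label:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write
    steps:
      # Applies labels to the PR based on changed file paths,
      # following rules defined in .github/labeler.yml
      - uses: actions/labeler@v5
```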