[Perf] Add new npu_fused_infer_attention_score op to improve performance in splitfuse cases and resolve long-seq mask problems #1283
Triggered via pull request September 22, 2025 02:20
Status: Skipped
Total duration: 1s
Artifacts: none

nightly_benchmarks.yaml

on: pull_request
Matrix: test