
[Perf] Add new npu_fused_infer_attention_score op to improve performance in splitfuse cases and resolve long-seq mask problems #1482

Triggered via pull request on September 20, 2025 at 08:30
Status: Skipped
Total duration: 1s

Workflow: accuracy_test.yaml (on: pull_request)
Matrix: accuracy_tests
Job: create_pr (0s)
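From the run summary, the workflow file probably has roughly the following shape. This is a hypothetical sketch: only the file name `accuracy_test.yaml`, the `pull_request` trigger, the `accuracy_tests` matrix job, and the `create_pr` job name come from the page; the matrix entries, runners, and steps are illustrative assumptions.

```yaml
# Hypothetical sketch of .github/workflows/accuracy_test.yaml.
# Only the trigger, matrix job name, and create_pr job name are taken
# from the run summary; everything else is an assumption.
name: accuracy_test
on: pull_request

jobs:
  accuracy_tests:
    strategy:
      matrix:
        # Matrix entries are placeholders for illustration.
        model: [model-a, model-b]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run accuracy tests
        run: echo "running accuracy tests for ${{ matrix.model }}"

  create_pr:
    needs: accuracy_tests
    runs-on: ubuntu-latest
    steps:
      - name: Create PR
        run: echo "create PR step (skipped in this run)"
```

A `Status: Skipped` run with a 1s duration is consistent with the job's trigger or `if:` conditions not being met for this pull request event.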