Skip to content

Commit 18935cd

Browse files
committed
Update test_attention_v1.py
1 parent 1e09cbe commit 18935cd

File tree

1 file changed

+2
-2
lines changed

1 file changed

+2
-2
lines changed

tests/ut/attention/test_attention_v1.py

Lines changed: 2 additions & 2 deletions
Original file line numberDiff line numberDiff line change
@@ -453,9 +453,9 @@ def test_forward_decode_only_swa(self, mock_fused_infer_attention_score,
453453
kv_cache,
454454
metadata,
455455
trace_flag=False)
456-
456+
print(output.shape)
457457
mock_fused_infer_attention_score.assert_called_once()
458-
assert output.shape == (10, 8 * 64)
458+
assert output.shape == (10, 8, 64)
459459

460460
@patch('vllm_ascend.attention.attention_v1.is_310p', return_value=False)
461461
@patch('torch_npu._npu_reshape_and_cache')

0 commit comments

Comments
 (0)