[Feat] Sage Attention Kernels Support for sm80, sm89, sm90 #9848
Changes from all commits (20 commits): 3e9fe77, ca60579, 7bb2c79, 0a52545, 787011b, 1b9e4e0, ac29e3c, c460cce, 452a9de, ec505b6, 17a6bd8, 7035e1e, 38ea097, c42d1f5, 22894d7, 41f2900, 44350a1, 490db5a, 7e9730d, 2004a8a
Check warnings in paddlenlp/experimental/transformers/fused_transformer_layers.py on the following lines: 2973, 2975-2976, 2978-2979, 2981-2982, 2984, 2994-2995, 2997, 3330, 3332-3333, 3335-3336, 3338-3339, 3341, 3351-3352, 3354, 5049, 5051-5052, 5054-5055, 5057-5058, 5060, 5070-5071, 5073.
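The coverage warnings above all point at new branches in paddlenlp/experimental/transformers/fused_transformer_layers.py. Purely as an illustration of the kind of gating such a feature implies, and not the PR's actual code, a minimal sketch of checking whether the current GPU matches one of the architectures named in the PR title (sm80, sm89, sm90) might look like the following; the helper name and the set of supported versions are assumptions made for the example.

```python
# Illustrative sketch only: NOT the code from PR #9848. It shows how a SageAttention
# code path could be gated on compute capability, assuming kernels built for
# sm80 (A100), sm89 (Ada), and sm90 (Hopper).
import paddle

# Hypothetical set of architectures the SageAttention kernels are compiled for.
_SAGE_ATTN_SM_VERSIONS = {80, 89, 90}


def sage_attention_supported() -> bool:
    """Return True if the current GPU's compute capability is a supported arch."""
    if not paddle.device.is_compiled_with_cuda():
        return False
    major, minor = paddle.device.cuda.get_device_capability()
    return major * 10 + minor in _SAGE_ATTN_SM_VERSIONS


if __name__ == "__main__":
    # Example: a caller would fall back to the default attention path
    # when the device is not one of the supported architectures.
    print(f"SageAttention kernels usable on this device: {sage_attention_supported()}")
```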