Commit 7845658

clearer in readme that it is cross attention
1 parent 2b93d55 commit 7845658

File tree

1 file changed (+2, -2 lines)

README.md

Lines changed: 2 additions & 2 deletions
@@ -36,7 +36,7 @@ Cross attention
 import torch
 from memory_efficient_attention_pytorch import Attention
 
-attn = Attention(
+cross_attn = Attention(
     dim = 512,
     dim_head = 64,
     heads = 8,
@@ -49,7 +49,7 @@ x = torch.randn(1, 16384, 512).cuda()
 context = torch.randn(1, 16384, 512).cuda()
 mask = torch.ones(1, 16384).bool().cuda()
 
-out = attn(x, context = context, mask = mask) # (1, 16384, 512)
+out = cross_attn(x, context = context, mask = mask) # (1, 16384, 512)
 ```
 
 - [ ] add enwik8 example with 8192 context length
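
For reference, here is a minimal, self-contained sketch of the README's cross attention example as it reads after this commit. It is assembled only from what is visible in the two hunks above; the README lines between the hunks (43-48) are not part of this diff, so the `Attention` constructor presumably takes further options this sketch omits, and the `.cuda()` call on the module is an assumption made for consistency with the CUDA tensors.

```python
# minimal sketch of the cross attention usage after this commit;
# only the arguments visible in the diff are included - the real
# constructor may accept more options not shown in these hunks
import torch
from memory_efficient_attention_pytorch import Attention

cross_attn = Attention(
    dim = 512,       # model dimension
    dim_head = 64,   # dimension per attention head
    heads = 8        # number of attention heads
).cuda()             # assumed, to match the CUDA tensors below

x = torch.randn(1, 16384, 512).cuda()        # queries come from x
context = torch.randn(1, 16384, 512).cuda()  # keys / values come from context
mask = torch.ones(1, 16384).bool().cuda()    # boolean mask over context positions

out = cross_attn(x, context = context, mask = mask) # (1, 16384, 512)
```

The rename from `attn` to `cross_attn` changes no behavior; it simply makes explicit that passing `context` routes keys and values through a second sequence, which is what the commit message means by "clearer in readme that it is cross attention".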
