Releases · lucidrains/memory-efficient-attention-pytorch
0.0.12
0.0.11
expose memory efficient attention functions in __init__.py
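For context, a minimal usage sketch; the top-level `Attention` export and its arguments follow the repository README, and `memory_efficient = True` is assumed to toggle the chunked attention path.

```python
import torch
from memory_efficient_attention_pytorch import Attention

# attention module with the memory-efficient (chunked) path enabled;
# bucket sizes control the chunking granularity along queries and keys
attn = Attention(
    dim = 512,
    heads = 8,
    dim_head = 64,
    causal = True,
    memory_efficient = True,
    q_bucket_size = 1024,
    k_bucket_size = 2048
)

x = torch.randn(1, 65536, 512)
out = attn(x)  # (1, 65536, 512)
```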
0.0.10
fix causal mask leading to high memory consumption at sequence length 65536
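At sequence length 65536, a fully materialized boolean causal mask alone is 65536² ≈ 4.3 GB, so the usual fix is to build only the slice of the mask needed for each query/key chunk pair. A sketch with hypothetical helper and argument names:

```python
import torch

def chunk_causal_mask(q_start, q_len, k_start, k_len, device = None):
    # build only the (q_len, k_len) slice of the causal mask for the
    # current chunk pair, instead of the full (n, n) matrix
    q_idx = torch.arange(q_start, q_start + q_len, device = device)
    k_idx = torch.arange(k_start, k_start + k_len, device = device)
    return q_idx[:, None] >= k_idx[None, :]  # True where attending is allowed
```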
0.0.9
remove the standard 1/√d head scale from cosine sim attention
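In cosine-sim attention the queries and keys are l2-normalized, so the similarities are already bounded and the usual `dim_head ** -0.5` scale is redundant. A minimal sketch, where the fixed temperature value is an assumption:

```python
import torch
import torch.nn.functional as F

def cosine_sim_attention(q, k, v, scale = 10.):
    # l2-normalize queries and keys; similarities are then bounded in
    # [-1, 1], so a single temperature replaces the per-head 1/sqrt(d) scale
    q, k = map(lambda t: F.normalize(t, dim = -1), (q, k))
    sim = torch.einsum('b h i d, b h j d -> b h i j', q, k) * scale
    attn = sim.softmax(dim = -1)
    return torch.einsum('b h i j, b h j d -> b h i d', attn, v)
```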
0.0.8
0.0.7
add ability to pass an attention bias, for dynamic positional bias and ex…
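A hedged sketch of passing an additive bias (e.g. a dynamic relative positional bias); the `memory_efficient_attention` function and its `attn_bias` parameter name are assumptions based on this release note.

```python
import torch
from memory_efficient_attention_pytorch import memory_efficient_attention

q = torch.randn(1, 8, 1024, 64)  # (batch, heads, seq, dim_head)
k = torch.randn(1, 8, 1024, 64)
v = torch.randn(1, 8, 1024, 64)

# additive bias over the attention logits, e.g. a dynamic positional
# bias of shape (heads, seq_q, seq_k), chunked internally alongside q/k
bias = torch.randn(8, 1024, 1024)

out = memory_efficient_attention(q, k, v, attn_bias = bias)
```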
0.0.6
if a chunk is to be entirely masked out causally, skip summarizing the b…
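The check being added is presumably an early exit: under causal masking, a key/value chunk that lies entirely after the current query chunk contributes nothing and need not be summarized. A sketch with hypothetical names:

```python
def chunk_fully_masked(q_start, q_len, k_start, causal):
    # a key/value chunk is entirely masked out when even its first key
    # position lies after the last query position of the current chunk
    return causal and k_start > (q_start + q_len - 1)

# inside the chunked attention loop, such chunks would simply be skipped:
#   if chunk_fully_masked(q_start, q_len, k_start, causal): continue
```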
0.0.5
fix bug with scale
0.0.4
release
0.0.2
complete numerical stability for memory efficient attention
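The stability in question is the standard running-max renormalization for chunked softmax, as in Rabe and Staats' memory-efficient attention; a minimal single-head sketch of accumulating one query chunk over key/value chunks:

```python
import torch

def stable_chunked_attention(q, k_chunks, v_chunks):
    # accumulate softmax(q k^T) v over key/value chunks without ever
    # materializing the full attention matrix, renormalizing by a
    # running row-max for numerical stability
    scale = q.shape[-1] ** -0.5
    acc = torch.zeros_like(q)
    row_sum = torch.zeros((*q.shape[:-1], 1), device = q.device)
    row_max = torch.full((*q.shape[:-1], 1), float('-inf'), device = q.device)

    for k, v in zip(k_chunks, v_chunks):
        sim = (q @ k.transpose(-1, -2)) * scale
        chunk_max = sim.amax(dim = -1, keepdim = True)
        new_max = torch.maximum(row_max, chunk_max)

        exp_sim = (sim - new_max).exp()
        correction = (row_max - new_max).exp()  # rescale prior accumulators

        acc = acc * correction + exp_sim @ v
        row_sum = row_sum * correction + exp_sim.sum(dim = -1, keepdim = True)
        row_max = new_max

    return acc / row_sum
```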