Fast and memory-efficient exact attention
Why do you think that https://github.com/facebookresearch/xformers is a good alternative to flash-attention?
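For context, here is a minimal sketch (not from the original question) of how the two libraries expose roughly the same operation. It assumes `flash_attn.flash_attn_func`, `xformers.ops.memory_efficient_attention`, and `xformers.ops.LowerTriangularMask` as the public entry points, a CUDA GPU, and recent releases of both packages; exact signatures may differ between versions.

```python
import torch

# Both libraries take q, k, v as (batch, seq_len, num_heads, head_dim)
# half-precision tensors on the GPU.
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

# flash-attention: fused exact-attention kernel, causal masking via a flag.
from flash_attn import flash_attn_func
out_flash = flash_attn_func(q, k, v, causal=True)

# xformers: memory_efficient_attention dispatches to one of several backends
# (which, on supported GPUs and dtypes, can include a FlashAttention kernel);
# causal masking is expressed as an attention-bias object.
import xformers.ops as xops
out_xformers = xops.memory_efficient_attention(
    q, k, v, attn_bias=xops.LowerTriangularMask()
)
```

Both calls compute exact (not approximate) attention without materializing the full attention matrix; the practical differences are in backend selection, supported hardware/dtypes, and the surrounding feature set.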