Suggest an alternative to memory-efficient-attention-pytorch

Implementation of memory-efficient multi-head attention, as proposed in the paper "Self-attention Does Not Need O(n²) Memory".
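For context, the paper's core idea is to compute exact softmax attention over key/value chunks with running accumulators, so peak memory scales with the chunk size rather than the full seq_len × seq_len score matrix. Below is a minimal sketch of that idea, not the repo's actual API: the function name, single-head (seq_len, dim) shapes, and chunk size are assumptions made for illustration.

```python
# Sketch of the chunked-attention idea from "Self-attention Does Not Need
# O(n^2) Memory" (Rabe & Staats, 2021). Illustrative only; not the API of
# memory-efficient-attention-pytorch.
import torch

def chunked_attention(q, k, v, chunk_size=256):
    # q, k, v: (seq_len, dim) tensors for a single attention head
    scale = q.shape[-1] ** -0.5
    q = q * scale
    out = torch.zeros_like(q)                             # running weighted sum of values
    row_sum = q.new_zeros(q.shape[0], 1)                  # running softmax denominator
    row_max = q.new_full((q.shape[0], 1), float('-inf'))  # running row max, for stability
    for start in range(0, k.shape[0], chunk_size):
        k_chunk = k[start:start + chunk_size]
        v_chunk = v[start:start + chunk_size]
        scores = q @ k_chunk.t()                          # (seq_len, chunk) only
        new_max = torch.maximum(row_max, scores.amax(dim=-1, keepdim=True))
        correction = torch.exp(row_max - new_max)         # rescale earlier accumulators
        weights = torch.exp(scores - new_max)
        out = out * correction + weights @ v_chunk
        row_sum = row_sum * correction + weights.sum(dim=-1, keepdim=True)
        row_max = new_max
    return out / row_sum

# Sanity check against the naive quadratic-memory computation
q, k, v = (torch.randn(1024, 64) for _ in range(3))
naive = torch.softmax((q * 64 ** -0.5) @ k.t(), dim=-1) @ v
assert torch.allclose(chunked_attention(q, k, v), naive, atol=1e-4)
```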

Why do you think that https://github.com/lucidrains/vit-pytorch is a good alternative to memory-efficient-attention-pytorch?
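For reference, a minimal usage sketch of vit-pytorch, adapted from its README at the time of writing (the constructor arguments shown are the README's example values, not requirements):

```python
import torch
from vit_pytorch import ViT

# Example hyperparameters from the vit-pytorch README; adjust to your task.
model = ViT(
    image_size = 256,   # input images are 256 x 256
    patch_size = 32,    # split into (256/32)^2 = 64 patches of 32 x 32
    num_classes = 1000,
    dim = 1024,         # transformer embedding dimension
    depth = 6,          # number of transformer blocks
    heads = 16,         # attention heads
    mlp_dim = 2048,
    dropout = 0.1,
    emb_dropout = 0.1,
)

img = torch.randn(1, 3, 256, 256)
logits = model(img)  # shape: (1, 1000)
```

Note the difference in scope: memory-efficient-attention-pytorch provides a standalone attention module, while vit-pytorch packages complete vision-transformer models.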
