A simple but complete full-attention transformer with a set of promising experimental features from various papers
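"Full attention" here means every token attends to every other token in the sequence. The core of that mechanism can be sketched with plain scaled dot-product attention; this is an illustrative NumPy sketch, not the library's actual implementation (which is in PyTorch with many configurable extras):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(q, k, v):
    # q, k, v: (seq_len, dim); every position attends to all positions
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = softmax(scores, axis=-1)   # rows sum to 1
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((4, 8))
v = rng.standard_normal((4, 8))
out = full_attention(q, k, v)
print(out.shape)  # (4, 8)
```

In a full transformer block this operation is applied per head, followed by a feed-forward layer and residual connections; the experimental features the description mentions are variations layered on top of this core.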