| | awesome-fast-attention | CrabNet |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 827 | 81 |
| Growth | - | - |
| Activity | 1.0 | 3.7 |
| Last commit | over 2 years ago | about 1 year ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | MIT License |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects being tracked.
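The exact formula behind the activity number isn't published here. As a rough illustration of a recency-weighted score along those lines, here is a minimal, hypothetical sketch; the half-life parameter and the function itself are assumptions, not the site's actual computation:

```python
def activity_score(commit_ages_days, half_life_days=90):
    """Hypothetical recency-weighted activity score (not the site's
    actual formula): each commit contributes a weight that halves
    every `half_life_days`, so recent commits dominate."""
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# A project with recent commits scores higher than one with the
# same number of old commits.
print(activity_score([1, 5, 10, 30]))        # mostly recent -> higher score
print(activity_score([400, 500, 600, 700]))  # stale -> near zero
```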
**Artificial intelligence can revolutionise science**

> I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec. And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet.
>
> Of course, not all fields lend themselves to this as well as materials science does.
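For a flavor of the literature-based discovery the post refers to, here is a minimal sketch using gensim's `Word2Vec` on a toy corpus. This is not mat2vec's actual pipeline (which trains on millions of materials-science abstracts with domain-specific preprocessing); the corpus and hyperparameters below are illustrative assumptions:

```python
from gensim.models import Word2Vec

# Toy stand-in for abstracts from the materials science literature.
corpus = [
    "bi2te3 is a promising thermoelectric material".split(),
    "thermoelectric devices convert heat into electricity".split(),
    "pbte shows a high thermoelectric figure of merit".split(),
    "band gap engineering improves thermoelectric performance".split(),
]

model = Word2Vec(corpus, vector_size=32, window=3, min_count=1,
                 epochs=200, seed=1)

# Query for tokens whose embeddings sit near a property word -- the
# kind of similarity query the mat2vec paper used to surface
# candidate thermoelectric materials from unlabeled text.
print(model.wv.most_similar("thermoelectric", topn=3))
```

On a real corpus, the nearest neighbors of a property word like "thermoelectric" tend to include material formulas, which is what makes the approach useful for discovery.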
What are some alternatives?
how-do-vits-work - (ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
Invariant-Attention - An implementation of Invariant Point Attention from Alphafold 2
a-PyTorch-Tutorial-to-Transformers - Attention Is All You Need | a PyTorch Tutorial to Transformers
query-selector - Long-Term Series Forecasting with Query Selector – Efficient Model of Sparse Attention
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.
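Most of the repositories above are variations on the same baseline: scaled dot-product attention, whose n × n score matrix is the quadratic bottleneck that "fast attention" methods try to avoid. As a point of reference, here is a minimal NumPy sketch of that standard formulation (not code from any of the listed projects):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Standard attention: softmax(Q K^T / sqrt(d)) V.
    # The (n, n) score matrix is the O(n^2) memory/compute cost
    # that efficient-attention variants approximate or sparsify.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                              # (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)             # row-wise softmax
    return weights @ V                                         # (n, d)

n, d = 8, 16
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, n, d))
print(scaled_dot_product_attention(Q, K, V).shape)  # (8, 16)
```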