| | CrabNet | Invariant-Attention |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 81 | 6 |
| Growth | - | - |
| Activity | 3.7 | 10.0 |
| Last Commit | about 1 year ago | over 1 year ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
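The exact formula behind the activity score is not published on this page; a minimal sketch of one plausible recency weighting, assuming exponential decay with a hypothetical 90-day half-life (both the function name and the half-life are illustrative choices, not the site's actual method):

```python
from datetime import datetime, timedelta, timezone

def activity_score(commit_dates, half_life_days=90.0, now=None):
    """Recency-weighted commit count: each commit contributes
    0.5 ** (age_in_days / half_life_days), so recent commits
    carry more weight than older ones."""
    now = now or datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score

# Two commits today outweigh three commits from a year ago.
now = datetime.now(timezone.utc)
recent = [now, now]
old = [now - timedelta(days=365)] * 3
print(activity_score(recent, now=now) > activity_score(old, now=now))
```

Under such a scheme, a project with a burst of recent commits scores higher than one with the same total commit count spread over older history, which matches the behavior described above.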
CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course not all fields lend themselves as well to this as does materials science.
Invariant-Attention
What are some alternatives?
query-selector - Long-Term Series Forecasting with Query Selector - an efficient model of sparse attention
af2complex - Predicting direct protein-protein interactions with AlphaFold deep learning neural network models.
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.
best-of-ml-python - 🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.