 | attention_to_gif | CrabNet
---|---|---
Posts | 1 | 1
Stars | 1 | 81
Growth | - | -
Activity | 4.4 | 3.7
Latest commit | about 3 years ago | about 1 year ago
Language | Python | Python
License | MIT License | MIT License
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
attention_to_gif
[P] attention_to_gif: Visualizing The Transition of Attention In BERT as a GIF
[Colab Link] [Github Code Link]
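The project above animates how BERT's attention changes from layer to layer by rendering one attention map per layer as a GIF frame. As a rough illustration of that idea (not the project's actual code), here is a minimal sketch using toy softmax attention matrices in place of weights extracted from a real BERT model:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def toy_attention_maps(num_layers=4, seq_len=6, seed=0):
    """One attention matrix per 'layer', standing in for the per-layer
    attention weights a real BERT model would produce."""
    rng = np.random.default_rng(seed)
    maps = []
    for _ in range(num_layers):
        scores = rng.normal(size=(seq_len, seq_len))
        maps.append(softmax(scores, axis=-1))  # each row sums to 1
    return maps

# Each matrix is one GIF frame; a writer such as imageio.mimsave could
# stitch the frames together to show attention shifting across layers.
frames = toy_attention_maps()
```

With a real model, the same frames would come from the `output_attentions=True` option in Hugging Face `transformers`, which returns one attention tensor per layer.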
CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course, not all fields lend themselves to this as well as materials science does.
What are some alternatives?
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Invariant-Attention - An implementation of Invariant Point Attention from Alphafold 2
how-do-vits-work - (ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
query-selector - Long-term series forecasting with Query Selector – efficient model of sparse attention
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.