CrabNet vs attention_to_gif

| | CrabNet | attention_to_gif |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 96 | 3 |
| Growth | - | - |
| Activity | 3.7 | 4.4 |
| Last commit | almost 2 years ago | almost 4 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And this one came out more recently: https://github.com/anthony-wang/CrabNet .
Of course, not all fields lend themselves to this as well as materials science does.
attention_to_gif
[P] attention_to_gif: Visualizing the Transition of Attention in BERT as a GIF
[Colab Link] [Github Code Link]
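The idea behind attention_to_gif is to render one attention heatmap per transformer layer and string them together as animation frames. A minimal sketch of that rendering step, using toy attention matrices in place of real BERT outputs (a real run would obtain them from a model loaded with `output_attentions=True`; the shapes and file name here are illustrative assumptions):

```python
import numpy as np
import matplotlib

matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, PillowWriter

# Toy stand-in for BERT attention: one (seq_len x seq_len) matrix per layer,
# with rows drawn from a Dirichlet so each row sums to 1 like softmax output.
rng = np.random.default_rng(0)
num_layers, seq_len = 12, 8
attn = rng.dirichlet(np.ones(seq_len), size=(num_layers, seq_len))

fig, ax = plt.subplots()
im = ax.imshow(attn[0], cmap="viridis", vmin=0.0, vmax=attn.max())
title = ax.set_title("Layer 1")

def update(layer):
    # Each frame shows the attention matrix of one layer.
    im.set_data(attn[layer])
    title.set_text(f"Layer {layer + 1}")
    return [im]

anim = FuncAnimation(fig, update, frames=num_layers)
anim.save("attention.gif", writer=PillowWriter(fps=2))
```

Averaging over heads (or picking a single head) before plotting is the usual design choice, since raw BERT attentions have shape (layers, heads, seq_len, seq_len).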
What are some alternatives?
query-selector - Long-Term Series Forecasting with Query Selector – Efficient Model of Sparse Attention
how-do-vits-work - (ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
ProteinStructurePrediction - Protein structure prediction is the task of predicting the 3-dimensional structure (shape) of a protein from its amino acid sequence and any available supporting information. This repository installs and inspects sidechainnet, a dataset with tools for predicting and inspecting protein structures, provides two simplified implementations of attention-based networks for predicting protein angles from amino acid sequences, and visualizes the predictions along the way.
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.
SAITS - The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series" – a fast, state-of-the-art (SOTA) deep-learning model for imputing missing (NaN) values in multivariate, incomplete time series. https://arxiv.org/abs/2202.08516
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Informer2020 - The GitHub repository for the paper "Informer" accepted by AAAI 2021.
Invariant-Attention - An implementation of Invariant Point Attention from AlphaFold 2
Perceiver - Implementation of Perceiver, a model for general perception with iterative attention