| | GAT | CrabNet |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 3,045 | 81 |
| Growth | - | - |
| Activity | 0.0 | 3.7 |
| Last Commit | about 2 years ago | about 1 year ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
GAT
[D] Dr. Petar Veličković (Deepmind) - Categories, Graphs, Reasoning and Graph Expander Propagation
Found relevant code at https://github.com/PetarV-/GAT + all code implementations here
Graph Attention Networks (GAT) v2 implementation with side-by-side notes
Code for https://arxiv.org/abs/1710.10903 found: https://github.com/PetarV-/GAT
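The paper linked above (arXiv:1710.10903) introduces the graph attention layer that the GAT repo implements. As a rough illustration of the core idea, here is a minimal single-head sketch in NumPy: project node features, score each edge with a LeakyReLU attention mechanism, softmax over each node's neighborhood, and aggregate. The function name `gat_layer` and the dense-loop formulation are illustrative choices, not the repo's actual (TensorFlow, sparse) implementation.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, A, W, a):
    """Minimal single-head GAT layer sketch.

    H: (N, F) node features
    A: (N, N) adjacency matrix, assumed to include self-loops
    W: (F, F_out) shared linear projection
    a: (2 * F_out,) attention vector
    """
    Z = H @ W  # project all node features: (N, F_out)
    H_out = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        neighbors = np.flatnonzero(A[i])
        # unnormalized score e_ij = LeakyReLU(a^T [z_i || z_j])
        e = np.array([leaky_relu(a @ np.concatenate([Z[i], Z[j]]))
                      for j in neighbors])
        alpha = softmax(e)               # normalize over i's neighborhood
        H_out[i] = alpha @ Z[neighbors]  # attention-weighted aggregation
    return H_out
```

In practice the paper stacks several such heads (concatenating or averaging their outputs) and applies a nonlinearity between layers; the "how_attentive_are_gats" repo listed below covers the GATv2 variant, which moves the nonlinearity inside the attention computation.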
CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course not all fields lend themselves as well to this as does materials science.
What are some alternatives?
pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Invariant-Attention - An implementation of Invariant Point Attention from Alphafold 2
awesome-graph-classification - A collection of important graph embedding, classification and representation learning papers with implementations.
query-selector - Long-term series forecasting with Query Selector – efficient model of sparse attention.
how_attentive_are_gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.
bottleneck - Code for the paper: "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.