| | CrabNet | GAT |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 81 | 3,045 |
| Growth | - | - |
| Activity | 3.7 | 0.0 |
| Last commit | about 1 year ago | about 2 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course not all fields lend themselves as well to this as does materials science.
GAT
[D] Dr. Petar Veličković (Deepmind) - Categories, Graphs, Reasoning and Graph Expander Propagation
Found relevant code at https://github.com/PetarV-/GAT + all code implementations here
Graph Attention Networks (GAT) v2 implementation with side-by-side notes
Code for https://arxiv.org/abs/1710.10903 found: https://github.com/PetarV-/GAT
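For readers who land here without knowing what GAT computes: the linked paper (arXiv:1710.10903) attends over each node's neighbourhood with learned coefficients. Below is a rough single-head sketch in NumPy, not the official repo's code; the function name, the dense adjacency matrix, and the assumption that self-loops are present are all mine.

```python
import numpy as np

def gat_attention(h, adj, W, a, alpha=0.2):
    """Minimal single-head GAT layer (sketch of arXiv:1710.10903).

    h:   (N, F) node features
    adj: (N, N) binary adjacency, self-loops included (assumed here)
    W:   (F, F') shared linear transform
    a:   (2*F',) attention vector
    Returns (attention matrix, aggregated features).
    """
    z = h @ W                                  # (N, F') transformed features
    N = z.shape[0]
    # e_ij = LeakyReLU(a^T [z_i || z_j]) for every node pair
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            e[i, j] = a @ np.concatenate([z[i], z[j]])
    e = np.where(e > 0, e, alpha * e)          # LeakyReLU
    # mask out non-edges, then softmax over each node's neighbourhood
    e = np.where(adj > 0, e, -np.inf)
    e = e - e.max(axis=1, keepdims=True)       # numerical stability
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att, att @ z                        # rows of att sum to 1
```

The real implementations (and GATv2, per the how_attentive_are_gats repo below) use sparse edge lists and multiple heads; this dense double loop is only meant to make the attention formula concrete.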
What are some alternatives?
Invariant-Attention - An implementation of Invariant Point Attention from Alphafold 2
pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
query-selector - Long-term series forecasting with Query Selector, an efficient model of sparse attention.
awesome-graph-classification - A collection of important graph embedding, classification and representation learning papers with implementations.
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.
how_attentive_are_gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
bottleneck - Code for the paper: "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.