CrabNet vs GAT

| | CrabNet | GAT |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 96 | 3,269 |
| Growth | - | - |
| Activity | 3.7 | 0.0 |
| Latest commit | almost 2 years ago | almost 3 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Posts mentioning CrabNet

- Artificial intelligence can revolutionise science
  I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
  Of course not all fields lend themselves as well to this as does materials science.
Posts mentioning GAT

- [D] Dr. Petar Veličković (Deepmind) - Categories, Graphs, Reasoning and Graph Expander Propagation
  Found relevant code at https://github.com/PetarV-/GAT + all code implementations here
- Graph Attention Networks (GAT) v2 implementation with side-by-side notes
  Code for https://arxiv.org/abs/1710.10903 found: https://github.com/PetarV-/GAT
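For context on what the GAT repository implements, here is a minimal sketch of a single graph attention layer in PyTorch, following the attention mechanism described in the linked paper (arXiv:1710.10903). The class name, dense-adjacency formulation, and tensor shapes are illustrative assumptions for this example, not the actual API of the PetarV-/GAT repository.

```python
# Illustrative sketch only: a single dense graph attention layer in the spirit
# of Velickovic et al., "Graph Attention Networks" (arXiv:1710.10903).
# This is NOT the code from https://github.com/PetarV-/GAT; names and shapes
# are assumptions made for the example.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.W = nn.Linear(in_features, out_features, bias=False)  # shared linear transform
        self.a = nn.Linear(2 * out_features, 1, bias=False)        # attention scoring vector

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (N, in_features) node feature matrix
        # adj: (N, N) adjacency matrix with self-loops (1 where an edge exists)
        Wh = self.W(h)                                    # (N, out_features)
        N = Wh.size(0)
        # Score every pair [Wh_i || Wh_j] with a single-layer attention MLP.
        Wh_i = Wh.unsqueeze(1).expand(N, N, -1)
        Wh_j = Wh.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([Wh_i, Wh_j], dim=-1)).squeeze(-1), 0.2)
        # Keep only real edges, then normalise each neighbourhood with softmax.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                  # attention coefficients
        return F.elu(alpha @ Wh)                          # attention-weighted aggregation


# Tiny smoke test on a 3-node graph with self-loops.
if __name__ == "__main__":
    adj = torch.tensor([[1., 1., 0.],
                        [1., 1., 1.],
                        [0., 1., 1.]])
    x = torch.randn(3, 8)
    out = GraphAttentionLayer(8, 4)(x, adj)
    print(out.shape)  # torch.Size([3, 4])
```

In the full model, several such heads are run in parallel and concatenated (or averaged in the final layer); the GATv2 post above refers to the reordered scoring from "How Attentive are Graph Attention Networks?", which also appears in the alternatives list below.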
What are some alternatives?
ProteinStructurePrediction - Protein structure prediction is the task of predicting the 3-dimensional structure (shape) of a protein given its amino acid sequence and any available supporting information. In this section, we will install and inspect sidechainnet, a dataset with tools for predicting and inspecting protein structures, complete two simplified implementations of attention-based networks for predicting protein angles from amino acid sequences, and visualize our predictions along the way.
how_attentive_are_gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
query-selector - Long-Term Series Forecasting with Query Selector – Efficient Model of Sparse Attention
awesome-graph-classification - A collection of important graph embedding, classification and representation learning papers with implementations.
ViTGAN - A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.
bottleneck - Code for the paper: "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
Invariant-Attention - An implementation of Invariant Point Attention from Alphafold 2
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention
hummingbird - Hummingbird compiles trained ML models into tensor computation for faster inference.