how_attentive_are_gats vs grand-cypher
| | how_attentive_are_gats | grand-cypher |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 275 | 61 |
| Growth | 6.2% | - |
| Activity | 0.0 | 6.0 |
| Last commit | about 2 years ago | 2 months ago |
| Language | Python | Python |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
how_attentive_are_gats
Graph Attention Networks (GAT) v2 implementation with side-by-side notes.
Code for the paper "How Attentive are Graph Attention Networks?" (https://arxiv.org/abs/2105.14491): https://github.com/tech-srl/how_attentive_are_gats
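The key idea of the GATv2 paper is a one-line reordering of the attention scoring function: GAT applies the LeakyReLU *after* the dot product with the attention vector (making the neighbor ranking "static", i.e. the same for every query node), while GATv2 applies it *before* (making attention a genuinely joint function of both nodes). A minimal NumPy sketch of the two scoring functions, with illustrative names and random weights (not code from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 4, 8  # illustrative feature sizes

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# GAT (Velickovic et al., 2018): "static" attention.
W1 = rng.standard_normal((d_out, d_in))   # shared linear map
a1 = rng.standard_normal(2 * d_out)       # attention vector

def gat_score(h_i, h_j):
    # Nonlinearity AFTER the dot product with a1: the a1-term splits
    # into a part depending only on h_i plus one depending only on h_j,
    # so every query node ranks its neighbors in the same order.
    return leaky_relu(a1 @ np.concatenate([W1 @ h_i, W1 @ h_j]))

# GATv2: "dynamic" attention.
W2 = rng.standard_normal((d_out, 2 * d_in))
a2 = rng.standard_normal(d_out)

def gatv2_score(h_i, h_j):
    # Nonlinearity BEFORE the dot product with a2: the score no longer
    # decomposes, so the ranking can depend on the query node h_i.
    return a2 @ leaky_relu(W2 @ np.concatenate([h_i, h_j]))

def attention(h_i, neighbors, score_fn):
    # Softmax over a node's neighborhood yields the attention weights.
    e = np.array([score_fn(h_i, h_j) for h_j in neighbors])
    e = e - e.max()  # numerical stability
    w = np.exp(e)
    return w / w.sum()
```

In both cases the scores are normalized with a softmax over each node's neighborhood; the repository's notes walk through why the GAT form limits expressiveness.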
grand-cypher
What are some alternatives?
- GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
- dotmotif - A performant, powerful query framework to search for network motifs
- transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
- grand - Your favorite Python graph libraries, scalable and interoperable. Graph databases in memory, and familiar graph APIs for cloud databases.
- bottleneck - Code for the paper "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
- pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
- movies-python-bolt - Neo4j Movies Example application with Flask backend using the neo4j-python-driver