pytorch_geometric_temporal
Graph-Convolution-on-Structured-Documents
| | pytorch_geometric_temporal | Graph-Convolution-on-Structured-Documents |
|---|---|---|
| Mentions | 18 | 1 |
| Stars | 2,436 | 141 |
| Growth | - | - |
| Activity | 2.6 | 0.0 |
| Latest commit | 10 days ago | over 1 year ago |
| Language | Python | Python |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pytorch_geometric_temporal
Ask HN: ML Papers to Implement
I have done this a few times now. Alone (e.g. https://github.com/paulmorio/geo2dr) and in collaboration with others (e.g. https://github.com/benedekrozemberczki/pytorch_geometric_tem...) primarily as a way to learn about the methods I was interested in from a research perspective whilst improving my skills in software engineering. I am still learning.
Starting out I would recommend implementing fundamental building blocks within whatever 'subculture' of ML you are interested in whether that be DL, kernel methods, probabilistic models, etc.
Let's say you are interested in deep learning methods (as that's something I can at least speak more confidently about). In that case, build yourself an MLP layer, then an RNN layer, then a GNN layer, then a CNN layer, and an attention layer, along with some full models using those layers on case studies exhibiting different data modalities (images, graphs, signals). This should give you a feel for the assumptions driving the inductive biases in each layer and what motivates their existence (vs. an MLP). It also gives you all the building blocks you can then extend to build every other DL layer and model out there. Another reason is that these fundamental building blocks have been implemented many times, so you have a reference to look to when you get stuck.
On that note: here are some fun GNN papers to implement in order of increasing difficulty (try building using vanilla PyTorch/Jax instead of PyG).
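As an illustration of the kind of fundamental building block the quote above recommends, the normalised message-passing step at the heart of a GCN layer can be sketched in plain NumPy (rather than PyTorch/Jax, to keep it dependency-light). The function name and toy graph here are illustrative, not from any particular paper's reference implementation:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees of A+I
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^-1/2
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)   # aggregate, transform, ReLU

# Toy 3-node path graph with 2 input features per node.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3, 2)        # simple one-hot-style node features
W = np.ones((2, 2))     # weight matrix: 2 in-features -> 2 out-features
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2): one 2-dim embedding per node
```

Once a sketch like this behaves as expected on a toy graph, porting it to PyTorch tensors (and adding learnable parameters) is a small step, which is exactly why building the layer by hand first pays off.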
Graph-Convolution-on-Structured-Documents
We haven't tracked posts mentioning Graph-Convolution-on-Structured-Documents yet.
Tracking mentions began in Dec 2020.
What are some alternatives?
osmnx - OSMnx is a Python package to easily download, model, analyze, and visualize street networks and other geospatial features from OpenStreetMap.
torchdrug - A powerful and flexible machine learning platform for drug discovery
dgl - Python package built to ease deep learning on graph, on top of existing DL frameworks.
gnn - TensorFlow GNN is a library to build Graph Neural Networks on the TensorFlow platform.
graphein - Protein Graph Library
pytorch_geometric - Graph Neural Network Library for PyTorch
awesome-graph-classification - A collection of important graph embedding, classification and representation learning papers with implementations.
euler - A distributed graph deep learning framework.
karateclub - Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (CIKM 2020)
RecBole - A unified, comprehensive and efficient recommendation library
PDN - The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
awesome-drug-pair-scoring - Readings for "A Unified View of Relational Deep Learning for Drug Pair Scoring." (IJCAI 2022)