pytorch_geometric_temporal
awesome-graph-classification
| | pytorch_geometric_temporal | awesome-graph-classification |
|---|---|---|
| Mentions | 18 | 1 |
| Stars | 2,484 | 4,698 |
| Growth | - | - |
| Activity | 1.8 | 1.0 |
| Latest commit | 10 days ago | about 1 year ago |
| Language | Python | Python |
| License | MIT License | Creative Commons Zero v1.0 Universal |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pytorch_geometric_temporal
Ask HN: ML Papers to Implement
I have done this a few times now. Alone (e.g. https://github.com/paulmorio/geo2dr) and in collaboration with others (e.g. https://github.com/benedekrozemberczki/pytorch_geometric_tem...) primarily as a way to learn about the methods I was interested in from a research perspective whilst improving my skills in software engineering. I am still learning.
Starting out, I would recommend implementing fundamental building blocks within whatever 'subculture' of ML you are interested in, whether that be DL, kernel methods, probabilistic models, etc.
Let's say you are interested in deep learning methods (as that's something I could at least speak more confidently about). In that case, build yourself an MLP layer, then an RNN layer, then a GNN layer, then a CNN layer, and an attention layer, along with some full models using those layers on case studies exhibiting different data modalities (images, graphs, signals). This should give you a feel for the assumptions driving the inductive biases in each layer and what motivates their existence (vs. an MLP). It also gives you all the building blocks you can then extend to build every other DL layer+model out there. Another reason is that these fundamental building blocks have been implemented many times, so you have a reference to look to when you get stuck.
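As an illustration of the kind of building block the comment describes, here is a minimal sketch of a single graph-convolution (GNN) layer. It is shown with NumPy rather than PyTorch to stay dependency-light; the function name and the tiny example graph are illustrative, not from any of the linked projects, and the structure maps directly onto a `torch.nn.Module` with a learnable weight matrix.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One mean-aggregation graph-convolution step: H' = ReLU(D^-1 (A + I) H W).

    adj      -- (n, n) 0/1 adjacency matrix, no self-loops
    features -- (n, d_in) node feature matrix H
    weight   -- (d_in, d_out) weight matrix W (learnable in a real layer)
    """
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                 # add self-loops so each node keeps its own features
    deg = a_hat.sum(axis=1, keepdims=True)  # node degrees for mean normalisation
    h = (a_hat / deg) @ features            # average each node's neighbourhood features
    return np.maximum(0.0, h @ weight)      # linear transform + ReLU nonlinearity

# Tiny example: a 3-node path graph 0-1-2 with 2-d node features
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3, 2)      # toy features
w = np.ones((2, 2))       # toy "learned" weights
out = gcn_layer(adj, feats, w)
print(out.shape)  # (3, 2)
```

The inductive bias is visible in the aggregation line: each node's new representation depends only on its graph neighbourhood, which is exactly what distinguishes a GNN layer from a plain MLP applied row-wise.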
On that note: here are some fun GNN papers to implement in order of increasing difficulty (try building using vanilla PyTorch/Jax instead of PyG).
- GitHub - benedekrozemberczki/pytorch_geometric_temporal: PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models (CIKM 2021)
- PyTorch Geometric Temporal 0.37
- PyTorch Geometric Temporal - Spatiotemporal Signal Processing with Neural Machine Learning Models
- [P] PyTorch Geometric Temporal
- Show HN: Deep Learning for Windmill Output Forecasting with PyTorch
[R] PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models
Repo: https://github.com/benedekrozemberczki/pytorch_geometric_temporal
- PyTorch Geometric Temporal 0.27
- Show HN: Machine Learning on Spatiotemporal Data – PyTorch Geometric Temporal
- PyTorch Geometric Temporal
awesome-graph-classification
What are some alternatives?
osmnx - OSMnx is a Python package to easily download, model, analyze, and visualize street networks and other geospatial features from OpenStreetMap.
euler - A distributed graph deep learning framework.
dgl - Python package built to ease deep learning on graphs, on top of existing DL frameworks.
PDN - The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
torchdrug - A powerful and flexible machine learning platform for drug discovery
karateclub - Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (CIKM 2020)
graphein - Protein Graph Library
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
gnn - TensorFlow GNN is a library to build Graph Neural Networks on the TensorFlow platform.
GraphGPS - Recipe for a General, Powerful, Scalable Graph Transformer
pytorch_geometric - Graph Neural Network Library for PyTorch