pytorch_geometric_temporal
dgl
| | pytorch_geometric_temporal | dgl |
|---|---|---|
| Mentions | 18 | 4 |
| Stars | 2,436 | 12,913 |
| Growth | - | 1.4% |
| Activity | 2.6 | 9.9 |
| Latest commit | 10 days ago | about 16 hours ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pytorch_geometric_temporal
- Ask HN: ML Papers to Implement
I have done this a few times now. Alone (e.g. https://github.com/paulmorio/geo2dr) and in collaboration with others (e.g. https://github.com/benedekrozemberczki/pytorch_geometric_tem...) primarily as a way to learn about the methods I was interested in from a research perspective whilst improving my skills in software engineering. I am still learning.
Starting out, I would recommend implementing fundamental building blocks within whatever 'subculture' of ML you are interested in, whether that be DL, kernel methods, probabilistic models, etc.
Let's say you are interested in deep learning methods (as that's something I could at least speak more confidently about). In that case, build yourself an MLP layer, then an RNN layer, then a GNN layer, then a CNN layer, and an attention layer, along with some full models using those layers on case studies exhibiting different data modalities (images, graphs, signals). This should give you a feel for the assumptions driving the inductive biases in each layer and what motivates their existence (vs. an MLP). It also gives you all the building blocks you can then extend to build every other DL layer and model out there. Another reason is that these fundamental building blocks have been implemented many times, so you have a reference to look to when you get stuck.
On that note: here are some fun GNN papers to implement in order of increasing difficulty (try building using vanilla PyTorch/Jax instead of PyG).
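As a concrete example of one such building block, here is a minimal sketch of a single GCN-style graph-convolution layer written in plain NumPy rather than PyG. The function name, toy graph, and shapes are illustrative only; a real exercise along the lines suggested above would use PyTorch/JAX tensors with learned weights.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # symmetric normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate, transform, ReLU

# toy 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # 3 input features per node
W = rng.normal(size=(3, 2))   # project to 2 output features
out = gcn_layer(A, H, W)
print(out.shape)  # (4, 2)
```

Implementing the normalization and aggregation by hand like this is exactly the kind of exercise that makes the inductive bias of a GNN layer (neighborhood averaging) visible, before reaching for a library.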
dgl
- [P] We are building a curated list of open source tooling for data-centric AI workflows, looking for contributions.
For graph embeddings, there are quite a few. I'd recommend this one, but there's also this one (disclaimer: I'm the author) or this one, more of a DGL library.
- Detecting Out-of-Distribution Datapoints via Embeddings or Predictions
For trees/graphs, you'll want a neural net that can take these as inputs, for which I'm not sure a standard library exists. One recommendation is to check out dgl: https://github.com/dmlc/dgl
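On the embeddings side of this, one common and simple OOD score, once you have an embedding for each input (graph or otherwise), is the distance to the k-th nearest training embedding. A minimal NumPy sketch follows; the function name and the k=5 default are illustrative assumptions, not from the comment above.

```python
import numpy as np

def knn_ood_score(train_emb, query_emb, k=5):
    """OOD score = distance to the k-th nearest training embedding.

    train_emb: (N, d) embeddings of in-distribution training points
    query_emb: (M, d) embeddings to score; larger score = more OOD-like
    """
    # pairwise Euclidean distances, shape (M, N)
    d = np.linalg.norm(query_emb[:, None, :] - train_emb[None, :, :], axis=-1)
    # for each query, the k-th smallest distance to the training set
    return np.sort(d, axis=1)[:, k - 1]

rng = np.random.default_rng(0)
train_emb = rng.normal(size=(100, 8))   # in-distribution embeddings
in_dist = rng.normal(size=(1, 8))       # looks like training data
far_away = np.full((1, 8), 10.0)        # clearly out of distribution
scores = knn_ood_score(train_emb, np.vstack([in_dist, far_away]))
print(scores[1] > scores[0])  # expect True: the far point scores higher
```

The same scoring works regardless of how the embeddings were produced, so a GNN (e.g. one built with dgl) can supply the embeddings and this step stays model-agnostic.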
What are some alternatives?
pytorch_geometric - Graph Neural Network Library for PyTorch
osmnx - OSMnx is a Python package to easily download, model, analyze, and visualize street networks and other geospatial features from OpenStreetMap.
torchdrug - A powerful and flexible machine learning platform for drug discovery
graphein - Protein Graph Library
gnn - TensorFlow GNN is a library to build Graph Neural Networks on the TensorFlow platform.
awesome-graph-classification - A collection of important graph embedding, classification and representation learning papers with implementations.
spektral - Graph Neural Networks with Keras and Tensorflow 2.
deep_gcns_torch - PyTorch repo for DeepGCNs (ICCV'2019 Oral, TPAMI'2021), DeeperGCN (arXiv'2020) and GNN1000 (ICML'2021): https://www.deepgcns.org
SuperGluePretrainedNetwork - SuperGlue: Learning Feature Matching with Graph Neural Networks (CVPR 2020, Oral)
euler - A distributed graph deep learning framework.