| | dgl | grape |
|---|---|---|
| Mentions | 4 | 3 |
| Stars | 13,018 | 483 |
| Growth | 0.8% | 3.1% |
| Activity | 9.9 | 6.4 |
| Latest commit | 8 days ago | 2 months ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dgl
- [P] We are building a curated list of open source tooling for data-centric AI workflows, looking for contributions.
  For graph embeddings, there are quite a few. I'd recommend this one, but there's also this one (disclaimer: I'm the author) or this one, more of a DGL library.
- Detecting Out-of-Distribution Datapoints via Embeddings or Predictions
  For trees/graphs, you'll want a neural net that can take these as inputs, for which I'm not sure a standard library exists. One recommendation is to check out dgl: https://github.com/dmlc/dgl
- Beyond Message Passing: A Physics-Inspired Paradigm for Graph Neural Networks
- [D] Convenient libs to use for new research project at the intersection of GNN and RL.
  The best package for GCNs is https://github.com/dmlc/dgl
grape
- Grape (Graph Representation LeArning, Predictions and Evaluation)
- Zoomable, animated scatterplots in the browser that scale over a billion points
Ideally, you'd embed the graph into 2D or 3D first, then visualize it as a scatterplot.
Visualizing the edges at scale doesn't yield nice results in general.
The way to do it is to reduce the graph to some 300- or 500-dimensional embeddings, then use t-SNE/UMAP/PaCMAP to reduce that to 3D, then visualize.
My preferred way is to use a first-order embedding method like GGVec in this library [1] (disclaimer: I wrote it). Node2Vec and ProNE don't yield great embeddings for visualization (the first is too filamented, the second too close to the unit ball).
Another great library for this work is GRAPE [2]. Try first-order embedding methods, or short walks on second-order methods, to avoid the embeddings being too filamented by long random-walk sampling.
[1] https://github.com/VHRanger/nodevectors
[2] https://github.com/AnacletoLAB/grape/
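The embed-then-project pipeline described above can be sketched end to end. This is a stand-in, not the posts' actual method: scikit-learn's `TruncatedSVD` on the adjacency matrix substitutes for a first-order embedding method like GGVec, and t-SNE substitutes for UMAP/PaCMAP; the random toy graph and all dimension choices are illustrative.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Toy undirected graph: a random symmetric adjacency matrix on 100 nodes.
n = 100
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.maximum(A, A.T)

# Step 1: embed the graph into a moderate-dimensional space
# (stand-in for a first-order method like GGVec; 300-500 dims in practice).
svd = TruncatedSVD(n_components=16, random_state=0)
embeddings = svd.fit_transform(A)

# Step 2: project down to 2D for the scatterplot
# (t-SNE here; UMAP or PaCMAP slot in the same way).
coords = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(embeddings)

print(coords.shape)  # (100, 2)
```

The resulting `coords` array is what you would hand to a scatterplot renderer such as deepscatter; the edges themselves are never drawn.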
- [P] We are building a curated list of open source tooling for data-centric AI workflows, looking for contributions.
  For graph embeddings, there are quite a few. I'd recommend this one, but there's also this one (disclaimer: I'm the author) or this one, more of a DGL library.
What are some alternatives?
pytorch_geometric - Graph Neural Network Library for PyTorch
deodel - A mixed attributes predictive algorithm implemented in Python.
pytorch_geometric_temporal - PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models (CIKM 2021)
refinery - The data scientist's open-source choice to scale, assess and maintain natural language data. Treat training data like a software artifact.
torchdrug - A powerful and flexible machine learning platform for drug discovery
deepscatter - Zoomable, animated scatterplots in the browser that scales over a billion points
spektral - Graph Neural Networks with Keras and Tensorflow 2.
nanocube
deep_gcns_torch - Pytorch Repo for DeepGCNs (ICCV'2019 Oral, TPAMI'2021), DeeperGCN (arXiv'2020) and GNN1000(ICML'2021): https://www.deepgcns.org
cleanlab - The standard data-centric AI package for data quality and machine learning with messy, real-world data and labels.
SuperGluePretrainedNetwork - SuperGlue: Learning Feature Matching with Graph Neural Networks (CVPR 2020, Oral)
awesome-production-machine-learning - A curated list of awesome open source libraries to deploy, monitor, version and scale your machine learning