Graphormer vs transformers
| | Graphormer | transformers |
|---|---|---|
| Mentions | 3 | 178 |
| Stars | 1,918 | 125,741 |
| Growth | 2.8% | 2.0% |
| Activity | 5.5 | 10.0 |
| Latest commit | about 1 month ago | 4 days ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Graphormer
-
RAG Using Structured Data: Overview and Important Questions
Ok, using ChatGPT and Bard (the irony lol) I learned a bit more about GNNs:
GNNs are probabilistic and can be trained to learn representations of graph-structured data and handle complex relationships, while classical graph algorithms are specialized for specific graph-analysis tasks and operate based on predefined rules/steps.
* Why is PyG called "Geometric" and not "Topologic"?
Properties like connectivity, neighborhoods, and even geodesic distances can all be considered topological features of a graph. These features remain unchanged under continuous deformations like stretching or bending, which is the defining characteristic of topological equivalence. In this sense, "PyTorch Topologic" might be a more accurate reflection of the library's focus on analyzing the intrinsic structure and connections within graphs.
However, the term "geometric" still has some merit in the context of PyG. While most GNN operations rely on topological principles, some do incorporate notions of Euclidean geometry, such as:
- Node embeddings: Many GNNs learn low-dimensional vectors for each node, which can be interpreted as points in a vector space, allowing geometric operations like distances and angles to be applied.
- Spectral GNNs: These models leverage the eigenvalues and eigenvectors of the graph Laplacian, which encodes information about the geometric structure and distances between nodes.
- Manifold learning: Certain types of graphs can be seen as low-dimensional representations of high-dimensional manifolds. Applying GNNs in this context involves learning geometric properties on the manifold itself.
Therefore, although topology plays a primary role in understanding and analyzing graphs, geometry can still be relevant in certain contexts and GNN operations.
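To make the geometric point concrete, here is a minimal sketch (assuming PyTorch Geometric is installed; the toy graph, feature sizes, and the Encoder class are made up for illustration) of a GCN producing node embeddings that behave like ordinary points in a vector space:

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy graph: 4 nodes, undirected edges stored as both directions.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
x = torch.randn(4, 8)  # 8 made-up input features per node
data = Data(x=x, edge_index=edge_index)

class Encoder(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)
        self.conv2 = GCNConv(16, 2)  # 2-d embeddings, i.e. points in the plane

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

z = Encoder()(data.x, data.edge_index)  # shape [4, 2]
# Embeddings are just points in a vector space, so geometric
# operations such as cosine similarity are well-defined:
print(F.cosine_similarity(z[0], z[1], dim=0).item())
```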
* Real world applications:
- HuggingFace has a few models [0] around things like computational chemistry [1] or weather forecasting.
- PyGod [2] can be used for Outlier Detection (Anomaly Detection).
- Apparently ULTRA [3] can "infer" (in the knowledge graph sense) that Michael Jackson released some disco music :-p (see the paper).
- RGCN [4] can be used for knowledge graph link prediction (recovery of missing facts, i.e. subject-predicate-object triples) and entity classification (recovery of missing entity attributes).
- GreatX [5] tackles removing inherent noise, "Distribution Shift" and "Adversarial Attacks" (e.g., noise purposely introduced to hide a node's presence) from networks. Apparently this is a thing and the field is called "Graph Reliability" or "Reliable Deep Graph Learning". The author even has a bunch of "awesome"-style lists of links! [6]
- Finally this repo has a nice explanation of how/why to run machine learning algorithms "outside of the DB":
"Pytorch Geometric (PyG) has a whole arsenal of neural network layers and techniques to approach machine learning on graphs (aka graph representation learning, graph machine learning, deep graph learning) and has been used in this repo [7] to learn link patterns, also known as link or edge predictions."
--
0: https://huggingface.co/models?pipeline_tag=graph-ml&sort=tre...
1: https://github.com/Microsoft/Graphormer
2: https://github.com/pygod-team/pygod
3: https://github.com/DeepGraphLearning/ULTRA
4: https://huggingface.co/riship-nv/RGCN
5: https://github.com/EdisonLeeeee/GreatX
6: https://edisonleeeee.github.io/projects.html
7: https://github.com/Orbifold/pyg-link-prediction
-
graphormer pretrained-models
GitHub repo here: https://github.com/microsoft/Graphormer
-
[D] Autoregressive model for graph generation?
Autoregressive models like GPT-2 do fairly well in text generation. Is it possible to do the same for graph data? A transformer-based model, Graphormer, has recently shown its effectiveness in graph representation learning. Is there any way I can train Graphormer or any other model to generate graphs from an initial graph context?
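For context, one standard recipe for this (GraphRNN-style, not something Graphormer does out of the box; Graphormer is an encoder for representation learning) is to serialize a graph as a sequence of "which earlier nodes does node i connect to" decisions and train an autoregressive model on those sequences. A heavily simplified, hypothetical sketch:

```python
import torch
import torch.nn as nn

class EdgeAutoregressor(nn.Module):
    """Predicts, for each new node, edges to the max_prev most recent
    nodes (GraphRNN-style, heavily simplified)."""
    def __init__(self, max_prev=8, hidden=64):
        super().__init__()
        self.gru = nn.GRU(max_prev, hidden, batch_first=True)
        self.head = nn.Linear(hidden, max_prev)  # edge logits per step

    def forward(self, adj_rows):
        # adj_rows: [batch, num_nodes, max_prev] binary adjacency slices
        h, _ = self.gru(adj_rows)
        return self.head(h)  # logits for the *next* node's connections

model = EdgeAutoregressor()
rows = torch.randint(0, 2, (4, 10, 8)).float()  # made-up training graphs
# Teacher forcing: predict each adjacency row from the rows before it.
logits = model(rows[:, :-1])
loss = nn.functional.binary_cross_entropy_with_logits(logits, rows[:, 1:])
```

Sampling feeds each generated adjacency row back in as the next input; conditioning on an initial graph context just means seeding the sequence with the context graph's rows.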
transformers
-
XLSTM: Extended Long Short-Term Memory
Fascinating work, very promising.
Can you summarise how the model in your paper differs from this one?
https://github.com/huggingface/transformers/issues/27011
-
AI enthusiasm #9 - A multilingual chatbot📣🈸
transformers is a package by Hugging Face that helps you interact with models on the HF Hub (GitHub)
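For readers new to the library, the usual entry point is the pipeline API; a minimal sketch (the model name is only an example, not the one used in the post):

```python
from transformers import pipeline

# Downloads the model from the HF Hub on first use.
chatbot = pipeline("text-generation", model="gpt2")  # example model
print(chatbot("Hello, how are you?", max_new_tokens=20)[0]["generated_text"])
```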
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options.
The Flax ecosystem
https://github.com/google/flax?tab=readme-ov-file
or dm-haiku
https://github.com/google-deepmind/dm-haiku
were some of the best-developed communities in the JAX AI field
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
-
Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding
The HuggingFace transformers library already has support for a similar method called prompt lookup decoding that uses the existing context to generate an ngram model: https://github.com/huggingface/transformers/issues/27722
I don't think it would be that hard to switch it out for a pretrained ngram model.
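If memory serves, recent versions of transformers expose this through a single generate() argument; a minimal sketch (the model choice is just an example):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")           # example model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("The quick brown fox jumps over the", return_tensors="pt")
# prompt_lookup_num_tokens switches generate() to prompt lookup decoding:
# candidate continuations are drawn from n-gram matches in the prompt itself.
out = model.generate(**inputs, prompt_lookup_num_tokens=10, max_new_tokens=40)
print(tok.decode(out[0], skip_special_tokens=True))
```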
-
AI enthusiasm #6 - Finetune any LLM you want💡
Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check out their work and give them a star on GitHub, if you please ❤️
-
Schedule-Free Learning – A New Way to Train
* Superconvergence + LR range finder + Fast AI's Ranger21 optimizer was the go-to optimizer for CNNs, and worked fabulously well, but on transformers the learning rate range finder said 1e-3 was best, whilst 1e-5 actually worked better. However, the 1-cycle learning rate schedule stuck. https://github.com/huggingface/transformers/issues/16013
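For reference, the 1-cycle schedule mentioned above ships with PyTorch itself; a minimal sketch (the model, step count, and learning rates are placeholders):

```python
import torch

model = torch.nn.Linear(10, 2)                      # placeholder model
opt = torch.optim.AdamW(model.parameters(), lr=1e-5)
sched = torch.optim.lr_scheduler.OneCycleLR(
    opt, max_lr=1e-3, total_steps=1000)             # warm up, then anneal

for step in range(1000):
    opt.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    opt.step()
    sched.step()                                    # advance the 1-cycle schedule
```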
-
Gemma doesn't suck anymore – 8 bug fixes
Thanks! :) I'm pushing them into transformers and pytorch-gemma, and collaborating with the Gemma team to resolve all the issues :)
The RoPE fix should already be in transformers 4.38.2: https://github.com/huggingface/transformers/pull/29285
My main PR for transformers which fixes most of the issues (some still left): https://github.com/huggingface/transformers/pull/29402
- HuggingFace Transformers: Qwen2
- HuggingFace Transformers Release v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2
- HuggingFace: Support for the Mixtral Moe
What are some alternatives?
LaTeX-OCR - pix2tex: Using a ViT to convert images of equations into LaTeX code.
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
OpenPrompt - An Open-Source Framework for Prompt-Learning.
sentence-transformers - Multilingual Sentence & Image Embeddings with BERT
temporal-graph-gen - Pre-trained models for our work on Temporal Graph Generation
llama - Inference code for Llama models
ULTRA - A foundation model for knowledge graph reasoning
transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
pyg-link-prediction - Pytorch Geometric link prediction of a homogeneous social graph.
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
huggingface_hub - The official Python client for the Huggingface Hub.
OpenNMT-py - Open Source Neural Machine Translation and (Large) Language Models in PyTorch