Top 23 Python attention Projects
-
nn
🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
-
attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
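The core operation this repository (and most projects on this list) builds on is scaled dot-product attention. A minimal PyTorch sketch of that formula, not taken from the repository itself (the function name and optional mask argument are illustrative):
```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (batch, heads, len_q, len_k)
    if mask is not None:
        # Block padding or future positions before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v, weights

# Example: batch of 2, 4 heads, sequence length 8, head dimension 16
q = k = v = torch.randn(2, 4, 8, 16)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```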
-
ABSA-PyTorch
Aspect Based Sentiment Analysis, PyTorch implementations.
-
transfusion-pytorch
PyTorch implementation of Transfusion, "Predict the Next Token and Diffuse Images with One Multi-Modal Model", from Meta AI
Project mention: Transfusion: Predict the Next Token and Diffuse Images with One Multimodal Model | news.ycombinator.com | 2024-09-10
Doesn't appear to be any weights uploaded anywhere that I can find.
There are the starts of two (non-original-author) public implementations available on Github, but again -- doesn't appear to be any pretrained weights in either.
* https://github.com/lucidrains/transfusion-pytorch
* https://github.com/VachanVY/Transfusion.torch
-
graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
-
LongNet
Implementation of plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
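LongNet's key idea is dilated attention: the sequence is split into segments and, within each segment, only every r-th position attends, so cost grows roughly linearly with length instead of quadratically. The paper mixes several segment-length/dilation pairs across heads; the sketch below is not the repository's code and shows a single branch with illustrative names:
```python
import math
import torch
import torch.nn.functional as F

def dilated_attention(q, k, v, segment_len=4, dilation=2):
    """One (segment_len, dilation) branch of dilated attention."""
    b, n, d = q.shape
    assert n % segment_len == 0, "pad the sequence to a multiple of segment_len"
    def split(x):  # (batch, num_segments, segment_len, dim)
        return x.view(b, n // segment_len, segment_len, d)
    idx = torch.arange(0, segment_len, dilation)       # keep every dilation-th slot
    qs, ks, vs = (split(t)[:, :, idx] for t in (q, k, v))
    scores = qs @ ks.transpose(-2, -1) / math.sqrt(d)  # attention within each segment
    out = F.softmax(scores, dim=-1) @ vs
    # Scatter the sparse outputs back to their original positions
    full = torch.zeros(b, n // segment_len, segment_len, d,
                       dtype=out.dtype, device=out.device)
    full[:, :, idx] = out
    return full.view(b, n, d)

x = torch.randn(1, 16, 32)
print(dilated_attention(x, x, x).shape)  # torch.Size([1, 16, 32])
```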
-
punctuator2
A bidirectional recurrent neural network model with an attention mechanism for restoring missing punctuation in unsegmented text
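"Recurrent network with attention" here means additive (Bahdanau-style) attention over the bidirectional encoder states. The repository's own code may differ; this is only a generic PyTorch sketch with hypothetical names:
```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """score(s, h_i) = v^T tanh(W_s s + W_h h_i); context = softmax-weighted sum of h_i."""
    def __init__(self, state_dim, enc_dim, attn_dim):
        super().__init__()
        self.w_s = nn.Linear(state_dim, attn_dim, bias=False)
        self.w_h = nn.Linear(enc_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, state, enc_outputs):
        # state: (batch, state_dim); enc_outputs: (batch, seq, enc_dim)
        scores = self.v(torch.tanh(self.w_s(state).unsqueeze(1) + self.w_h(enc_outputs)))
        weights = torch.softmax(scores, dim=1)          # (batch, seq, 1)
        context = (weights * enc_outputs).sum(dim=1)    # (batch, enc_dim)
        return context, weights.squeeze(-1)

# Toy usage: attend over BiGRU states when deciding the punctuation after a word
encoder = nn.GRU(input_size=64, hidden_size=128, bidirectional=True, batch_first=True)
words = torch.randn(2, 20, 64)                          # batch of embedded word sequences
enc_out, _ = encoder(words)                             # (2, 20, 256)
attention = AdditiveAttention(state_dim=256, enc_dim=256, attn_dim=64)
context, weights = attention(enc_out[:, -1], enc_out)
print(context.shape, weights.shape)                     # torch.Size([2, 256]) torch.Size([2, 20])
```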
-
open-musiclm
Implementation of MusicLM, a text to music model published by Google Research, with a few modifications.
-
SAITS
The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series" (https://arxiv.org/abs/2202.08516): a fast, state-of-the-art deep learning model for imputing missing (NaN) values in multivariate time series.
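The general recipe behind attention-based imputation is: zero-fill the NaNs, feed the values together with a missingness mask through self-attention layers, and keep the model's predictions only at the missing positions. The toy sketch below illustrates that recipe only; it is not the SAITS architecture itself, and all class and parameter names are hypothetical:
```python
import torch
import torch.nn as nn

class TinyAttentionImputer(nn.Module):
    """Toy illustration of attention-based imputation, not the SAITS model."""
    def __init__(self, n_features, d_model=64):
        super().__init__()
        self.embed = nn.Linear(2 * n_features, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.project = nn.Linear(d_model, n_features)

    def forward(self, x):
        mask = (~torch.isnan(x)).float()          # 1 = observed, 0 = missing
        x_filled = torch.nan_to_num(x, nan=0.0)
        h = self.encoder(self.embed(torch.cat([x_filled, mask], dim=-1)))
        x_hat = self.project(h)
        # Keep observed values, fill the gaps with the model's estimates
        return mask * x_filled + (1 - mask) * x_hat

x = torch.randn(8, 48, 5)                          # batch, time steps, features
x[torch.rand_like(x) < 0.2] = float("nan")         # simulate 20% missing values
imputed = TinyAttentionImputer(n_features=5)(x)
print(torch.isnan(imputed).any())                  # tensor(False)
```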
-
PEN-Net-for-Inpainting
[CVPR'2019] PEN-Net: Learning Pyramid-Context Encoder Network for High-Quality Image Inpainting
-
ScreenAI
Implementation of the ScreenAI model from the paper: "A Vision-Language Model for UI and Infographics Understanding"
-
nanodl
A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more.
-
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
-
DeepViewAgg
[CVPR'22 Best Paper Finalist] Official PyTorch implementation of the method presented in "Learning Multi-View Aggregation In the Wild for Large-Scale 3D Semantic Segmentation"
-
pytorch-handwriting-synthesis-toolkit
Handwriting generation and handwriting synthesis as described in Alex Graves's paper https://arxiv.org/abs/1308.0850. PyTorch implementation.
Project mention: Show HN: Handwriter.ttf - Handwriting Synthesis with Harfbuzz WASM | news.ycombinator.com | 2024-08-21
I didn't train the model. A pretrained model is adopted from another repo [0].
[0]: https://github.com/X-rayLaser/pytorch-handwriting-synthesis-...
-
Python attention related posts
-
ElevenLabs Launches Voice Translation Tool to Break Down Language Barriers
-
LongLlama
-
LongNet: Scaling Transformers to 1,000,000,000 Tokens
-
Which features you wish that were added to Character Ai?
-
Why AI will not replace programmers.
-
An open model that beats ChatGPT. We're seeing a real shift towards open source models that will accelerate in the coming weeks.
-
GitHub - kyegomez/LongNet: Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Index
What are some of the best open-source attention projects in Python? This list will help you:
# | Project | Stars |
---|---|---|
1 | nn | 60,495 |
2 | attention-is-all-you-need-pytorch | 9,130 |
3 | transformer-pytorch | 3,680 |
4 | scenic | 3,531 |
5 | ABSA-PyTorch | 2,054 |
6 | gansformer | 1,335 |
7 | performer-pytorch | 1,126 |
8 | transfusion-pytorch | 1,097 |
9 | graphtransformer | 953 |
10 | LongNet | 702 |
11 | punctuator2 | 672 |
12 | open-musiclm | 536 |
13 | SAITS | 404 |
14 | PEN-Net-for-Inpainting | 361 |
15 | ScreenAI | 340 |
16 | how_attentive_are_gats | 327 |
17 | nanodl | 287 |
18 | gnn-lspe | 255 |
19 | DeepViewAgg | 229 |
20 | flashattention2-custom-mask | 111 |
21 | CrabNet | 100 |
22 | Perceiver | 88 |
23 | pytorch-handwriting-synthesis-toolkit | 75 |