Top 14 Python attention-mechanism Projects
-
PaLM-rlhf-pytorch
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
-
musiclm-pytorch
Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch
-
audiolm-pytorch
Implementation of AudioLM, a SOTA Language Modeling Approach to Audio Generation out of Google Research, in Pytorch
-
make-a-video-pytorch
Implementation of Make-A-Video, new SOTA text-to-video generator from Meta AI, in Pytorch
-
muse-maskgit-pytorch
Implementation of Muse: Text-to-Image Generation via Masked Generative Transformers, in Pytorch
-
phenaki-pytorch
Implementation of Phenaki Video, which uses Mask GIT to produce text-guided videos of up to 2 minutes in length, in Pytorch
-
LongNet
Implementation of the plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
-
MEGABYTE-pytorch
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
-
recurrent-memory-transformer-pytorch
Implementation of Recurrent Memory Transformer (NeurIPS 2022) in Pytorch
-
iTransformer
Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
-
q-transformer
Implementation of Q-Transformer, Scalable Offline Reinforcement Learning via Autoregressive Q-Functions, out of Google DeepMind
Project mention: How should I get an in-depth mathematical understanding of generative AI? | /r/datascience | 2023-05-18
ChatGPT isn't open-sourced, so we don't know what the actual implementation is. I think you can read Open Assistant's source code for application design. If that is too much, try Open Chat Toolkit's source code for developer tools. If you need a very bare implementation, you should go for lucidrains/PaLM-rlhf-pytorch.
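For a sense of what a "bare implementation" covers: the reward-modeling stage of RLHF boils down to a pairwise preference loss over human-ranked responses. A minimal PyTorch sketch of just that stage (toy model and names, not PaLM-rlhf-pytorch's actual API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy scalar reward model: in practice this is a transformer with a scalar head.
class TinyRewardModel(nn.Module):
    def __init__(self, vocab_size=256, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, 1)

    def forward(self, tokens):              # tokens: (batch, seq_len)
        h = self.embed(tokens).mean(dim=1)  # crude pooling over the sequence
        return self.head(h).squeeze(-1)     # one scalar reward per sequence

reward_model = TinyRewardModel()

# Human feedback arrives as preference pairs: chosen vs. rejected responses.
chosen = torch.randint(0, 256, (8, 32))
rejected = torch.randint(0, 256, (8, 32))

# Bradley-Terry pairwise loss: push r(chosen) above r(rejected).
loss = -F.logsigmoid(reward_model(chosen) - reward_model(rejected)).mean()
loss.backward()
print(loss.item())
```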
Has anyone tried to train this model: lucidrains/musiclm-pytorch: Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch (github.com)? Could you provide any useful resources that can help me? Or share your process?
It’s mostly there in https://github.com/lucidrains/audiolm-pytorch#hierarchical-t....
Project mention: Google's StyleDrop can transfer style from a single image | /r/StableDiffusion | 2023-06-03
If Google doesn't, someone like lucidrains probably would implement it, just like he did for Imagen and Muse.
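Muse (and Phenaki's video variant) generates with MaskGIT-style iterative decoding: start from all-[MASK] tokens, predict every position, keep only the most confident predictions, and repeat on the rest. A toy version of that loop, with the transformer stubbed out by random logits:

```python
import torch

vocab_size, seq_len, steps = 512, 16, 4
MASK = vocab_size  # reserve one extra id as the [MASK] token

def toy_predict(tokens):
    """Stand-in for the transformer: random logits over the codebook."""
    return torch.randn(tokens.shape[0], tokens.shape[1], vocab_size)

tokens = torch.full((1, seq_len), MASK)
for step in range(steps):
    logits = toy_predict(tokens)
    probs, preds = logits.softmax(-1).max(-1)       # confidence + argmax
    still_masked = tokens == MASK
    probs = probs.masked_fill(~still_masked, -1.0)  # only fill masked slots
    num_keep = seq_len // steps                     # unmask a few per round
    keep = probs.topk(num_keep, dim=-1).indices
    tokens.scatter_(1, keep, preds.gather(1, keep))
print(tokens)  # all positions filled after `steps` rounds
```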
If you want to talk immature-looking, LongNet wouldn't even compile. That's a big oof, considering it's Python, where usually even non-working code is enough to generate bytecode. (It also has hard-coded dtype and device.)
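For reference, the idea the paper proposes, independent of that repo's code quality, is dilated attention: attend over a strided subsample of the sequence so cost shrinks with the dilation rate. A self-contained sketch of just the dilation part (the paper also mixes multiple segment lengths and dilation rates, omitted here):

```python
import torch
import torch.nn.functional as F

def dilated_self_attention(q, k, v, dilation=2):
    """Toy dilated attention: attend only over every `dilation`-th position,
    so attention cost scales with seq_len / dilation instead of seq_len."""
    # q, k, v: (batch, seq_len, dim)
    idx = torch.arange(0, q.shape[1], dilation)  # strided subsample
    q_s, k_s, v_s = q[:, idx], k[:, idx], v[:, idx]
    out_s = F.scaled_dot_product_attention(q_s, k_s, v_s)
    # Scatter the sparse outputs back; untouched positions stay zero here.
    out = torch.zeros_like(q)
    out[:, idx] = out_s
    return out

x = torch.randn(2, 16, 32)
print(dilated_self_attention(x, x, x).shape)  # torch.Size([2, 16, 32])
```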
Project mention: [R] MEGABYTE: Predicting Million-byte Sequences with Multiscale Transformers | /r/MachineLearning | 2023-05-15
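MEGABYTE's core move is to split a long byte sequence into fixed-size patches, run a large global transformer over per-patch embeddings, and a small local model within each patch. A shapes-only sketch of the patching step (illustrative names; the global and local transformers themselves are omitted):

```python
import torch
import torch.nn as nn

batch, seq_len, patch_size, dim = 2, 1024, 8, 64
bytes_in = torch.randint(0, 256, (batch, seq_len))

embed = nn.Embedding(256, dim)

# Group the byte sequence into patches: (batch, num_patches, patch_size, dim)
x = embed(bytes_in).view(batch, seq_len // patch_size, patch_size, dim)

# The global model sees one vector per patch (here: concatenated embeddings).
global_tokens = x.flatten(2)  # (batch, num_patches, patch_size * dim)
print(global_tokens.shape)    # torch.Size([2, 128, 512])

# A small local model then predicts the bytes inside each patch,
# conditioned on the global model's output for that patch (omitted).
```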
Project mention: Implementation of iTransformer – SOTA Time Series Forecasting Attention Networks | news.ycombinator.com | 2023-10-13
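The "inversion" in iTransformer is essentially a transpose: each variate of the multivariate series becomes one token, and attention runs across variates instead of across time steps. A minimal sketch of that inverted layout:

```python
import torch
import torch.nn as nn

batch, time_steps, num_variates, dim = 2, 96, 7, 64
series = torch.randn(batch, time_steps, num_variates)

# Invert: embed each variate's whole history as a single token.
to_token = nn.Linear(time_steps, dim)
tokens = to_token(series.transpose(1, 2))  # (batch, num_variates, dim)

# Standard self-attention, but across variates rather than across time.
attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
out, _ = attn(tokens, tokens, tokens)
print(out.shape)  # torch.Size([2, 7, 64])
```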
Being implemented as we speak, by the always impressive LucidRains [1]
[1]: https://github.com/lucidrains/q-transformer
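The paper's key idea is to discretize each action dimension into bins and decode the action vector autoregressively, turning Q-learning over continuous actions into per-dimension classification. A toy greedy-decoding sketch (hypothetical stand-ins, not the repo's API):

```python
import torch
import torch.nn as nn

num_action_dims, num_bins, dim = 4, 16, 64

# Stand-ins for the real transformer: a per-dimension Q head and an
# embedding that feeds the chosen bin back in for the next dimension.
q_head = nn.Linear(dim, num_bins)
action_embed = nn.Embedding(num_bins, dim)

state = torch.randn(1, dim)  # encoded observation
h, actions = state, []
for _ in range(num_action_dims):
    q_values = q_head(h)         # Q-value per discretized bin
    a = q_values.argmax(dim=-1)  # greedy bin for this action dimension
    actions.append(a.item())
    h = h + action_embed(a)      # condition the next dimension on the choice
print(actions)                   # e.g. [3, 9, 0, 14]
```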
Python attention-mechanism related posts
-
LongLlama
-
Which features you wish that were added to Character Ai?
-
Why AI will not replace programmers.
-
An open model that beats ChatGPT. We're seeing a real shift towards open source models that will accelerate in the coming weeks.
-
GitHub - kyegomez/LongNet: Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Index
What are some of the best open-source attention-mechanism projects in Python? This list will help you:
# | Project | Stars |
---|---|---|
1 | PaLM-rlhf-pytorch | 7,593 |
2 | musiclm-pytorch | 3,013 |
3 | audiolm-pytorch | 2,249 |
4 | toolformer-pytorch | 1,889 |
5 | make-a-video-pytorch | 1,840 |
6 | muse-maskgit-pytorch | 816 |
7 | phenaki-pytorch | 715 |
8 | LongNet | 652 |
9 | MEGABYTE-pytorch | 590 |
10 | recurrent-memory-transformer-pytorch | 380 |
11 | iTransformer | 334 |
12 | q-transformer | 284 |
13 | block-recurrent-transformer-pytorch | 204 |
14 | flash-attention-jax | 175 |