Popular Comparisons
- block-recurrent-transformer-pytorch VS iris
- block-recurrent-transformer-pytorch VS flash-attention-jax
- block-recurrent-transformer-pytorch VS block-recurrent-transformer-py
- block-recurrent-transformer-pytorch VS RWKV-LM
- block-recurrent-transformer-pytorch VS heinsen_routing
- block-recurrent-transformer-pytorch VS PaLM-rlhf-pytorch
- block-recurrent-transformer-pytorch VS musiclm-pytorch
block-recurrent-transformer-pytorch Alternatives
Similar projects and alternatives to block-recurrent-transformer-pytorch
- RWKV-LM
  RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (i.e., in parallel). It combines the best of RNNs and transformers: strong performance, fast inference, low VRAM use, fast training, "infinite" context length, and free sentence embeddings.
- heinsen_routing
  Reference implementation of "An Algorithm for Routing Vectors in Sequences" (Heinsen, 2022) and "An Algorithm for Routing Capsules in All Domains" (Heinsen, 2019), for composing deep neural networks.
- PaLM-rlhf-pytorch
  Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture; essentially ChatGPT, but built on PaLM.
- musiclm-pytorch
  Implementation of MusicLM, Google's SOTA model for music generation using attention networks, in PyTorch.
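The RWKV entry above claims RNN-style inference with GPT-style parallel training. As a rough illustration of why a linear recurrence permits both modes, here is a toy sketch in plain Python. The scalar state and the `decay` constant are illustrative assumptions; RWKV's actual time-mixing and channel-mixing formulation is considerably more involved.

```python
# Toy linear recurrence h_t = decay * h_{t-1} + x_t, illustrating why
# such models can run either sequentially (RNN mode, O(1) state per step)
# or with every position computed independently (parallelizable training).
# NOTE: an illustrative sketch, not RWKV's actual equations.

def rnn_mode(xs, decay=0.9):
    """Sequential inference: carry a single scalar state across steps."""
    h, outs = 0.0, []
    for x in xs:
        h = decay * h + x
        outs.append(h)
    return outs

def parallel_mode(xs, decay=0.9):
    """Closed form for every position at once:
    h_t = sum_{k<=t} decay^(t-k) * x_k.
    On a GPU this would be computed with a parallel (prefix) scan."""
    return [sum(decay ** (t - k) * xs[k] for k in range(t + 1))
            for t in range(len(xs))]

xs = [1.0, 2.0, 3.0, 4.0]
seq = rnn_mode(xs)
par = parallel_mode(xs)
# Both modes agree up to floating-point error.
assert all(abs(a - b) < 1e-9 for a, b in zip(seq, par))
```

The same duality is what lets such models train with full-sequence parallelism like a transformer, yet serve tokens one at a time with constant memory like an RNN.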
block-recurrent-transformer-pytorch reviews and mentions
- From Deep to Long Learning
  "That line of research is still going. https://github.com/lucidrains/block-recurrent-transformer-py... I think it is worth continuing research on both fronts."
Stats
lucidrains/block-recurrent-transformer-pytorch is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of block-recurrent-transformer-pytorch is Python.