PaLM-flax Alternatives
Similar projects and alternatives to PaLM-flax
- PaLM-pytorch: Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways
- x-transformers: A concise but complete full-attention transformer with a set of promising experimental features from various papers
- nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in PyTorch
- DALLE-pytorch: Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
- RWKV-LM: RWKV (pronounced RwaKuv) is an RNN with strong LLM performance that can also be trained directly like a GPT transformer (parallelizable); the project is currently at RWKV-7 "Goose". It combines the best of RNNs and transformers: strong performance, linear time, constant space (no KV cache), fast training, infinite ctx_len, and free sentence embeddings.
- soundstorm-pytorch: Implementation of SoundStorm, efficient parallel audio generation from Google DeepMind, in PyTorch
- whisper-timestamped: Multilingual Automatic Speech Recognition with word-level timestamps and confidence scores
PaLM-flax discussion
PaLM-flax reviews and mentions
- [R] Proprietary ML model in research paper
Google, DeepMind, and OpenAI normally provide a section in their research papers for replicating the pre-training and fine-tuning architectures of their models. For example, there is a replication of the pre-training architecture outlined in the LaMDA research paper in PyTorch (https://github.com/conceptofmind/LaMDA-pytorch/blob/main/lamda_pytorch/lamda_pytorch.py), as well as an implementation of Google's SOTA Pathways Language Model (PaLM) in JAX/Flax (https://github.com/conceptofmind/PaLM-flax).
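Since that thread points to PaLM-flax as a JAX/Flax replication, a minimal sketch of PaLM's signature "parallel" transformer block in Flax may help with orientation. This is an illustrative assumption, not code from the PaLM-flax repository: the real model also uses multi-query attention, rotary embeddings, SwiGLU, and causal masking, all omitted here, and the dimensions below are hypothetical.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class ParallelBlock(nn.Module):
    """PaLM-style parallel transformer block (illustrative sketch only)."""
    dim: int           # model width (hypothetical)
    heads: int         # number of attention heads
    ff_mult: int = 4   # feed-forward expansion factor

    @nn.compact
    def __call__(self, x):
        # One shared LayerNorm feeds both branches, as in PaLM:
        # y = x + Attention(LN(x)) + FFN(LN(x))
        h = nn.LayerNorm()(x)

        # Attention branch (standard multi-head self-attention here;
        # PaLM itself uses multi-query attention with rotary embeddings,
        # and a decoder would also need a causal mask).
        attn = nn.MultiHeadDotProductAttention(num_heads=self.heads)(h, h)

        # Feed-forward branch (a plain GELU MLP; PaLM uses SwiGLU).
        ff = nn.Dense(self.dim * self.ff_mult)(h)
        ff = nn.gelu(ff)
        ff = nn.Dense(self.dim)(ff)

        # Parallel residual: both branches are summed into the input.
        return x + attn + ff


# Usage on a dummy batch (hypothetical sizes).
block = ParallelBlock(dim=512, heads=8)
x = jnp.zeros((2, 16, 512))  # (batch, seq_len, dim)
params = block.init(jax.random.PRNGKey(0), x)
y = block.apply(params, x)
print(y.shape)  # (2, 16, 512)
```

The parallel formulation, where the attention and feed-forward branches read the same normalized input instead of being stacked sequentially, is one of PaLM's distinguishing architecture choices; the paper reports it speeds up large-scale training with little quality loss.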
Stats
conceptofmind/PaLM-flax is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of PaLM-flax is Python.