PaLM-flax

Implementation of the SOTA Transformer architecture from PaLM - Scaling Language Modeling with Pathways, in JAX/Flax (by conceptofmind)
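The PaLM paper's best-known architectural change over a standard Transformer block is its "parallel" formulation: the attention and MLP branches both read the same normalized input and their outputs are summed in one residual step, instead of being applied one after the other. A toy, framework-free sketch of the two layouts (the `norm`, `attn`, and `mlp` callables below are scalar stand-ins for illustration, not real attention or MLP layers):

```python
def serial_block(x, norm, attn, mlp):
    # Standard Transformer block: attention first, then MLP,
    # each branch with its own residual connection.
    x = x + attn(norm(x))
    x = x + mlp(norm(x))
    return x

def parallel_block(x, norm, attn, mlp):
    # PaLM-style block: both branches read the same normalized
    # input, and their outputs are added in a single residual step.
    return x + attn(norm(x)) + mlp(norm(x))

if __name__ == "__main__":
    # Scalar stand-ins to show the data flow; real blocks operate on tensors.
    norm = lambda v: v / 2.0
    attn = lambda v: v + 1.0
    mlp = lambda v: v * 3.0
    print(serial_block(4.0, norm, attn, mlp))    # 17.5
    print(parallel_block(4.0, norm, attn, mlp))  # 13.0
```

The parallel form lets the attention and MLP matrix multiplications for a layer be fused and computed concurrently, which the PaLM paper reports speeds up large-scale training with little quality loss.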

PaLM-flax Alternatives

Similar projects and alternatives to PaLM-flax

  1. PaLM-pytorch

    Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways

  2. x-transformers

    A concise but complete full-attention transformer with a set of promising experimental features from various papers

  3. nuwa-pytorch

    Implementation of NÜWA, state-of-the-art attention network for text-to-video synthesis, in PyTorch

  4. DALLE-pytorch

    Implementation / replication of DALL-E, OpenAI's text-to-image Transformer, in PyTorch

  5. RWKV-LM

    RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT Transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNNs and Transformers: great performance, linear time, constant space (no kv-cache), fast training, infinite ctx_len, and free sentence embedding.

  6. soundstorm-pytorch

    Implementation of SoundStorm, efficient parallel audio generation from Google DeepMind, in PyTorch

  7. whisper-timestamped

    Multilingual automatic speech recognition with word-level timestamps and confidence
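The RWKV entry above claims "linear time, constant space (no kv-cache)". The distinction can be sketched abstractly: an autoregressive Transformer must retain every past key/value pair, so per-step memory grows with sequence length, while an RNN folds the whole history into a fixed-size state. A toy illustration (scalar state and a made-up decay rule, not RWKV's actual update formulas):

```python
def transformer_like_memory(tokens):
    # Autoregressive attention appends to a kv-cache every step,
    # so the memory footprint grows linearly with the sequence.
    cache = []
    for t in tokens:
        cache.append(t)   # cache grows by one entry per token
    return len(cache)     # footprint ~ sequence length

def rnn_like_memory(tokens, decay=0.5):
    # An RNN keeps one fixed-size state regardless of sequence
    # length, updating it in place at each step.
    state = 0.0
    for t in tokens:
        state = decay * state + t  # constant-space recurrence
    return state
```

This is why RNN-style models are attractive for long-context inference: the cost of generating each new token does not grow with how much context has already been consumed.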

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives. A higher number therefore suggests a better or more similar PaLM-flax alternative.

PaLM-flax discussion


PaLM-flax reviews and mentions

Posts with mentions or reviews of PaLM-flax. We have used some of these posts to build our list of alternatives and similar projects.
  • [R] Proprietary ML model in research paper
    1 project | /r/MachineLearning | 1 Jul 2022
    Google, DeepMind, and OpenAI normally provide a section in their research papers for replicating the pre-training and fine-tuning architectures of the models. For example, a replication of the pre-training architecture outlined in the LaMDA research paper in PyTorch (https://github.com/conceptofmind/LaMDA-pytorch/blob/main/lamda_pytorch/lamda_pytorch.py) or another implementation of Google's SOTA Pathways Language Model in JAX/Flax (https://github.com/conceptofmind/PaLM-flax).

Stats

Basic PaLM-flax repo stats:
  • Mentions: 1
  • Stars: 14
  • Activity: 4.2
  • Last commit: over 2 years ago

conceptofmind/PaLM-flax is an open source project licensed under the MIT License, which is an OSI-approved license.

The primary programming language of PaLM-flax is Python.

