dm-haiku VS t5x

Compare dm-haiku vs t5x and see how they differ.

                dm-haiku             t5x
Mentions        10                   7
Stars           2,806                2,503
Growth          0.9%                 2.3%
Activity        7.8                  8.5
Latest commit   27 days ago          4 days ago
Language        Python               Python
License         Apache License 2.0   Apache License 2.0
Mentions - the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

dm-haiku

Posts with mentions or reviews of dm-haiku. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    Is t5x an encoder/decoder architecture?

    Some more general options.

    The Flax ecosystem

    https://github.com/google/flax?tab=readme-ov-file

    or dm-haiku

    https://github.com/google-deepmind/dm-haiku

    were some of the best-developed communities in the JAX AI field

    Perhaps the “trax” repo? https://github.com/google/trax

    Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...

    Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
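
    For readers unfamiliar with dm-haiku, a minimal sketch of its core pattern may help: hk.transform turns a module-building function into a pair of pure init/apply functions (the network and sizes below are illustrative, following the project's README):

        import haiku as hk
        import jax
        import jax.numpy as jnp

        # Haiku converts a function that builds modules into pure functions.
        def forward(x):
            mlp = hk.nets.MLP([128, 10])  # illustrative layer sizes
            return mlp(x)

        forward_t = hk.transform(forward)
        rng = jax.random.PRNGKey(42)
        x = jnp.ones([8, 784])
        params = forward_t.init(rng, x)           # a pytree of parameters
        logits = forward_t.apply(params, rng, x)  # pure forward pass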

  • Help with installing python packages.
    3 projects | /r/NixOS | 18 Aug 2022
    I am fresh to NixOS, especially when it comes to using Python on it. How do I install packages without using pip? I need to install numpy~=1.19.5 transformers~=4.8.2 tqdm~=4.45.0 setuptools~=51.3.3 wandb>=0.11.2 einops~=0.3.0 requests~=2.25.1 fabric~=2.6.0 optax==0.0.6 git+https://github.com/deepmind/dm-haiku git+https://github.com/EleutherAI/lm-evaluation-harness/ ray[default]==1.4.1 jax~=0.2.12 Flask~=1.1.2 cloudpickle~=1.3.0 tensorflow-cpu~=2.5.0 google-cloud-storage~=1.36.2 smart_open[gcs] func_timeout ftfy fastapi uvicorn lm_dataformat, which I could just install with pip -r and the requirements file, but I don't know how to do this on NixOS. I would also be using Python 3.7. So far this is what I have come up with, but I know it's wrong:

        { pkgs ? import <nixpkgs> {} }:
        let
          packages = python-packages: with python-packages; [
            mesh-transformer-jax
            jax          # ==0.2.12
            numpy        # ~=1.19.5
            transformers # ~=4.8.2
            tqdm         # ~=4.45.0
            setuptools   # ~=51.3.3
            wandb        # >=0.11.2
            einops       # ~=0.3.0
            requests     # ~=2.25.1
            fabric       # ~=2.6.0
            optax        # ==0.0.6
            # ...the other packages
          ];
        in
        pkgs.mkShell {
          nativeBuildInputs = [ pkgs.buildPackages.python37 ];
        }
  • [D] Should We Be Using JAX in 2022?
    8 projects | /r/MachineLearning | 15 Feb 2022
    What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
  • [D] Current State of JAX vs Pytorch?
    3 projects | /r/MachineLearning | 1 Feb 2022
    Just going to add that you should check out haiku if you are considering JAX: https://github.com/deepmind/dm-haiku
  • PyTorch vs. TensorFlow in 2022
    13 projects | news.ycombinator.com | 14 Dec 2021
    As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
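
    To make "XLA-accelerated NumPy" concrete, here is a minimal sketch of JAX's composable transforms (the loss function is illustrative):

        import jax
        import jax.numpy as jnp

        # A plain NumPy-style loss function...
        def loss(w, x, y):
            return jnp.mean((x @ w - y) ** 2)

        # ...differentiated and XLA-compiled by composing jax.grad and jax.jit.
        grad_fn = jax.jit(jax.grad(loss))
        w, x, y = jnp.zeros(3), jnp.ones((10, 3)), jnp.ones(10)
        print(grad_fn(w, x, y))  # gradient of the loss w.r.t. w
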
  • [D] JAX learning resources?
    4 projects | /r/JAX | 23 Sep 2021
    - https://github.com/deepmind/dm-haiku/tree/main/examples
  • Why would I want to develop yet another deep learning framework?
    4 projects | /r/learnmachinelearning | 16 Sep 2021
  • Help with installing python packages
    6 projects | /r/NixOS | 18 Aug 2021

t5x

Posts with mentions or reviews of t5x. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    [3]: https://github.com/google-research/t5x

    Asking because I have worked extensively on training a large model on a TPU cluster, and started with Levanter, then tried MaxText, and finally ended up on EasyLM. My thoughts are:

    - Levanter is well intentioned but is unproven and lacking in features. For instance, their sharding is odd in that it requires the embedding dimension to be a multiple of the number of devices, so I can't test with a model of embedding dimension 768 on a 512-device pod. I lost confidence in Levanter after finding some glaring correctness bugs (and helping get them fixed). Also, while I'm a huge fan of Equinox's approach, it's sadly underdeveloped; for instance, there's no way to specify non-default weight-initialization strategies without manually doing model surgery to set weights (see the Equinox sketch below).

    - MaxText was just very difficult to work with. We felt like we were fighting against it every time we needed to change something, because we would be digging through numerous needless layers of abstraction. My favorite was after one long day of debugging, I found a function whose only purpose was to pass its arguments to another function untouched; that function's only purpose was to pass its arguments untouched to a third function, which then slightly changed them and passed them to a fourth function that did the work.

    - EasyLM is, as the name says, easy. But on a deeper dive, the sharding functionality seems underdeveloped. What they call "FSDP" is not necessarily true FSDP; it is just a single axis of the JAX mesh that happens to shard some data axes and some model-weight axes (a sketch of this pattern follows below).
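
    To illustrate that pattern, a hedged sketch of a single JAX mesh axis sharding both a batch dimension and a weight dimension (the axis name and shapes are illustrative; dimension sizes must divide the device count):

        import jax
        import jax.numpy as jnp
        import numpy as np
        from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

        # One mesh axis, here called "fsdp", shards whatever dimensions are
        # mapped onto it -- data axes and weight axes alike.
        mesh = Mesh(np.array(jax.devices()), axis_names=("fsdp",))
        x = jax.device_put(jnp.ones((128, 512)),
                           NamedSharding(mesh, P("fsdp", None)))  # shard the batch dim
        w = jax.device_put(jnp.ones((512, 512)),
                           NamedSharding(mesh, P("fsdp", None)))  # shard a weight dim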

    I'm still searching for a "perfect" JAX LLM codebase - any pointers?
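
    To make the Equinox point above concrete, a hedged sketch of the "model surgery" a non-default initialization currently requires, using eqx.tree_at to replace a weight leaf (the init scheme is illustrative):

        import equinox as eqx
        import jax

        key, wkey = jax.random.split(jax.random.PRNGKey(0))
        linear = eqx.nn.Linear(784, 128, key=key)

        # No init-strategy argument exists, so a custom initialization means
        # replacing the weight leaf directly.
        new_w = 0.02 * jax.random.truncated_normal(wkey, -2.0, 2.0, linear.weight.shape)
        linear = eqx.tree_at(lambda m: m.weight, linear, replace=new_w)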

  • Mixtral of Experts
    4 projects | news.ycombinator.com | 11 Dec 2023
    > Are you using a normal training script i.e. "continued pretraining" on ALL parameters with just document fragments rather than input output pairs?

    Yes, this one.

    > do you make a custom dataset that has qa pairs about that particular knowledgebase?

    This one. Once you have a checkpoint with the knowledge baked in, it makes sense to finetune. You can use either LoRA or PEFT; we choose depending on the case. (Some orgs have millions of tokens, and I am not that confident in PEFT there.)

    LoRA with raw document text may not work; I haven't tried that. Google has a good example of training scripts here: https://github.com/google-research/t5x (under "training", and then "finetuning"). I like this one. Facebook Research also has a few on their repo.

    If you are just looking to scrape by, I would suggest just doing what they tell you to do. You can offer suggestions, but better to let them take the call. There is a lot of fluff and a lot of chatter online, so everyone is still figuring things out.

    One note about pretraining is that it is costly, so most OSS devs just do direct finetuning/LoRA. That works because their datasets come from the open internet. Orgs aren't finding much value with these approaches, and yet many communities are filled with these tactics.
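
    LoRA is itself one of the adapter methods implemented in the Hugging Face peft library; a minimal sketch of the finetuning setup described above (the model name and hyperparameters are illustrative):

        from peft import LoraConfig, get_peft_model
        from transformers import AutoModelForSeq2SeqLM

        model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")  # illustrative base model
        config = LoraConfig(r=8, lora_alpha=16,
                            target_modules=["q", "v"])  # T5 attention projections
        model = get_peft_model(model, config)  # freezes the base, adds LoRA adapters
        model.print_trainable_parameters()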

  • Mixtures of Experts
    2 projects | news.ycombinator.com | 9 Oct 2023
    Google have released the models and code for the Switch Transformer from Fedus et al. (2021) under the Apache 2.0 licence. [0]

    There's also OpenMoE - an open-source effort to train a mixture of experts model. Currently they've released a model with 8 billion parameters. [1]

    [0] https://github.com/google-research/t5x/blob/main/docs/models...

    [1] https://github.com/XueFuzhao/OpenMoE
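
    For intuition, a toy sketch of the Switch-style top-1 routing such models use (illustrative only, not the t5x implementation):

        import jax
        import jax.numpy as jnp

        def switch_layer(x, router_w, expert_ws):
            # x: [tokens, d]; router_w: [d, n_experts]; expert_ws: [n_experts, d, h]
            logits = x @ router_w
            expert = jnp.argmax(logits, axis=-1)  # top-1 expert per token
            gate = jnp.take_along_axis(jax.nn.softmax(logits, axis=-1),
                                       expert[:, None], axis=-1)
            # Toy version: apply every expert, then select. Real implementations
            # dispatch each token only to its chosen expert.
            all_out = jnp.einsum("td,edh->teh", x, expert_ws)
            out = jnp.take_along_axis(all_out, expert[:, None, None], axis=1)[:, 0]
            return gate * out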

  • [D] ClosedAI license, open-source license which restricts only OpenAI, Microsoft, Google, and Meta from commercial use
    5 projects | /r/MachineLearning | 7 May 2023
  • [P] T5 Implementation in PyTorch
    3 projects | /r/MachineLearning | 4 Jan 2023
    You can find the official T5x repository by Google AI here: https://github.com/google-research/t5x
  • Google AI Introduces Confident Adaptive Language Modeling (CALM) For 3x Faster Text Generation With Language Models (LMs)
    1 project | /r/machinelearningnews | 20 Dec 2022
    Quick Read: https://www.marktechpost.com/2022/12/20/google-ai-introduces-confident-adaptive-language-modeling-calm-for-3x-faster-text-generation-with-language-models-lms/
    Paper: https://arxiv.org/pdf/2207.07061.pdf
    Code: https://github.com/google-research/t5x/tree/main/t5x/contrib/calm
  • New free open source 20B parameter model (Not GPT Neo) achieves state-of-the-art results (SOTA) and outperforms GPT-3
    2 projects | /r/NovelAi | 12 May 2022
    From Section 9.1 in the paper, it looks like the weights in the Google buckets are associated with the T5X model(s?) here: https://github.com/google-research/t5x

What are some alternatives?

When comparing dm-haiku and t5x, you can also consider the following projects:

flax - Flax is a neural network library for JAX that is designed for flexibility.

google-research - Google Research

jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).

t5-pytorch - Implementation of Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer in PyTorch.

trax - Trax — Deep Learning with Clear Code and Speed

bad-licenses - A compendium of absurd open-source licenses.

equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/

Flux.jl - Relax! Flux is the ML library that doesn't make you tensor

elegy - A High Level API for Deep Learning in JAX

darwin-xnu - Legacy mirror of Darwin Kernel. Replaced by https://github.com/apple-oss-distributions/xnu

jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

OpenMoE - A family of open-sourced Mixture-of-Experts (MoE) Large Language Models