trax VS dm-haiku

Compare trax vs dm-haiku and see what their differences are.

                    trax                  dm-haiku
Mentions            7                     10
Stars               7,957                 2,806
Growth              0.7%                  3.7%
Activity            4.7                   7.8
Last commit         3 months ago          20 days ago
Language            Python                Python
License             Apache License 2.0    Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
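
As an illustration of the recency weighting described above (the exact formula isn't published here, so everything in this sketch is a hypothetical stand-in), a commit score whose weight halves every 30 days could be computed like this:

    def activity_score(commit_ages_days, half_life_days=30.0):
        # Each commit contributes a weight that halves every half_life_days.
        return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

    # Five commits this week score far higher than five from six months ago.
    recent = activity_score([1, 2, 3, 4, 5])            # ~4.7
    stale = activity_score([180, 181, 182, 183, 184])   # ~0.08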

trax

Posts with mentions or reviews of trax. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    Is t5x an encoder/decoder architecture?

    Some more general options: the Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) or dm-haiku (https://github.com/google-deepmind/dm-haiku) were some of the best-developed communities in the JAX AI field.

    Perhaps the “trax” repo? https://github.com/google/trax

    Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...

    Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py

  • Replit's new Code LLM was trained in 1 week
    12 projects | news.ycombinator.com | 3 May 2023
    and the implementation https://github.com/google/trax/blob/master/trax/models/resea... if you are interested.

    Hope you get to look into this!

  • RedPajama: Reproduction of Llama with Friendly License
    4 projects | news.ycombinator.com | 17 Apr 2023
    Thank you for developing the pipeline and amassing considerable compute for gathering and preprocessing this dataset!

    I'm not sure if this is the right place to ask about this, but could you consider training an LLM using a more advanced, sparse transformer architecture (specifically, "Terraformer" from this paper https://arxiv.org/abs/2111.12763 and this codebase https://github.com/google/trax/blob/master/trax/models/resea... by Google Brain and OpenAI)? I understand the pressure to focus on training a straightforward LLaMA replication, but of course you see that it's a legacy dense architecture which limits its inference performance. This new architecture is not just an academic curiosity but is already validated at scale by Google, providing a 10x+ inference performance boost on the same hardware.

    Frankly, the community's compute budget - for training and for inference - isn't infinite, and neither is the public's interest in models that do not have an advantage (at least in convenience) over closed-source ones; so we should use both of those resources as efficiently as possible. It could be a big step forward if you trained at least LLaMA-Terraformer-7B and 13B foundation models on the whole dataset. (A toy sketch of the sparse-FFN idea appears after the post list below.)

  • The founder of Gmail claims that ChatGPT can “kill” Google in two years.
    1 project | /r/Futurology | 31 Jan 2023
    But a couple of years later they came out with open-source implementations, yeah: https://github.com/google/trax/tree/master/trax/models/reformer
  • [D] Paper Explained - Sparse is Enough in Scaling Transformers (aka Terraformer) | Video Walkthrough
    1 project | /r/MachineLearning | 1 Dec 2021
    Code: https://github.com/google/trax/blob/master/trax/examples/Terraformer_from_scratch.ipynb
  • Why would I want to develop yet another deep learning framework?
    4 projects | /r/learnmachinelearning | 16 Sep 2021
  • How to train large models on a normal laptop?
    1 project | /r/LanguageTechnology | 14 Feb 2021
    Training language models is expensive. Train the biggest model you can afford. I assume you've tried the colab from the Reformer GitHub: https://github.com/google/trax/tree/master/trax/models/reformer

    (A minimal trax ReformerLM sketch follows below.)
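
Several of the trax posts above reference the Terraformer's sparse feed-forward layers. Here is a toy, non-authoritative sketch of that idea in plain JAX (this is not the actual trax code; sparse_ffn and all shapes are made up for illustration): a controller keeps one active unit per block of the FFN hidden layer, so real inference kernels can skip the unselected columns, while this version computes densely and masks for clarity.

    import jax
    import jax.numpy as jnp

    def sparse_ffn(x, w1, w2, block_size=4):
        # x: (d_model,), w1: (d_model, d_ff), w2: (d_ff, d_model)
        h = x @ w1                          # dense hidden pre-activations
        blocks = h.reshape(-1, block_size)  # (d_ff // block_size, block_size)
        # Keep only the strongest unit in each block; zero out the rest.
        mask = jax.nn.one_hot(jnp.argmax(blocks, axis=-1), block_size)
        h_sparse = (blocks * mask).reshape(-1)
        return jax.nn.relu(h_sparse) @ w2

    key = jax.random.PRNGKey(0)
    x = jax.random.normal(key, (16,))
    w1 = jax.random.normal(key, (16, 64))
    w2 = jax.random.normal(key, (64, 16))
    y = sparse_ffn(x, w1, w2)               # shape (16,)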
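
And for the Reformer threads above, a minimal sketch of instantiating a small Reformer language model in trax. ReformerLM does live in trax.models, but treat the exact arguments and shapes as assumptions based on recent trax releases rather than a verified recipe; the linked colab has the real configurations.

    import numpy as np
    import trax

    # A deliberately tiny ReformerLM; real configurations are much larger.
    model = trax.models.ReformerLM(
        vocab_size=256,  # byte-level vocabulary (an assumption for this sketch)
        n_layers=2,
        mode='train',
    )

    # Initialize weights from an input signature: a batch of token ids.
    model.init(trax.shapes.ShapeDtype((1, 128), dtype=np.int32))

    # Forward pass over dummy token ids -> per-position log-probabilities.
    logits = model(np.zeros((1, 128), dtype=np.int32))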

dm-haiku

Posts with mentions or reviews of dm-haiku. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    (Same comment as quoted in the trax section above.)

  • Help with installing python packages.
    3 projects | /r/NixOS | 18 Aug 2022
    I am fresh to NixOS, especially when it comes to using Python on it. How do I install packages without using pip? I need to install: numpy~=1.19.5, transformers~=4.8.2, tqdm~=4.45.0, setuptools~=51.3.3, wandb>=0.11.2, einops~=0.3.0, requests~=2.25.1, fabric~=2.6.0, optax==0.0.6, git+https://github.com/deepmind/dm-haiku, git+https://github.com/EleutherAI/lm-evaluation-harness/, ray[default]==1.4.1, jax~=0.2.12, Flask~=1.1.2, cloudpickle~=1.3.0, tensorflow-cpu~=2.5.0, google-cloud-storage~=1.36.2, smart_open[gcs], func_timeout, ftfy, fastapi, uvicorn, and lm_dataformat. Normally I could just do pip -r thetxtfile, but I don't know how to do that on NixOS. I would also be using Python 3.7. So far this is what I have come up with, but I know it's wrong:

    { pkgs ? import <nixpkgs> {} }:
    let
      packages = python-packages: with python-packages; [
        # mesh-transformer-jax
        jax           # ~=0.2.12
        numpy         # ~=1.19.5
        transformers  # ~=4.8.2
        tqdm          # ~=4.45.0
        setuptools    # ~=51.3.3
        wandb         # >=0.11.2
        einops        # ~=0.3.0
        requests      # ~=2.25.1
        fabric        # ~=2.6.0
        optax         # ==0.0.6
        # ...the other packages
      ];
    in
    pkgs.mkShell {
      nativeBuildInputs = [ pkgs.buildPackages.python37 ];
    }
  • [D] Should We Be Using JAX in 2022?
    8 projects | /r/MachineLearning | 15 Feb 2022
    What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
  • [D] Current State of JAX vs Pytorch?
    3 projects | /r/MachineLearning | 1 Feb 2022
    Just going to add that you should check out haiku if you are considering JAX: https://github.com/deepmind/dm-haiku
  • PyTorch vs. TensorFlow in 2022
    13 projects | news.ycombinator.com | 14 Dec 2021
    As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but it only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :) (A minimal Haiku example follows the post list below.)
  • [D] JAX learning resources?
    4 projects | /r/JAX | 23 Sep 2021
    - https://github.com/deepmind/dm-haiku/tree/main/examples
  • Why would I want to develop yet another deep learning framework?
    4 projects | /r/learnmachinelearning | 16 Sep 2021
  • Help with installing python packages
    6 projects | /r/NixOS | 18 Aug 2021
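
Since several of the posts above recommend Haiku as the layer on top of JAX, here is a minimal, self-contained example of dm-haiku's core pattern (transforming a module-building function into a pure init/apply pair); the layer sizes are arbitrary:

    import haiku as hk
    import jax
    import jax.numpy as jnp

    def forward(x):
        # Modules must be created inside the function being transformed.
        mlp = hk.nets.MLP([128, 10])
        return mlp(x)

    # hk.transform yields a pure (init, apply) pair; without_apply_rng drops
    # the rng argument from apply, since forward is deterministic.
    model = hk.without_apply_rng(hk.transform(forward))

    x = jnp.ones([8, 784])
    params = model.init(jax.random.PRNGKey(42), x)
    logits = model.apply(params, x)  # shape (8, 10)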

What are some alternatives?

When comparing trax and dm-haiku you can also consider the following projects:

flax - Flax is a neural network library for JAX that is designed for flexibility. (A minimal linen example, for contrast with the Haiku sketch above, follows this list.)

muzero-general - MuZero

jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).

extending-jax - Extending JAX with custom C++ and CUDA code

equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/

ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX

elegy - A High Level API for Deep Learning in JAX

objax

jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

numpyro - Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU.

jaxline
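
For contrast with the Haiku example above, here is the same toy model in Flax's linen API (sizes again arbitrary). The main stylistic difference: Flax modules are classes with parameters defined inline via nn.compact, rather than functions wrapped by a transform.

    import flax.linen as nn
    import jax
    import jax.numpy as jnp

    class MLP(nn.Module):
        @nn.compact
        def __call__(self, x):
            x = nn.relu(nn.Dense(128)(x))
            return nn.Dense(10)(x)

    model = MLP()
    x = jnp.ones([8, 784])
    params = model.init(jax.random.PRNGKey(42), x)
    logits = model.apply(params, x)  # shape (8, 10)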