ML-Optimizers-JAX VS dm-haiku

Compare ML-Optimizers-JAX vs dm-haiku and see what their differences are.

                     ML-Optimizers-JAX     dm-haiku
Mentions             1                     10
Stars                40                    2,806
Growth               -                     3.7%
Activity             4.5                   8.0
Latest commit        almost 3 years ago    15 days ago
Language             Python                Python
License              -                     Apache License 2.0
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

ML-Optimizers-JAX

Posts with mentions or reviews of ML-Optimizers-JAX. We have used some of these posts to build our list of alternatives and similar projects.

dm-haiku

Posts with mentions or reviews of dm-haiku. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    Is t5x an encoder/decoder architecture?

    Some more general options.

    The Flax ecosystem

    https://github.com/google/flax?tab=readme-ov-file

    or dm-haiku

    https://github.com/google-deepmind/dm-haiku

    were some of the best-developed communities in the JAX AI field

    Perhaps the “trax” repo? https://github.com/google/trax

    Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...

    Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py

  • Help with installing python packages.
    3 projects | /r/NixOS | 18 Aug 2022
    I am new to NixOS, especially when it comes to using Python on it. How do I install packages without using pip? I need to install: numpy~=1.19.5 transformers~=4.8.2 tqdm~=4.45.0 setuptools~=51.3.3 wandb>=0.11.2 einops~=0.3.0 requests~=2.25.1 fabric~=2.6.0 optax==0.0.6 git+https://github.com/deepmind/dm-haiku git+https://github.com/EleutherAI/lm-evaluation-harness/ ray[default]==1.4.1 jax~=0.2.12 Flask~=1.1.2 cloudpickle~=1.3.0 tensorflow-cpu~=2.5.0 google-cloud-storage~=1.36.2 smart_open[gcs] func_timeout ftfy fastapi uvicorn lm_dataformat

    Normally I could just run pip -r on the requirements file, but I don't know how to do this on NixOS. I would also be using Python 3.7. So far this is what I have come up with, but I know it's wrong:

    { pkgs ? import <nixpkgs> {} }:
    let packages = python-packages: with python-packages; [
      mesh-transformer-jax/ jax==0.2.12 numpy~=1.19.5 transformers~=4.8.2 tqdm~=4.45.0 setuptools~=51.3.3 wandb>=0.11.2 einops~=0.3.0 requests~=2.25.1 fabric~=2.6.0 optax==0.0.6
      # the other packages
    ];
    pkgs.mkShell {
      nativeBuildInputs = [ pkgs.buildPackages.python37 ];
    }
  • [D] Should We Be Using JAX in 2022?
    8 projects | /r/MachineLearning | 15 Feb 2022
    What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
  • [D] Current State of JAX vs Pytorch?
    3 projects | /r/MachineLearning | 1 Feb 2022
    Just going to add that you should check out haiku if you are considering JAX: https://github.com/deepmind/dm-haiku
  • PyTorch vs. TensorFlow in 2022
    13 projects | news.ycombinator.com | 14 Dec 2021
    As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but it only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
    (A minimal sketch of this Haiku-on-JAX pattern appears after this list of posts.)
  • [D] JAX learning resources?
    4 projects | /r/JAX | 23 Sep 2021
    - https://github.com/deepmind/dm-haiku/tree/main/examples
  • Why would I want to develop yet another deep learning framework?
    4 projects | /r/learnmachinelearning | 16 Sep 2021
  • Help with installing python packages
    6 projects | /r/NixOS | 18 Aug 2021
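
Several of the posts above describe the same division of labor: JAX provides the low-level, XLA-accelerated array transforms, while a library such as dm-haiku supplies the neural-network abstractions on top of it. Below is a minimal sketch of that pattern, assuming haiku and jax are installed; the layer sizes, input shape, and names are made up for illustration.

    import jax
    import jax.numpy as jnp
    import haiku as hk

    def forward(x):
        # Build a small MLP with Haiku's built-in module (sizes are arbitrary).
        mlp = hk.nets.MLP([32, 10])
        return mlp(x)

    # hk.transform turns the module-building function into a pair of pure
    # functions (init, apply) that plain JAX transformations can work with.
    model = hk.transform(forward)

    rng = jax.random.PRNGKey(42)
    x = jnp.ones([8, 784])
    params = model.init(rng, x)           # initialize parameters
    logits = model.apply(params, rng, x)  # forward pass
    print(logits.shape)                   # (8, 10)

Because model.apply is a pure function of its parameters and inputs, it composes directly with jax.jit, jax.grad, and jax.vmap, which is the point the PyTorch-vs-TensorFlow comment above is making.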

What are some alternatives?

When comparing ML-Optimizers-JAX and dm-haiku you can also consider the following projects:

RAdam - On the Variance of the Adaptive Learning Rate and Beyond

flax - Flax is a neural network library for JAX that is designed for flexibility.

DemonRangerOptimizer - Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay

jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).

trax - Trax — Deep Learning with Clear Code and Speed

AdasOptimizer - ADAS is short for Adaptive Step Size. Unlike optimizers that simply normalize the derivative, it fine-tunes the step size, making step-size scheduling obsolete and achieving state-of-the-art training performance.

equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/

dnn_from_scratch - A high-level deep learning library for Convolutional Neural Networks, GANs and more, made from scratch (numpy/cupy implementation).

elegy - A High Level API for Deep Learning in JAX

flaxOptimizers - A collection of optimizers, some arcane others well known, for Flax.

jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more