| | entity-table | flax |
|---|---|---|
| Mentions | 1 | 10 |
| Stars | 0 | 5,680 |
| Growth | - | 2.8% |
| Activity | 10.0 | 9.7 |
| Latest commit | over 1 year ago | 7 days ago |
| Language | Rust | Python |
| License | MIT License | Apache License 2.0 |
- Stars - the number of stars that a project has on GitHub.
- Growth - month-over-month growth in stars.
- Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options.
The Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) or dm-haiku (https://github.com/google-deepmind/dm-haiku) were some of the best-developed communities in the JAX AI field.
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
-
What is the JAX/Flax equivalent of torch.nn.Parameter?
https://github.com/google/flax/discussions/919 https://flax.readthedocs.io/en/latest/_modules/flax/linen/attention.html
-
Announcing flax 0.2 - A fully featured ECS
Just as an FYI, you might be competing against another big open source project with the same name https://github.com/google/flax
-
Flax: How to use one linen module inside another for training?
I have asked the same question on the Flax discussion page on Github as well.
-
[D] Should We Be Using JAX in 2022?
What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
-
PyTorch vs. TensorFlow in 2022
As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
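The "XLA-accelerated NumPy with composable transformations" point can be sketched in a few lines of plain JAX (no Haiku or Flax needed); the `loss` function here is an illustrative least-squares example, not from the comment:

```python
import jax
import jax.numpy as jnp

# JAX on its own is NumPy plus composable transformations:
# grad for autodiff, jit for XLA compilation, vmap for batching.
def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

grad_fn = jax.jit(jax.grad(loss))   # XLA-compiled gradient of the loss

w = jnp.zeros((3,))
x = jnp.ones((4, 3))
y = jnp.ones((4,))
g = grad_fn(w, x, y)                # gradient has the same shape as w
```

Libraries like Flax and Haiku add the neural-network layer on top of exactly these primitives: parameter management, modules, and initializers.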
[D] Getting Started with Deep Learning in JAX with Treex in 16 lines
-
[D] JAX learning resources?
- https://github.com/google/flax/tree/main/examples
- Why would I want to develop yet another deep learning framework?
-
[D] Why is tensorflow so hated on and pytorch is the cool kids framework?
Any thoughts on Flax?
What are some alternatives?
flax - Batteries included ECS library for rust with entity relations and much more
dm-haiku - JAX-based neural network library
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
trax - Trax — Deep Learning with Clear Code and Speed
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
objax
tf-transformers - State of the art faster Transformer with Tensorflow 2.0 (NLP, Computer Vision, Audio).
jaxline
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
captum - Model interpretability and understanding for PyTorch