XLA.jl vs flax

| | XLA.jl | flax |
|---|---|---|
| Mentions | 2 | 10 |
| Stars | 46 | 5,545 |
| Growth | - | 2.4% |
| Activity | 10.0 | 9.7 |
| Latest commit | almost 4 years ago | 5 days ago |
| Language | Julia | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
XLA.jl
- PyTorch vs. TensorFlow in 2022
- Supercharged high-resolution ocean simulation with Jax
https://github.com/FluxML/XLA.jl
When in doubt, piggybacking on (or at least interoperating with) what the large technology companies are investing in is probably savvy, sort of like what the OP did.
flax
- Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options.
The Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) or dm-haiku (https://github.com/google-deepmind/dm-haiku) were some of the best-developed communities in the JAX AI field.
Perhaps the "trax" repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
- What is the JAX/Flax equivalent of torch.nn.Parameter?
https://github.com/google/flax/discussions/919 https://flax.readthedocs.io/en/latest/_modules/flax/linen/attention.html
- Announcing flax 0.2 - A fully featured ECS
Just as an FYI, you might be competing against another big open-source project with the same name: https://github.com/google/flax
- Flax: How to use one linen module inside another for training?
I have asked the same question on the Flax discussion page on GitHub as well.
- [D] Should We Be Using JAX in 2022?
What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
- PyTorch vs. TensorFlow in 2022
As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but it only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
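To make that division of labor concrete, here is what the low-level JAX layer alone looks like: NumPy-style array code composed with `jit` (XLA compilation) and `grad` (autodiff), with no module or parameter management. A sketch under made-up shapes and a made-up loss function:

```python
import jax
import jax.numpy as jnp

# A plain function of arrays; jax.grad differentiates it and
# jax.jit compiles the composed result through XLA.
def loss(w, x, y):
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros((3,))
x = jnp.ones((4, 3))
y = jnp.ones((4,))
g = grad_fn(w, x, y)  # gradient of the mean squared error w.r.t. w
```

Everything a library like Flax or Haiku adds (modules, parameter trees, initializers) is bookkeeping layered on top of functions like this.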
- [D] Getting Started with Deep Learning in JAX with Treex in 16 lines
- [D] JAX learning resources?
- https://github.com/google/flax/tree/main/examples
- Why would I want to develop yet another deep learning framework?
- [D] Why is tensorflow so hated on and pytorch is the cool kids framework?
Any thoughts on Flax?
What are some alternatives?
MATDaemon.jl
dm-haiku - JAX-based neural network library
pyhpc-benchmarks - A suite of benchmarks for CPU and GPU performance of the most popular high-performance libraries for Python :rocket:
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
Oceananigans.jl - 🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs
trax - Trax - Deep Learning with Clear Code and Speed
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
DataProfiler - What's in your data? Extract schema, statistics and entities from datasets
objax