Top 23 Python Jax Projects
-
jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
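To illustrate the "composable transformations" the description mentions, here is a minimal sketch (assuming `jax` is installed) combining `grad`, `vmap`, and `jit`:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(x ** 2)  # simple scalar-valued function

grad_f = jax.grad(f)                             # differentiate
batched = jax.vmap(jax.grad(lambda x: x ** 2))   # vectorize a per-element gradient
fast_f = jax.jit(f)                              # JIT-compile for CPU/GPU/TPU

x = jnp.arange(3.0)
print(grad_f(x))   # d/dx sum(x^2) = 2x -> [0. 2. 4.]
print(batched(x))  # elementwise 2x  -> [0. 2. 4.]
print(fast_f(x))   # 5.0
```

The point of the design is that these transformations compose: `jax.jit(jax.vmap(jax.grad(f)))` is itself a valid transformation.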
-
d2l-en
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
-
einops
Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
-
foolbox
A Python toolbox to create adversarial examples that fool neural networks in PyTorch, TensorFlow, and JAX
-
EasyLM
Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
-
pennylane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
-
numpyro
Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU.
-
equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
-
machine_learning_refined
Notes, examples, and Python demos for the 2nd edition of the textbook "Machine Learning Refined" (published by Cambridge University Press).
-
TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper and Ada GPUs, to provide better performance with lower memory utilization in both training and inference.
Project mention: Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding | news.ycombinator.com | 2024-04-21

The HuggingFace transformers library already has support for a similar method called prompt lookup decoding, which uses the existing context to generate an n-gram model: https://github.com/huggingface/transformers/issues/27722
I don't think it would be that hard to switch it out for a pretrained ngram model.
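The core of prompt lookup decoding is simple enough to sketch in plain Python: match the trailing n-gram of the generated text against earlier positions in the context and propose the tokens that followed as draft continuations for the full model to verify. (Function and parameter names here are hypothetical; the real implementation lives behind the transformers issue linked above.)

```python
def prompt_lookup(tokens, ngram_size=2, num_draft=3):
    """Propose draft tokens by matching the trailing n-gram earlier in the context."""
    suffix = tokens[-ngram_size:]
    # Search right-to-left, skipping the trailing n-gram's match with itself.
    for start in range(len(tokens) - ngram_size - 1, -1, -1):
        if tokens[start:start + ngram_size] == suffix:
            follow = tokens[start + ngram_size:start + ngram_size + num_draft]
            if follow:
                return follow  # draft tokens, to be verified by the full model
    return []  # no earlier occurrence: fall back to ordinary decoding

ctx = ["the", "cat", "sat", "on", "the", "mat", "and", "the", "cat"]
print(prompt_lookup(ctx))  # ['sat', 'on', 'the']
```

Swapping this context-based lookup for a pretrained n-gram model would mean replacing the scan with a query against precomputed n-gram statistics, which is why the switch seems straightforward.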
The dual numbers exist just as surely as the real numbers and have been used for well over 100 years:
https://en.m.wikipedia.org/wiki/Dual_number
PyTorch has had them for many years:
https://pytorch.org/docs/stable/generated/torch.autograd.for...
JAX implements them and uses them exactly as stated in this thread.
https://github.com/google/jax/discussions/10157#discussionco...
As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
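For readers following this thread: forward-mode autodiff via dual numbers fits in a few lines of plain Python. Each value carries an infinitesimal coefficient, and ordinary arithmetic propagates exact derivatives (a minimal sketch of the idea, not JAX's or PyTorch's actual implementation):

```python
class Dual:
    """a + b*eps with eps**2 == 0; the eps coefficient carries the derivative."""
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 == 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__

def derivative(f, x):
    return f(Dual(x, 1.0)).eps  # seed eps=1 to get df/dx

print(derivative(lambda x: x * x * x + 2 * x, 2.0))  # 3*2**2 + 2 = 14.0
```

The product rule falls out of `eps**2 == 0` with no symbolic manipulation, which is exactly the mechanism the linked JAX discussion describes.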
See also https://github.com/unifyai/ivy, which I have not tried but which seems along the lines of what you are describing: it works with all the major frameworks,
and the implementation https://github.com/google/trax/blob/master/trax/models/resea... if you are interested.
Hope you get to look into this!
Project mention: Einops: Flexible and powerful tensor operations for readable and reliable code | news.ycombinator.com | 2023-12-12
https://github.com/google/flax/discussions/919 https://flax.readthedocs.io/en/latest/_modules/flax/linen/attention.html
Project mention: JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation | news.ycombinator.com | 2023-09-28

Agree, though I wouldn’t call PyTorch a drop-in for NumPy either. CuPy is the drop-in. Excepting some corner cases, you can use the same code for both. Thinc’s ops work with both NumPy and CuPy:
https://github.com/explosion/thinc/blob/master/thinc/backend...
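The "drop-in" claim can be made concrete with the usual array-module indirection, which is essentially the pattern Thinc's ops abstraction generalises (a sketch that runs on CPU whether or not CuPy is installed):

```python
import numpy

try:
    import cupy
    xp = cupy   # GPU arrays with a NumPy-compatible API
except ImportError:
    xp = numpy  # identical code runs on CPU

def softmax(x):
    e = xp.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

probs = softmax(xp.asarray([[1.0, 2.0, 3.0]]))
print(float(probs.sum()))  # 1.0
```

Apart from the import, nothing in `softmax` knows which backend it is running on; that is the sense in which CuPy, unlike PyTorch, is a drop-in.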
Go ahead and play with any of the adversarial attacks from https://github.com/bethgelab/foolbox: you will not find an attack that is both robust to perturbations and almost visually imperceptible.
Project mention: How To Fine-Tune LLaMA, OpenLLaMA, And XGen, With JAX On A GPU Or A TPU | /r/LocalLLaMA | 2023-07-04
I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).
At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!
But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream-job offer, so I accepted.
Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.
[1] https://github.com/patrick-kidger/equinox
Project mention: Benchmarking Large Language Models on NVIDIA H100 GPUs with CoreWeave (Part 1) | /r/nvidia | 2023-04-30

The 4090 now has its 8-bit float enabled as well; see the [Transformer Engine issue](https://github.com/NVIDIA/TransformerEngine/issues/15).
Python Jax related posts
- Julia GPU-based ODE solver 20x-100x faster than those in Jax and PyTorch
- [P] LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite
- Apple releases MLX for Apple Silicon
- About Monte Carlo tree search in Jax
- MLPerf training tests put Nvidia ahead, Intel close, and Google well behind
- [P] Optimistix, nonlinear optimisation in JAX+Equinox!
- Show HN: Optimistix: Nonlinear Optimisation in Jax+Equinox
Index
What are some of the best open-source Jax projects in Python? This list will help you:
| # | Project | Stars |
|---|---|---|
| 1 | transformers | 124,557 |
| 2 | Keras | 60,902 |
| 3 | jax | 27,842 |
| 4 | d2l-en | 21,628 |
| 5 | best-of-ml-python | 15,302 |
| 6 | ivy | 14,022 |
| 7 | trax | 7,953 |
| 8 | einops | 7,897 |
| 9 | flax | 5,497 |
| 10 | datasets | 4,162 |
| 11 | scenic | 2,985 |
| 12 | alpa | 2,979 |
| 13 | dm-haiku | 2,806 |
| 14 | thinc | 2,787 |
| 15 | foolbox | 2,655 |
| 16 | deepxde | 2,328 |
| 17 | EasyLM | 2,221 |
| 18 | mctx | 2,201 |
| 19 | pennylane | 2,106 |
| 20 | numpyro | 2,033 |
| 21 | equinox | 1,789 |
| 22 | machine_learning_refined | 1,584 |
| 23 | TransformerEngine | 1,411 |