thinc
extending-jax
| | thinc | extending-jax |
|---|---|---|
| Mentions | 4 | 2 |
| Stars | 2,789 | 352 |
| Growth | 0.5% | - |
| Activity | 7.6 | 3.5 |
| Latest commit | 3 days ago | 6 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
thinc
- JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
Agree, though I wouldn't call PyTorch a drop-in for NumPy either. CuPy is the drop-in: excepting some corner cases, you can use the same code for both. Thinc's ops work with both NumPy and CuPy:
https://github.com/explosion/thinc/blob/master/thinc/backend...
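The drop-in claim can be illustrated with the array-module pattern that backends like Thinc's ops build on: write code against a module object `xp`, then pass in `numpy` on CPU or `cupy` on GPU. A minimal sketch (the helper name `normalize_rows` is just for illustration, not part of either library's API):

```python
import numpy as np

def normalize_rows(xp, X):
    # `xp` is an array module: numpy or cupy. Both expose the same
    # functions and array methods, so this body works unchanged on
    # CPU (numpy) or GPU (cupy).
    norms = xp.sqrt((X ** 2).sum(axis=1, keepdims=True))
    return X / norms

X = np.asarray([[3.0, 4.0], [0.0, 2.0]])
print(normalize_rows(np, X))  # on GPU: normalize_rows(cupy, cupy.asarray(...))
```

Swapping the backend is then a one-line change at the call site rather than a rewrite, which is what makes CuPy (unlike PyTorch) a genuine drop-in.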
- Tinygrad: A simple and powerful neural network framework
I love these tiny DNN frameworks; some examples I studied in the past (I still use PyTorch for work-related projects):
thinc, by the creators of spaCy: https://github.com/explosion/thinc
- Good examples of functional-like Python code that one can study?
thinc - defining neural nets in a functional way. JAX, a new deep learning framework, puts the emphasis on functions rather than tensors. I've tested it for a couple of applications and it's really cool: you can write stuff like you'd write math expressions in papers, using numpy. That speeds up development significantly and makes the code much more readable.
- thinc - A refreshing functional take on deep learning, compatible with your favorite libraries
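The "write it like the math in the paper" point holds largely because `jax.numpy` deliberately mirrors NumPy's API. A minimal sketch in plain NumPy (the same function body also runs under `import jax.numpy`, where it additionally becomes differentiable and JIT-compilable):

```python
import numpy as np

def softmax(z):
    # Written the way the formula appears in papers:
    #   softmax(z)_i = exp(z_i) / sum_j exp(z_j)
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

probs = softmax(np.array([1.0, 2.0, 3.0]))
print(probs)
```

Because the code is a near-transcription of the formula, checking it against the paper is trivial, which is the readability win the comment describes.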
extending-jax
- [D] Should We Be Using JAX in 2022?
You can check out this or this for more info. I think it is safe to assume that it is less stable than PyTorch; some other commenters have mentioned running into trouble with XLA in certain corner cases, but I have not experienced this myself, so I can't speak to it.
- Extending JAX with custom C++ and CUDA code
What are some alternatives?
quantulum3 - Library for unit extraction - fork of quantulum for python3
einops - Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
mpi4jax - Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applications in Python :zap:
horovod - Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
dm-haiku - JAX-based neural network library
trax - Trax — Deep Learning with Clear Code and Speed
AIF360 - A comprehensive set of fairness metrics for datasets and machine learning models, explanations for these metrics, and algorithms to mitigate bias in datasets and models.
elegy - A High Level API for Deep Learning in JAX
textacy - NLP, before and after spaCy
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/