jax-models vs flaxmodels

| | jax-models | flaxmodels |
|---|---|---|
| Mentions | 6 | 1 |
| Stars | 138 | 223 |
| Growth | - | - |
| Activity | 0.0 | 3.1 |
| Latest commit | almost 2 years ago | 9 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jax-models

- **[D] How to contribute to open source ML and DL without having access to high quality setup?**
  "I was in the same position as you are and the best thing you can do is to start reproducing papers (that's what I did with jax-models). This will…"
- **[D] Should We Be Using JAX in 2022?**
  "I've been using JAX, especially Flax, for quite some time now for my reproducibility initiative (jax_models) and this is what I really appreciate about the framework…"
- Weekly updated open sourced model implementations in Flax
- Weekly updated open sourced deep learning model implementations in Flax
- [P] Weekly updated open sourced model implementations in Flax
flaxmodels

- **[P] Training StyleGAN2 in Jax (FFHQ and Anime Faces)**
  "A while ago I posted here regarding a repository containing some pretrained models implemented in Jax/Flax. I decided to add training code for these models. Here is the training code for StyleGAN2: https://github.com/matthias-wright/flaxmodels/tree/main/training/stylegan2"
What are some alternatives?

- datasets - TFDS is a collection of datasets ready to use with TensorFlow, Jax, ...
- jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
- flax - Flax is a neural network library for JAX that is designed for flexibility.
- stylegan2-pytorch - Implementation of Analyzing and Improving the Image Quality of StyleGAN (StyleGAN 2) in PyTorch
- equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
- long-range-arena - Long Range Arena for Benchmarking Efficient Transformers
- GradCache - Run Effective Large Batch Contrastive Learning Beyond GPU/TPU Memory Constraint
- ffhq-dataset - Flickr-Faces-HQ Dataset (FFHQ)
- elegy - A High Level API for Deep Learning in JAX
- transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
- jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
- prompt-tuning - Original Implementation of Prompt Tuning from Lester, et al., 2021