| | fortuna | GradCache |
|---|---|---|
| Mentions | 5 | 1 |
| Stars | 855 | 308 |
| Growth | 1.9% | - |
| Activity | 8.2 | 4.5 |
| Latest commit | 22 days ago | about 1 month ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
fortuna
- 🚀 AWS launches Fortuna, an open-source library for Uncertainty Quantification
- [P] 🚀 AWS launches Fortuna, an open-source library for Uncertainty Quantification
What is the best end-to-end example showing it? https://github.com/awslabs/fortuna/blob/main/examples/mnist_classification.ipynb ? It would be nice to have some visual explainer, as in https://github.com/aangelopoulos/conformal_classification .
- AWS Fortuna, an open-source library for Uncertainty Quantification
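The conformal_classification repo referenced in the comment above wraps a classifier so it outputs prediction sets with a coverage guarantee. The core split-conformal recipe behind such wrappers can be sketched in a few lines of numpy; everything here (the Dirichlet-sampled "softmax scores", the sample sizes, the simple 1 − p(true class) score) is a synthetic stand-in for illustration, not the repo's exact method, which uses a more refined set construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_classes = 500, 10

# Synthetic calibration set: softmax-like score vectors and true labels.
probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
labels = rng.integers(0, n_classes, size=n_cal)

alpha = 0.1  # target: sets contain the true class ~90% of the time

# Nonconformity score: 1 minus the probability assigned to the true class.
scores = 1.0 - probs[np.arange(n_cal), labels]

# Finite-sample-corrected quantile of the calibration scores.
level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q = np.quantile(scores, level, method="higher")

def prediction_set(p):
    # Include every class whose nonconformity score falls under the threshold.
    return np.nonzero(1.0 - p <= q)[0]

test_p = rng.dirichlet(np.ones(n_classes))
pred = prediction_set(test_p)
```

The guarantee is distribution-free: as long as calibration and test points are exchangeable, the set contains the true class with probability at least 1 − α, regardless of how bad the underlying classifier is.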
GradCache
- [D] How to handle absurd batch sizes in SimCLR / OpenAI's CLIP?
Maybe you should try this thing: https://github.com/luyug/GradCache
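The trick GradCache implements for those huge contrastive batches can be illustrated without any framework: because the loss couples samples only through their embeddings, you can first compute the gradient of the loss with respect to the embeddings on the full batch, cache it, and then accumulate parameter gradients chunk by chunk. The numpy sketch below uses a toy linear encoder and a toy batch-coupled loss; the names and the loss are illustrative, not GradCache's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d_in, d_out, chunk = 8, 4, 3, 2
X = rng.normal(size=(N, d_in))
W = rng.normal(size=(d_out, d_in))  # toy linear "encoder": z = W x

# Step 1: embed the full batch (in practice this is done chunk-by-chunk
# with no activation storage, so memory stays bounded).
Z = X @ W.T                          # (N, d_out)

# Toy batch-coupled loss standing in for a contrastive objective:
# L = ||Z Z^T - I||_F^2 ties every sample to every other sample.
S = Z @ Z.T - np.eye(N)
# Step 2: gradient of the loss w.r.t. the embeddings only (the "cache").
dL_dZ = 4.0 * S @ Z                  # dL/dZ = 2(S + S^T)Z = 4SZ, S symmetric

# Full-batch parameter gradient, for reference.
G_full = dL_dZ.T @ X

# Step 3: re-process each small chunk and backprop it against the cached
# dL/dZ slice, accumulating into the parameter gradient.
G_chunked = np.zeros_like(W)
for start in range(0, N, chunk):
    sl = slice(start, start + chunk)
    G_chunked += dL_dZ[sl].T @ X[sl]

assert np.allclose(G_full, G_chunked)
```

The accumulated chunk gradients equal the full-batch gradient exactly, which is why the technique reproduces large-batch contrastive training on small-memory GPUs, at the cost of one extra forward pass per chunk.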
What are some alternatives?
surface_normal_uncertainty - (ICCV 2021 - oral) Estimating and Exploiting the Aleatoric Uncertainty in Surface Normal Estimation
h-former - H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for each glyph. Its decoder is composed of multiple independent decoders which act on a code vector to reconstruct a point cloud representing a glyph.
uq-vae - Solving Bayesian Inverse Problems via Variational Autoencoders
jax-models - Unofficial JAX implementations of deep learning research papers
pytorch-forecasting - Time series forecasting with PyTorch
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
deep-kernel-transfer - Official pytorch implementation of the paper "Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels" (NeurIPS 2020)
EasyLM - Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, finetuning, evaluating, and serving LLMs in JAX/Flax.
jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
long-range-arena - Long Range Arena for Benchmarking Efficient Transformers
conformal_classification - Wrapper for a PyTorch classifier which allows it to output prediction sets. The sets are theoretically guaranteed to contain the true class with high probability (via conformal prediction).
flaxmodels - Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc.