imagenette vs DemonRangerOptimizer

| | imagenette | DemonRangerOptimizer |
|---|---|---|
| Mentions | 9 | 1 |
| Stars | 877 | 23 |
| Growth | 0.0% | - |
| Activity | 0.0 | 0.0 |
| Latest Commit | over 1 year ago | over 3 years ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
imagenette
-
[P] Graph path traversal with semantic graphs
This idea isn't exclusive to text; the same can be done for images. See this example from the imagenette dataset.
-
Ask HN: In 2022, what is the proper way to get into machine/deep learning?
FastAI has ready-to-run code that does just this. They seem to have an ImageNet package https://github.com/fastai/imagenette
-
How can I download ImageNet dataset with only 20 or 30 classes?
Imagenette is a smaller subset with only 10 classes. https://github.com/fastai/imagenette
You can try this https://github.com/fastai/imagenette, which is a subset of the main dataset.
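For reference, Imagenette's 10 classes are identified by their ImageNet (WordNet) synset IDs, which also serve as the class directory names once the dataset is extracted. A minimal sketch of mapping a file path to a readable label — the `label_for` helper and the example path are illustrative, not part of the repo (with fastai installed, `untar_data(URLs.IMAGENETTE)` handles downloading and extraction):

```python
# The 10 Imagenette classes, keyed by their ImageNet (WordNet) synset IDs.
# These IDs double as the class subdirectory names in the extracted dataset.
IMAGENETTE_CLASSES = {
    "n01440764": "tench",
    "n02102040": "English springer",
    "n02979186": "cassette player",
    "n03000684": "chain saw",
    "n03028079": "church",
    "n03394916": "French horn",
    "n03417042": "garbage truck",
    "n03425413": "gas pump",
    "n03445777": "golf ball",
    "n03888257": "parachute",
}

def label_for(path):
    """Map a path like 'train/n03445777/xyz.JPEG' to a readable label."""
    for part in path.split("/"):
        if part in IMAGENETTE_CLASSES:
            return IMAGENETTE_CLASSES[part]
    raise KeyError(f"no Imagenette synset ID found in {path!r}")

print(label_for("train/n03445777/n03445777_1234.JPEG"))  # golf ball
```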
-
[D] How can I download ImageNet dataset with only 20 or 30 classes?
Download the entire ImageNet dataset and its annotations. Filter out all annotations that do not contain the classes you're interested in. Or consider Imagenette.
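That filtering step might look like the following sketch, assuming the annotations have been flattened into (filename, synset-ID) pairs — `filter_annotations` and the sample records are hypothetical, not part of any ImageNet tooling:

```python
def filter_annotations(annotations, wanted_classes):
    """Keep only annotation records whose class ID (WordNet synset ID)
    is in the set of classes we care about."""
    wanted = set(wanted_classes)
    return [(fname, wnid) for fname, wnid in annotations if wnid in wanted]

# Example with placeholder records:
anns = [("img_0001.JPEG", "n01440764"),
        ("img_0002.JPEG", "n02084071"),
        ("img_0003.JPEG", "n01440764")]
kept = filter_annotations(anns, {"n01440764"})
print(kept)  # only the two n01440764 records remain
```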
-
[R] Dataset for research paper
You can try Imagenette, which is a 10-class subset of ImageNet but with the same number of images per class. There are also two datasets linked in the README for increased difficulty.
-
ResNet from scratch - ImageNet
If you're looking for something more bite-sized, how about Imagenette?
-
[D] Tiny-Imagenet original size images
Fastai has made something like that. Does this fit the bill? https://github.com/fastai/imagenette
-
[R] AdasOptimizer Update: Cifar-100+MobileNetV2 Adas generalizes with Adas 15% better and 9x faster than Adam
You don't need ImageNet to verify whether it really works. Check out https://github.com/fastai/imagenette — the fastai folks have a small subset of ImageNet, with three variants of the dataset; test on them. If AdasOptimizer really works, you should be able to beat their results, or at least see where it stands.
DemonRangerOptimizer
-
[R] AdasOptimizer Update: Cifar-100+MobileNetV2 Adas generalizes with Adas 15% better and 9x faster than Adam
The results are interesting, but in terms of the novelty of the main theory — isn't it almost identical to Baydin et al.? https://arxiv.org/pdf/1703.04782.pdf It seems the difference may lie in implementation details, like using a running average for the past gradient. If it's useful, I implemented a bunch of optimizers with options to combine different techniques (https://github.com/JRC1995/DemonRangerOptimizer), including hypergradient updates (taking into account decoupled weight decay and per-parameter learning rates for the hypergradient learning rate), when I was bored, before practically abandoning it altogether. I didn't really run any experiments with it myself; a few people tried it, though without any particularly striking results.
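The core of Baydin et al.'s hypergradient descent is updating the learning rate itself by gradient descent, using the dot product of consecutive gradients as the hypergradient. A toy sketch in plain Python (an illustration of the idea, not the linked repo's implementation):

```python
def hypergrad_sgd(grad_fn, x, alpha=0.01, beta=1e-4, steps=100):
    """SGD with hypergradient descent on the learning rate (Baydin et al.).
    The learning rate alpha is itself adapted by gradient descent: the
    hypergradient d(loss)/d(alpha) is approximated by -g_t . g_{t-1},
    so alpha grows while consecutive gradients point the same way."""
    g_prev = [0.0] * len(x)
    for _ in range(steps):
        g = grad_fn(x)
        # Hypergradient step on the learning rate itself.
        alpha += beta * sum(a * b for a, b in zip(g, g_prev))
        # Ordinary SGD step with the adapted learning rate.
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_prev = g
    return x, alpha

# Minimize f(x) = sum(x_i^2), whose gradient is 2x; alpha ramps up
# from its small initial value and the iterate converges to the origin.
x, alpha = hypergrad_sgd(lambda x: [2 * xi for xi in x], [5.0, -3.0])
```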
What are some alternatives?
tiny-imagenet
pytorch-optimizer - torch-optimizer -- collection of optimizers for Pytorch
ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX
AdasOptimizer - ADAS is short for Adaptive Step Size; unlike other optimizers, which just normalize the derivative, it fine-tunes the step size, truly making step-size scheduling obsolete and achieving state-of-the-art training performance
txtai - 💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows
Gradient-Centralization-TensorFlow - Instantly improve your training performance of TensorFlow models with just 2 lines of code!
RAdam - On the Variance of the Adaptive Learning Rate and Beyond