biprop
Identify a binary-weight (or binary weight-and-activation) subnetwork within a randomly initialized network by only pruning and binarizing it. (by chrundle)
assembled-cnn
Tensorflow implementation of "Compounding the Performance Improvements of Assembled Techniques in a Convolutional Neural Network" (by clovaai)
| | biprop | assembled-cnn |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 44 | 330 |
| Growth | - | 0.6% |
| Activity | 0.0 | 0.0 |
| Last commit | about 2 years ago | over 3 years ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
biprop
Posts with mentions or reviews of biprop.
We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-07-26.
- [R] Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning A Randomly Weighted Network
  Code for https://arxiv.org/abs/2103.09377 found: https://github.com/chrundle/biprop
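The core idea behind biprop, finding an accurate binary subnetwork inside a randomly weighted network by pruning and binarizing rather than training the weights, can be illustrated with a minimal NumPy toy. This is a rough sketch only: the `binarize_and_prune` helper and the magnitude-based pruning score are assumptions for illustration, not the paper's actual algorithm (which learns pruning scores during optimization).

```python
import numpy as np

def binarize_and_prune(weights, keep_ratio=0.5):
    """Illustrative sketch (not biprop's actual method): keep the top
    keep_ratio fraction of weights by magnitude, a stand-in for learned
    pruning scores, and binarize the survivors to their sign.
    Returns a tensor with values in {-1, 0, +1}."""
    flat = np.abs(weights).ravel()
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(flat, -k)[-k]   # k-th largest magnitude
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    return mask * np.sign(weights)

# A randomly initialized 4x4 layer; keep 25% of its connections.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4))
subnet = binarize_and_prune(w, keep_ratio=0.25)
```

The pruned-and-binarized layer keeps only the selected connections, each reduced to +1 or -1, which is what makes the resulting subnetwork cheap to store and evaluate.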
assembled-cnn
Posts with mentions or reviews of assembled-cnn.
We have used some of these posts to build our list of alternatives and similar projects.
- [R] ResNet strikes back: An improved training procedure in timm. There has been significant progress on best practices for training neural nets since ResNet's introduction in 2015. With such advances, a vanilla ResNet-50 reaches 80.4% top-1 accuracy on ImageNet without extra data or distillation.
  As far as I know, Assemble-ResNet-50 (https://github.com/clovaai/assembled-cnn) reaches 82.8% top-1, though it makes some minor changes to the ResNet-50 architecture.
What are some alternatives?
When comparing biprop and assembled-cnn you can also consider the following projects:
mmpretrain - OpenMMLab Pre-training Toolbox and Benchmark
cvat - Annotate better with CVAT, the industry-leading data engine for machine learning. Used and trusted by teams at any scale, for data of any scale. [Moved to: https://github.com/cvat-ai/cvat]
TensorLayer - Deep Learning and Reinforcement Learning Library for Scientists and Engineers
Naruto_Handsign_Classification - Naruto Hand Gesture Recognition with OpenCV and Transfer Learning
autogluon - Fast and Accurate ML in 3 Lines of Code
One-Piece-Image-Classifier - A quick image classifier trained with manually selected One Piece images.