Lottery_Ticket_Hypothesis-TensorFlow_2
Implementing "The Lottery Ticket Hypothesis" paper by "Jonathan Frankle, Michael Carbin" (by arjun-majumdar)
Neural_Network_Pruning
Implementations of different neural network pruning techniques (by arjun-majumdar)
| | Lottery_Ticket_Hypothesis-TensorFlow_2 | Neural_Network_Pruning |
|---|---|---|
| Mentions | 6 | 4 |
| Stars | 33 | 12 |
| Growth | - | - |
| Activity | 4.1 | 2.3 |
| Last commit | about 1 month ago | 9 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | - | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Lottery_Ticket_Hypothesis-TensorFlow_2
Posts with mentions or reviews of Lottery_Ticket_Hypothesis-TensorFlow_2.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2021-05-10.
-
Freeze certain weights - TensorFlow 2
I have already implemented "The Lottery Ticket Hypothesis" by Frankle et al. using TensorFlow 2; you can refer to the code here. A binary (0/1) mask is applied via element-wise multiplication to keep the number of pruned parameters constant, because by default the gradient-descent weight update modifies all of the weights, including the pruned ones.
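The masking idea above can be sketched as follows. This is a hedged, minimal illustration: the toy loss and the hard-coded mask values are made up, not taken from the linked repository, where the mask would come from magnitude pruning.

```python
import tensorflow as tf

# Minimal sketch: freeze pruned weights by re-applying a fixed binary mask
# (1 = keep, 0 = pruned) after each gradient update.
weights = tf.Variable(tf.random.normal((4, 4)))
mask = tf.constant([[1., 0., 1., 1.]] * 4)  # illustrative mask, not from real pruning

optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    loss = tf.reduce_sum(tf.square(weights))  # toy loss for illustration
grads = tape.gradient(loss, [weights])
optimizer.apply_gradients(zip(grads, [weights]))

# Re-apply the mask after the update so pruned positions stay exactly zero.
weights.assign(weights * mask)
```

Without the final `assign`, the optimizer step would have moved the pruned entries away from zero, which is exactly the problem the element-wise multiplication solves.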
-
[R] Remove pruned connections
You can refer to some of my recent experiments on GitHub: the Lottery Ticket Hypothesis implementation and Neural Network Pruning.
-
TensorFlow Lite: RuntimeError
I am using TensorFlow version 2.3.0 and Python 3. I am experimenting with quantizing a pruned and trained Conv-2 CNN model. The model architecture is conv -> conv -> max pool -> dense -> dense -> output, trained on CIFAR-10. You can see the Jupyter notebook here.
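For context, post-training quantization with `tf.lite.TFLiteConverter` looks roughly like this. The architecture below is only illustrative, not the exact Conv-2 model from the linked notebook:

```python
import tensorflow as tf

# Hedged sketch: post-training dynamic-range quantization of a small CNN.
# The layer sizes here are made up for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation='relu'),
    tf.keras.layers.Conv2D(8, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(10),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()  # serialized, quantized .tflite model (bytes)
```

The resulting bytes can be written to a `.tflite` file and loaded with the TFLite interpreter.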
-
Iterative Pruning: LeNet-300-100 - PyTorch
The code can be accessed here
-
Neural Network Compression - Implementation benefits
here
-
ValueError: TensorFlow2 Input 0 is incompatible with layer model
True, removing the he_normal initialization does increase the accuracy. In most of my previous experiments I have used the kernel initialization specified in the respective authors' papers, so for ResNet I used Kaiming He initialization, since He is an author of the paper. The default kernel initializer in TF2, however, is 'glorot_uniform', which leads to 60.04% val_accuracy.
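The two setups above differ only in the kernel initializer passed to the layer. A minimal sketch (the layer sizes are illustrative):

```python
import tensorflow as tf

# Hedged sketch: swapping kernel initializers in Keras.
# 'glorot_uniform' is the Keras default; 'he_normal' is Kaiming He
# initialization, as used in the ResNet paper.
def make_layer(init):
    return tf.keras.layers.Dense(64, activation='relu', kernel_initializer=init)

default_layer = make_layer('glorot_uniform')  # TF2 / Keras default
he_layer = make_layer('he_normal')            # Kaiming He initialization
```

The same `kernel_initializer` argument works on `Conv2D` and the other Keras layers.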
Neural_Network_Pruning
Posts with mentions or reviews of Neural_Network_Pruning.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2021-05-10.
-
ResNet-50 PyTorch Pruning
I used global, absolute-magnitude-weight, unstructured and iterative pruning on ResNet-50 with transfer learning on the CIFAR-10 dataset. Surprisingly, a sparsity of 99.078% was achieved. The code can be found here.
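Global absolute-magnitude (L1) unstructured pruning, and the kind of global-sparsity figure quoted above, can be sketched like this. The toy model and the 90% pruning amount are made up; this is not the ResNet-50 setup from the post:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hedged sketch: global, L1-magnitude, unstructured pruning on a toy model,
# followed by a global sparsity computation.
model = nn.Sequential(nn.Linear(20, 10), nn.Linear(10, 2))
params = [(m, 'weight') for m in model if isinstance(m, nn.Linear)]
prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=0.9)

zeros = sum(int(torch.sum(m.weight == 0)) for m, _ in params)
total = sum(m.weight.nelement() for m, _ in params)
sparsity = 100.0 * zeros / total  # percentage of weights pruned globally
```

Because the threshold is chosen globally, individual layers can end up with very different sparsity levels even though the overall fraction matches `amount`.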
-
[R] Remove pruned connections
You can refer to some of my recent experiments on GitHub: the Lottery Ticket Hypothesis implementation and Neural Network Pruning.
-
ResNet-18 Pruning PyTorch
I have implemented "global, unstructured & iterative" pruning using ResNet-18 trained from scratch on the CIFAR-10 dataset in PyTorch. You can find the code here. Let me know your comments/thoughts.
-
Pruning tutorial
I looked into the torch.nn.utils.prune module, but its documentation doesn't present an end-to-end example, and the code I came up with doesn't seem to work.
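A minimal end-to-end use of `torch.nn.utils.prune` can be sketched as follows; the layer sizes and data are made up for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hedged sketch: prune 50% of a layer's weights by L1 magnitude, train
# briefly (masked weights stay zero), then make the pruning permanent.
layer = nn.Linear(8, 4)
prune.l1_unstructured(layer, name='weight', amount=0.5)  # adds weight_orig + weight_mask

opt = torch.optim.SGD(layer.parameters(), lr=0.1)
x, y = torch.randn(16, 8), torch.randn(16, 4)
for _ in range(3):
    opt.zero_grad()
    loss = nn.functional.mse_loss(layer(x), y)
    loss.backward()
    opt.step()  # updates weight_orig; the mask keeps pruned entries at zero

prune.remove(layer, 'weight')  # fold the mask into .weight permanently
```

While the pruning reparametrization is active, `layer.weight` is recomputed as `weight_orig * weight_mask` on every forward pass; `prune.remove` bakes the result back into a plain `.weight` parameter.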
What are some alternatives?
When comparing Lottery_Ticket_Hypothesis-TensorFlow_2 and Neural_Network_Pruning you can also consider the following projects:
labml - 🔎 Monitor deep learning model training and hardware usage from your mobile phone 📱