Lottery_Ticket_Hypothesis-TensorFlow_2
Implementation of the paper "The Lottery Ticket Hypothesis" by Jonathan Frankle and Michael Carbin (by arjun-majumdar)
Generalizing-Lottery-Tickets
This repository contains code to replicate the experiments from the NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers" (by varungohil)
| | Lottery_Ticket_Hypothesis-TensorFlow_2 | Generalizing-Lottery-Tickets |
|---|---|---|
| Mentions | 6 | 1 |
| Stars | 33 | 50 |
| Growth | - | - |
| Activity | 4.1 | 0.0 |
| Last commit | about 1 month ago | almost 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | - | MIT License |
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Lottery_Ticket_Hypothesis-TensorFlow_2
Posts with mentions or reviews of Lottery_Ticket_Hypothesis-TensorFlow_2. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-05-10.
- Freeze certain weights - TensorFlow 2

I have already implemented "The Lottery Ticket Hypothesis" by Frankle et al. using TensorFlow 2. You can refer to the code here. A binary (0/1) mask is applied element-wise to the weights to keep the number of pruned parameters constant, because by default the gradient-descent weight-update rule updates all of the weights, including the ones that were pruned.
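As a rough illustration of that masking idea, here is a minimal TensorFlow 2 sketch (not the repository's exact code); the model, data shape, and pruning fraction are assumptions made up for the example.

```python
import tensorflow as tf

# Rough sketch (not the repository's exact code): freeze pruned weights by
# zeroing their gradients with a fixed binary mask and re-applying the mask
# after each optimizer step.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(300, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# One binary mask per trainable tensor; here roughly 80% of each tensor is "pruned".
masks = [tf.cast(tf.random.uniform(w.shape) > 0.8, tf.float32)
         for w in model.trainable_variables]

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    # Zero the gradients of pruned weights so the optimizer never updates them.
    masked_grads = [g * m for g, m in zip(grads, masks)]
    optimizer.apply_gradients(zip(masked_grads, model.trainable_variables))
    # Re-apply the mask so pruned weights stay exactly zero.
    for w, m in zip(model.trainable_variables, masks):
        w.assign(w * m)
    return loss
```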
- [R] Remove pruned connections

Some of my recent experiments are on GitHub: the Lottery Ticket Hypothesis implementation and Neural Network Pruning.
- TensorFlow Lite: RuntimeError

I am using TensorFlow version 2.3.0 and Python 3. I am experimenting with quantizing a pruned and trained Conv-2 CNN model. The model architecture is: conv -> conv -> max pool -> dense -> dense -> output, trained on CIFAR-10. You can see the Jupyter notebook here.
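For reference, post-training quantization of such a model with the TFLite converter can be sketched roughly as below; it assumes the trained Keras model is in a variable named `model`, and the output filename is illustrative.

```python
import tensorflow as tf

# Minimal sketch: convert a trained Keras model (e.g. the pruned Conv-2 CNN,
# assumed to be in `model`) to TensorFlow Lite with post-training
# dynamic-range quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Illustrative output path.
with open("conv2_pruned_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```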
- Iterative Pruning: LeNet-300-100 - PyTorch

The code can be accessed here.
- Neural Network Compression - Implementation benefits

here
- ValueError: TensorFlow2 Input 0 is incompatible with layer model

True, removing the he_normal initialization does increase the accuracy. For most of my previous experiments I have used the kernel initialization mentioned in the respective authors' papers; for ResNet I therefore chose Kaiming He initialization, since Kaiming He is an author of that paper. The default kernel initializer in TF2, however, is 'glorot_uniform', which leads to 60.04% validation accuracy.
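For context, this is how the two initializers are selected in Keras; the layer and its sizes here are illustrative, not taken from the repository.

```python
import tensorflow as tf

# Minimal sketch: the same Conv2D layer with the two kernel initializers
# discussed above. 'glorot_uniform' is the Keras default; 'he_normal' is
# Kaiming He initialization.
conv_default = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")
conv_he = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu",
                                 kernel_initializer="he_normal")
```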
Generalizing-Lottery-Tickets
Posts with mentions or reviews of Generalizing-Lottery-Tickets. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-03-29.
- Avoid overfitting in iterative pruning [D]

Code for https://arxiv.org/abs/1906.02773 can be found at: https://github.com/varungohil/Generalizing-Lottery-Tickets
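For readers new to the procedure, a rough sketch of iterative magnitude pruning with weight rewinding (the lottery-ticket loop) is shown below; the pruning rate, number of rounds, and the `train_fn` placeholder are illustrative assumptions, not the repository's configuration.

```python
import numpy as np

# Minimal sketch of iterative magnitude pruning with weight rewinding (the
# lottery-ticket loop). `train_fn` is a placeholder for any training loop that
# keeps masked weights at zero (e.g. by masking gradients).
def iterative_prune(model, init_weights, train_fn, rounds=5, rate=0.2):
    masks = [np.ones(w.shape, dtype=np.float32) for w in model.get_weights()]
    for _ in range(rounds):
        train_fn(model, masks)                    # train while respecting the masks
        weights = model.get_weights()
        for i, (w, m) in enumerate(zip(weights, masks)):
            if w.ndim < 2:                        # leave biases unpruned
                continue
            alive = np.abs(w[m == 1])
            if alive.size == 0:
                continue
            threshold = np.quantile(alive, rate)  # prune the lowest-magnitude fraction
            masks[i] = np.where(np.abs(w) < threshold, 0.0, m).astype(np.float32)
        # Rewind the surviving weights to their original initialization.
        model.set_weights([w0 * m for w0, m in zip(init_weights, masks)])
    return masks

# Usage (illustrative): capture init_weights = [w.copy() for w in model.get_weights()]
# right after building the model, before any training.
```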
What are some alternatives?
When comparing Lottery_Ticket_Hypothesis-TensorFlow_2 and Generalizing-Lottery-Tickets you can also consider the following projects:
labml - 🔎 Monitor deep learning model training and hardware usage from your mobile phone 📱
Lottery-Ticket-Hypothesis-for-DNNs - This repo aims to provide an easy-to-use interface for searching the lottery ticket of a DNN structure.
Neural_Network_Pruning - Implementations of different neural network pruning techniques
GRAN - Efficient Graph Generation with Graph Recurrent Attention Networks, Deep Generative Model of Graphs, Graph Neural Networks, NeurIPS 2019
distiller - Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
deconstructing-lottery-tickets