optimum-intel vs distiller

| | optimum-intel | distiller |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 328 | 4,096 |
| Growth | 8.8% | - |
| Activity | 9.6 | 10.0 |
| Latest commit | 5 days ago | over 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
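To make that rating concrete, here is a minimal sketch of how a recency-weighted, percentile-based activity score could be computed. This is only an illustrative interpretation of the description above, not the site's actual formula; `weighted_commit_score`, `activity_rating`, and `half_life_days` are invented names for this example.

```python
# Illustrative interpretation of a recency-weighted activity rating; this is
# NOT the comparison site's actual formula, just a sketch of the idea above.
from datetime import date

def weighted_commit_score(commit_dates, today=None, half_life_days=90):
    """Sum commit weights that decay with age, so recent commits count more."""
    today = today or date.today()
    return sum(0.5 ** ((today - d).days / half_life_days) for d in commit_dates)

def activity_rating(score, all_scores):
    """Map a project's score to a 0-10 rating by its percentile among tracked projects."""
    rank = sum(s <= score for s in all_scores) / len(all_scores)
    return round(10 * rank, 1)

# A rating of 9.0 then means the project out-scores roughly 90% of tracked projects.
```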
optimum-intel

- Mentioned in: "If Stable Diffusion 'stores' images in lossy compression, as per the lawsuit's claim, how can you retrieve the original training images?"

  > No, I haven't. There's an article from Intel about doing it with some of their tools, though (code is here).
distiller

- Mentioned in: "Avoid overfitting in iterative pruning [D]"

  > Code for https://arxiv.org/abs/1510.00149 found: https://github.com/NervanaSystems/distiller
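The thread above concerns iterative pruning, which distiller normally drives through its YAML compression schedules. Below is a generic, minimal sketch of the idea (prune a fraction of the smallest-magnitude weights, fine-tune briefly, then prune again) using PyTorch's built-in `torch.nn.utils.prune` utilities rather than distiller's own API; the model, optimizer, and `train_loader` are placeholder assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Placeholder model and optimizer; swap in your own network and data loader.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def fine_tune(model, loader, epochs=1):
    # Keep the fine-tuning pass short and watch validation loss between
    # pruning steps to avoid over-fitting to the training set.
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

for step in range(5):  # each step prunes 20% of the remaining weights
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.2)
    fine_tune(model, train_loader)  # train_loader: an assumed DataLoader

# Make the pruning masks permanent once the target sparsity is reached.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
```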
What are some alternatives?
WhitenBlackBox - Towards Reverse-Engineering Black-Box Neural Networks, ICLR'18
aimet - AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
deepsparse - Sparsity-aware deep learning inference runtime for CPUs
Lottery-Ticket-Hypothesis-for-DNNs - This repo aims to provide an easy-to-use interface for searching the lottery ticket of a DNN structure.
dl-colab-notebooks - Try out deep learning models online on Google Colab
Generalizing-Lottery-Tickets - This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
deconstructing-lottery-tickets