| | git-re-basin | CSL |
|---|---|---|
| Mentions | 9 | 2 |
| Stars | 438 | 515 |
| Growth | - | - |
| Activity | 3.5 | 1.9 |
| Latest commit | about 1 year ago | 11 months ago |
| Language | Python | Python |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
git-re-basin
- Merge-Stable-Diffusion-models-without-distortion-gui
Implementation: https://github.com/samuela/git-re-basin
- I'm testing whether the 1.5 and 2.0 models combine in Automatic1111 now...
- I love SD but the pain is real
Wouldn't "applying the permutation" simply swap all the parameters in a model so they match on both models? For example, in https://github.com/samuela/git-re-basin/blob/main/src/cifar10_vgg_weight_matching.py, on line 184 they apply the permutation, and on line 192 they lerp from model A's params to the permuted model B's params. This lerp is basically a weighted sum merge, isn't it? At a lerp of 0.5, it would be somewhere in between model A and the permuted model B.
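The comment above is describing the permute-then-interpolate idea correctly. A minimal NumPy sketch of that idea, for a toy two-layer network (the function and variable names here are illustrative, not git-re-basin's actual API): permuting a hidden layer's units reorders the rows of the incoming weight matrix and the columns of the outgoing one, which leaves the network's function unchanged, and the merge is then an ordinary lerp between model A and the permuted model B.

```python
import numpy as np

def permute_hidden_units(w_in, w_out, perm):
    """Reorder hidden units: permute rows of the incoming weights and
    columns of the outgoing weights so the network computes the same
    function. (Illustrative helper, not from the git-re-basin repo.)"""
    return w_in[perm, :], w_out[:, perm]

def lerp(a, b, t):
    """Linear interpolation: t=0 returns a, t=1 returns b."""
    return (1.0 - t) * a + t * b

rng = np.random.default_rng(0)
# Toy models: x -> relu(W1 x) -> W2, with 4 hidden units.
w1_a, w2_a = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
w1_b, w2_b = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))

# Stand-in for the permutation that weight matching would find.
perm = np.array([2, 0, 3, 1])
w1_bp, w2_bp = permute_hidden_units(w1_b, w2_b, perm)

# Permuting hidden units leaves model B's function unchanged:
x = rng.normal(size=3)
out_b  = w2_b  @ np.maximum(w1_b  @ x, 0)
out_bp = w2_bp @ np.maximum(w1_bp @ x, 0)
assert np.allclose(out_b, out_bp)

# The merge is then exactly a weighted sum of A and the permuted B;
# at t=0.5 it sits halfway between the two, as the comment says.
merged_w1 = lerp(w1_a, w1_bp, 0.5)
merged_w2 = lerp(w2_a, w2_bp, 0.5)
```

So yes: the lerp step is a plain weighted-sum merge; the permutation step just re-aligns model B's hidden units with model A's before averaging.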
- Not really working, poorly coded sparse tensor compression of Dreambooth models. Help appreciated, code in comments
Definitely interesting, but you might get something useful out of https://github.com/samuela/git-re-basin ?
- Git Re-Basin: Merging models and preserving latent spaces (i.e. not the A1111 linear interpolation)
- Most Popular AI Research Sept 2022 - Ranked Based On Total GitHub Stars
Git Re-Basin: Merging Models modulo Permutation Symmetries https://github.com/samuela/git-re-basin https://arxiv.org/abs/2209.04836v1
- [D] Most Popular AI Research Sept 2022 - Ranked Based On GitHub Stars
- Git Re-Basin: Merging Models Modulo Permutation Symmetries
CSL
What are some alternatives?
VToonify - [SIGGRAPH Asia 2022] VToonify: Controllable High-Resolution Portrait Video Style Transfer
iris - Transformers are Sample-Efficient World Models. ICLR 2023, notable top 5%.
artbot-for-stable-diffusion - A front-end GUI for interacting with the AI Horde / Stable Diffusion distributed cluster
whisper - Robust Speech Recognition via Large-Scale Weak Supervision
storydalle
setfit - Efficient few-shot learning with Sentence Transformers
Text2Light - [SIGGRAPH Asia 2022] Text2Light: Zero-Shot Text-Driven HDR Panorama Generation
hivemind - Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
motion-diffusion-model - The official PyTorch implementation of the paper "Human Motion Diffusion Model"