notebooks vs Colab-deep-daze

| | notebooks | Colab-deep-daze |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 41 | 1 |
| Growth | - | - |
| Activity | 5.2 | 5.8 |
| Latest commit | 6 months ago | over 3 years ago |
| Language | Jupyter Notebook | Python |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
notebooks
(Added Feb. 5, 2021) CLIP + TADNE (pytorch) v2 - Colaboratory by nagolinc. Uses TADNE ("This Anime Does Not Exist") to generate images. Instructions and examples. GitHub. Notebook copy by levindabhi.
Colab-deep-daze
(Added Feb. 24, 2021) Colab-deep-daze - Colaboratory by styler00dollar. Uses SIREN to generate images. I did not get this notebook to work, but your results may vary. GitHub.
"Big Sleep", a self-portrait by Big Sleep
That notebook didn't work for me either when I added it to the list on Feb. 24. The notebook file Colab-Deep-Daze.ipynb hasn't changed in about 3 months.
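Colab-deep-daze (like the deep-daze tool listed under the alternatives below) generates images by treating a SIREN network as the image itself: the network maps pixel coordinates to colors, and its weights are optimized so that CLIP rates the rendered output as similar to a text prompt. The sketch below is a heavily simplified illustration of that loop, not code from either project; the CLIP model choice, prompt, network size, learning rate, and step count are all illustrative assumptions.

```python
# Minimal sketch of the CLIP + SIREN technique behind deep-daze / Colab-deep-daze.
# Assumes PyTorch >= 1.10 and OpenAI's clip package (pip install git+https://github.com/openai/CLIP.git).
import torch
import torch.nn as nn
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
perceptor, _ = clip.load("ViT-B/32", device=device, jit=False)
perceptor = perceptor.eval().float()
perceptor.requires_grad_(False)  # only the SIREN weights are trained

class TinySiren(nn.Module):
    """Simplified SIREN-style MLP mapping (x, y) coordinates to RGB.
    Omits the original paper's weight-initialization scheme."""
    def __init__(self, hidden=256, depth=5, w0=30.0):
        super().__init__()
        dims = [2] + [hidden] * depth + [3]
        self.w0 = w0
        self.layers = nn.ModuleList(nn.Linear(a, b) for a, b in zip(dims[:-1], dims[1:]))

    def forward(self, coords):
        h = coords
        for layer in self.layers[:-1]:
            h = torch.sin(self.w0 * layer(h))
        return torch.sigmoid(self.layers[-1](h))

size = 224  # input resolution expected by CLIP ViT-B/32
lin = torch.linspace(-1, 1, size, device=device)
ys, xs = torch.meshgrid(lin, lin, indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)

# CLIP's published image-normalization constants
mean = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

siren = TinySiren().to(device)
optimizer = torch.optim.Adam(siren.parameters(), lr=1e-4)

with torch.no_grad():
    text = clip.tokenize(["a watercolor painting of a fox"]).to(device)  # illustrative prompt
    text_features = perceptor.encode_text(text)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)

for step in range(200):  # real runs use far more iterations
    image = siren(coords).reshape(size, size, 3).permute(2, 0, 1).unsqueeze(0)
    image_features = perceptor.encode_image((image - mean) / std)
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    loss = -(image_features * text_features).sum()  # maximize CLIP similarity to the prompt
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```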
What are some alternatives?
aphantasia - CLIP + FFT/DWT/RGB = text to image/video
CLIP-Style-Transfer - Doing style transfer with linguistic features using OpenAI's CLIP.
VectorAscent - Generate vector graphics from a textual caption
AuViMi - AuViMi stands for audio-visual mirror. The idea is to have CLIP generate its interpretation of what your webcam sees, combined with the words that are spoken.
Colab-BigGANxCLIP
stylegan2-clip-approach - Navigating StyleGAN2's W latent space using CLIP
random-colabs
clip_biggan
TediGAN - [CVPR 2021] Pytorch implementation for TediGAN: Text-Guided Diverse Face Image Generation and Manipulation
clipping-CLIP-to-GAN
deep-daze - Simple command line tool for text to image generation using OpenAI's CLIP and SIREN (implicit neural representation network). The technique was originally created by https://twitter.com/advadnoun
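Of these alternatives, deep-daze is the tool Colab-deep-daze is named after. Its README documents a simple Python entry point; a minimal usage sketch based on that README follows (the prompt and num_layers value are illustrative, and parameter names may vary between releases).

```python
# Minimal deep-daze usage per its README; install with `pip install deep-daze`.
from deep_daze import Imagine

imagine = Imagine(
    text="a house in the forest",  # illustrative prompt
    num_layers=24,                  # illustrative network depth
)
imagine()  # runs the CLIP-guided SIREN optimization, saving images as it trains
```

The package also exposes an imagine command-line entry point that wraps the same class, e.g. running it with a quoted text prompt.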