collection-of-notebooks vs clipping-CLIP-to-GAN
| | collection-of-notebooks | clipping-CLIP-to-GAN |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 103 | 40 |
| Growth | - | - |
| Activity | 1.8 | 10.0 |
| Last commit | about 1 year ago | over 3 years ago |
| Language | Jupyter Notebook | Python |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
collection-of-notebooks
(Added Feb. 5, 2021) Text2Image_v3 - a Colaboratory notebook by tg_bomze (also on GitHub). Uses BigGAN (default) or Sigmoid to generate images.
clipping-CLIP-to-GAN
(Added Feb. 24, 2021) clipping-CLIP-to-GAN by cloneofsimo. Uses FastGAN to generate images.
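Neither entry spells out the mechanics, but both projects follow the same general recipe: CLIP-guided latent optimization, where a frozen generator (FastGAN here, BigGAN in the Text2Image notebooks) and a frozen CLIP model stay fixed while a latent vector is optimized so that CLIP scores the generated image highly against a text prompt. The sketch below illustrates that loop; the ToyGenerator stand-in, latent size, step count, and learning rate are illustrative assumptions, not clipping-CLIP-to-GAN's actual FastGAN code.

```python
# Minimal sketch of CLIP-guided GAN latent optimization (illustrative assumptions only).
# Requires PyTorch and OpenAI's CLIP: pip install git+https://github.com/openai/CLIP.git
import torch
import torch.nn.functional as F
import clip


class ToyGenerator(torch.nn.Module):
    """Stand-in for a pretrained generator (e.g. FastGAN); purely illustrative."""

    def __init__(self, z_dim=256):
        super().__init__()
        self.fc = torch.nn.Linear(z_dim, 3 * 32 * 32)

    def forward(self, z):
        x = torch.tanh(self.fc(z)).view(-1, 3, 32, 32)   # image in [-1, 1]
        return F.interpolate(x, size=224, mode="bilinear", align_corners=False)


device = "cuda" if torch.cuda.is_available() else "cpu"
clip_model, _ = clip.load("ViT-B/32", device=device)     # frozen CLIP scorer
clip_model.requires_grad_(False)
generator = ToyGenerator().to(device)                    # frozen generator
generator.requires_grad_(False)

prompt = clip.tokenize(["a watercolor painting of a fox"]).to(device)
with torch.no_grad():
    text_feat = F.normalize(clip_model.encode_text(prompt), dim=-1)

# CLIP's published preprocessing statistics.
mean = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

z = torch.randn(1, 256, device=device, requires_grad=True)   # the only trainable tensor
opt = torch.optim.Adam([z], lr=0.05)

for step in range(200):
    img = (generator(z) + 1) / 2                              # map [-1, 1] -> [0, 1]
    img_feat = F.normalize(clip_model.encode_image((img - mean) / std), dim=-1)
    loss = -(img_feat * text_feat).sum()                      # maximize cosine similarity
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Real runs swap ToyGenerator for pretrained FastGAN or BigGAN weights, and CLIP-guidance projects typically feed several random crops or augmentations of the image into CLIP per step to stabilize the optimization.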
What are some alternatives?
aphantasia - CLIP + FFT/DWT/RGB = text to image/video (a minimal sketch of the FFT-parameterization idea follows this list)
DALLECLIP
stylized-neural-painting - Official Pytorch implementation of the preprint paper "Stylized Neural Painting", in CVPR 2021.
CLIP-Style-Transfer - Doing style transfer with linguistic features using OpenAI's CLIP.
StyleCLIP - Official Implementation for "StyleCLIP: Text-Driven Manipulation of StyleGAN Imagery" (ICCV 2021 Oral)
TediGAN - [CVPR 2021] Pytorch implementation for TediGAN: Text-Guided Diverse Face Image Generation and Manipulation
big-sleep - A simple command line tool for text to image generation, using OpenAI's CLIP and a BigGAN. Technique was originally created by https://twitter.com/advadnoun
VectorAscent - Generate vector graphics from a textual caption
Colab-deep-daze - Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network)
stylegan2-clip-approach - Navigating StyleGAN2 w latent space using CLIP
AuViMi - AuViMi stands for audio-visual mirror. The idea is to have CLIP generate its interpretation of what your webcam sees, combined with the words that are spoken.
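As a contrast to the GAN-based loop sketched earlier, the aphantasia entry's "CLIP + FFT" formulation drops the generator and instead optimizes the image itself, parameterized as a learnable frequency spectrum. The rough sketch below shows that idea; the spectrum shape, sigmoid squashing, and optimizer settings are assumptions for illustration, not aphantasia's actual parameterization.

```python
# Rough sketch of the "CLIP + FFT" idea: optimize a learnable frequency spectrum
# so that its inverse FFT, viewed as an image, matches a text prompt under CLIP.
import torch
import torch.nn.functional as F
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
clip_model, _ = clip.load("ViT-B/32", device=device)
clip_model.requires_grad_(False)

with torch.no_grad():
    tokens = clip.tokenize(["a misty forest at dawn"]).to(device)
    text_feat = F.normalize(clip_model.encode_text(tokens), dim=-1)

H = W = 224
# Learnable half-spectrum stored as (real, imag) pairs for torch.fft.irfft2.
spectrum = (0.01 * torch.randn(1, 3, H, W // 2 + 1, 2, device=device)).requires_grad_(True)
opt = torch.optim.Adam([spectrum], lr=0.05)

# CLIP's published preprocessing statistics.
mean = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

for step in range(200):
    img = torch.fft.irfft2(torch.view_as_complex(spectrum), s=(H, W))  # spectrum -> pixels
    img = torch.sigmoid(img)                                           # squash into [0, 1]
    img_feat = F.normalize(clip_model.encode_image((img - mean) / std), dim=-1)
    loss = -(img_feat * text_feat).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Optimizing in frequency space rather than raw pixels tends to spread gradient updates across the whole image, which is one reason FFT/DWT parameterizations are popular for generator-free CLIP guidance.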