awesome-clip-papers VS StyleCLIP

Compare awesome-clip-papers vs StyleCLIP and see how they differ.

awesome-clip-papers

The most impactful papers related to contrastive pretraining for multimodal models! (by jacobmarks)

StyleCLIP

Official Implementation for "StyleCLIP: Text-Driven Manipulation of StyleGAN Imagery" (ICCV 2021 Oral) (by orpatashnik)
                awesome-clip-papers    StyleCLIP
Mentions        1                      23
Stars           16                     3,915
Growth          -                      -
Activity        5.4                    0.0
Last commit     2 months ago           12 months ago
Language        Python                 HTML
License         -                      MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

awesome-clip-papers

Posts with mentions or reviews of awesome-clip-papers. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-13.
  • A History of CLIP Model Training Data Advances
    8 projects | dev.to | 13 Mar 2024
    For a comprehensive catalog of papers pushing the state of CLIP models forward, check out the Awesome CLIP Papers GitHub repository. Additionally, the Zero-shot Prediction Plugin for FiftyOne allows you to apply any of the OpenCLIP-compatible models to your own data.
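
    The zero-shot prediction these CLIP models enable boils down to a softmax over cosine similarities between an image embedding and a set of text-prompt embeddings. A minimal NumPy sketch of that scoring step, using toy vectors as stand-ins for real CLIP encoder outputs (the `temperature` value is an assumption, roughly matching CLIP's learned logit scale):

    ```python
    import numpy as np

    def zero_shot_probs(image_emb, text_embs, temperature=100.0):
        """Softmax over cosine similarities between one image and N text prompts."""
        img = image_emb / np.linalg.norm(image_emb)
        txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
        logits = temperature * (txt @ img)   # (N,) scaled cosine similarities
        e = np.exp(logits - logits.max())    # numerically stable softmax
        return e / e.sum()

    # Toy embeddings standing in for CLIP encoder outputs
    image = np.array([1.0, 0.0, 0.0])
    texts = np.array([[0.9, 0.1, 0.0],   # e.g. "a photo of a cat" (close to image)
                      [0.0, 1.0, 0.0]])  # e.g. "a photo of a dog"
    probs = zero_shot_probs(image, texts)
    ```

    With real models, the embeddings would come from an OpenCLIP image encoder and text encoder; the scoring logic above stays the same.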

StyleCLIP

Posts with mentions or reviews of StyleCLIP. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-13.
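
The latent-optimization variant of StyleCLIP edits an image by minimizing a CLIP distance between the generated image and a text prompt, plus an L2 locality term that keeps the edited latent close to the source (the full method also adds an identity loss). A toy NumPy sketch of that optimization loop, where a quadratic pull toward a hypothetical `w_text` stands in for the CLIP-loss gradient (all names and hyperparameters here are illustrative, not from the official code):

```python
import numpy as np

def edit_latent(w_source, w_text, lam=0.8, lr=0.1, steps=200):
    """Gradient descent on: surrogate_clip(w) + lam * ||w - w_source||^2."""
    w = w_source.copy()
    for _ in range(steps):
        grad_clip = 2.0 * (w - w_text)          # surrogate for the CLIP-loss gradient
        grad_reg = 2.0 * lam * (w - w_source)   # locality term keeps the edit small
        w -= lr * (grad_clip + grad_reg)
    return w

w_s = np.zeros(4)          # source latent (stand-in)
w_t = np.ones(4)           # direction the text prompt "pulls" toward (stand-in)
w_edit = edit_latent(w_s, w_t)
```

The edited latent lands between the source and the text direction, with `lam` controlling how conservative the edit is; in the real method the gradient comes from backpropagating CLIP's image-text distance through the StyleGAN generator.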

What are some alternatives?

When comparing awesome-clip-papers and StyleCLIP you can also consider the following projects:

encoder4editing - Official implementation of "Designing an Encoder for StyleGAN Image Manipulation" (SIGGRAPH 2021) https://arxiv.org/abs/2102.02766

compare_gan - Compare GAN code.

NVAE - The Official PyTorch Implementation of "NVAE: A Deep Hierarchical Variational Autoencoder" (NeurIPS 2020 spotlight paper)

stylegan2-pytorch - Simplest working implementation of StyleGAN2, a state-of-the-art generative adversarial network, in PyTorch. Enabling everyone to experience disentanglement

pixel2style2pixel - Official Implementation for "Encoding in Style: a StyleGAN Encoder for Image-to-Image Translation" (CVPR 2021) presenting the pixel2style2pixel (pSp) framework

alias-free-gan - Alias-Free GAN project website and code

tensor2tensor - Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.

Story2Hallucination

aphantasia - CLIP + FFT/DWT/RGB = text to image/video

CLIP-Style-Transfer - Doing style transfer with linguistic features using OpenAI's CLIP.

stylegan-xl - [SIGGRAPH'22] StyleGAN-XL: Scaling StyleGAN to Large Diverse Datasets

StyleCLIP - Using CLIP and StyleGAN to generate faces from prompts.