DALL-E VS deep-vector-quantization

Compare DALL-E vs deep-vector-quantization and see what their differences are.

DALL-E

PyTorch package for the discrete VAE used for DALL·E. (by openai)
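For orientation, here is a minimal sketch of encoding and decoding an image with the package's discrete VAE. It follows the pattern of the repo's published usage notebook; the checkpoint URLs and the helpers load_model, map_pixels, and unmap_pixels are recalled from the openai/DALL-E README and may have changed since.

```python
import torch
import torch.nn.functional as F
from dall_e import map_pixels, unmap_pixels, load_model

dev = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Pretrained dVAE encoder/decoder published alongside the repo.
enc = load_model("https://cdn.openai.com/dall-e/encoder.pkl", dev)
dec = load_model("https://cdn.openai.com/dall-e/decoder.pkl", dev)

# x: a [1, 3, 256, 256] float tensor in [0, 1]; map_pixels applies the
# pixel-range transform the dVAE expects.
x = map_pixels(torch.rand(1, 3, 256, 256, device=dev))

# Encode to a 32x32 grid of discrete token ids (one of 8192 codes each).
z = torch.argmax(enc(x), dim=1)

# Decode the one-hot token grid back into pixel statistics.
z = F.one_hot(z, num_classes=enc.vocab_size).permute(0, 3, 1, 2).float()
x_rec = unmap_pixels(torch.sigmoid(dec(z).float()[:, :3]))
```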
              DALL-E                                      deep-vector-quantization
Mentions      31                                          2
Stars         10,709                                      463
Growth        0.3%                                        -
Activity      0.0                                         0.0
Last commit   3 months ago                                over 2 years ago
Language      Python                                      Jupyter Notebook
License       GNU General Public License v3.0 or later    MIT License
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

DALL-E

Posts with mentions or reviews of DALL-E. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-25.

deep-vector-quantization

Posts with mentions or reviews of deep-vector-quantization. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-09-08.
  • [D] Intuition about "Discrete Latents" in paper about VQ-VAEs
    2 projects | /r/MachineLearning | 8 Sep 2021
    How the codebook is initialised makes a big difference: if more codes are in use at the start, they often stay in use, but it can be finicky to figure that out. I've found initialising the codes from a normal distribution with a small standard deviation (e.g. 0.01) to help (so codes lie on a hypersphere). Failing that, here is an example with k-means initialisation: https://github.com/karpathy/deep-vector-quantization . A sketch of this quantisation bottleneck appears after this list.
  • this is not overfitting but something else, right?
    2 projects | /r/deeplearning | 30 Mar 2021
    The context is that I am trying to learn a discrete vocabulary of latent codes, i.e. a discrete learnable embedding that quantizes the otherwise continuous latent outputs of the encoder, which are then used to reconstruct the input image via the decoder, cf. this code snippet. So the idea is not to generate random samples from noise but to learn an efficient codebook, i.e. a bottleneck that captures the essentials of the data set. The decoder then outputs a probability distribution for every pixel over the 256 possible values an 8-bit image can take. The KL term (assuming a uniform prior to encourage uniform use of all vocabulary entries) is currently weighted with 1.
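Both posts revolve around the same mechanism: a codebook of embedding vectors that snaps each continuous encoder output to its nearest code, with gradients passed straight through. Below is a minimal PyTorch sketch of that bottleneck; the names and shapes (VectorQuantizer, num_codes, dim) are hypothetical illustrations of the technique being discussed, not code from either repo.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Nearest-neighbour codebook with a straight-through gradient."""

    def __init__(self, num_codes: int = 512, dim: int = 64):
        super().__init__()
        # Small-std normal init, as suggested in the first post above, so
        # all codes start close together and are more likely to stay in use.
        self.codebook = nn.Parameter(torch.randn(num_codes, dim) * 0.01)

    def forward(self, z_e: torch.Tensor):
        # z_e: [batch, dim, H, W] continuous encoder outputs.
        b, d, h, w = z_e.shape
        flat = z_e.permute(0, 2, 3, 1).reshape(-1, d)       # [B*H*W, dim]

        # L2 distance to every code, then nearest-neighbour lookup.
        dists = torch.cdist(flat, self.codebook)             # [B*H*W, K]
        idx = dists.argmin(dim=1)                            # discrete latents
        z_q = self.codebook[idx].reshape(b, h, w, d).permute(0, 3, 1, 2)

        # VQ-VAE losses: pull codes toward encoder outputs and vice versa.
        codebook_loss = F.mse_loss(z_q, z_e.detach())
        commit_loss = F.mse_loss(z_e, z_q.detach())

        # Straight-through estimator: forward pass uses z_q, the backward
        # pass copies gradients to z_e as if quantization were identity.
        z_q = z_e + (z_q - z_e).detach()
        return z_q, idx.reshape(b, h, w), codebook_loss + 0.25 * commit_loss
```

A decoder can then take z_q and, as in the second post, output a 256-way categorical distribution per pixel. The k-means initialisation used in karpathy/deep-vector-quantization would simply replace the randn init above with cluster centres fitted to a first batch of encoder outputs.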

What are some alternatives?

When comparing DALL-E and deep-vector-quantization you can also consider the following projects:

dalle-2-preview

chainer-VQ-VAE - A Chainer implementation of VQ-VAE.

DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch

DALLE2-pytorch - Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch

big-sleep - A simple command line tool for text to image generation, using OpenAI's CLIP and a BigGAN. Technique was originally created by https://twitter.com/advadnoun

pixray

dalle-mini - DALL·E Mini - Generate images from a text prompt

gpt-3 - GPT-3: Language Models are Few-Shot Learners

DallEval - DALL-Eval: Probing the Reasoning Skills and Social Biases of Text-to-Image Generation Models (ICCV 2023)