CUDA memory error with hlky repo (4GB Nvidia) - any ideas?

This page summarizes the projects mentioned and recommended in the original post on /r/StableDiffusion

  • stable-diffusion

    Optimized Stable Diffusion modified to run on lower GPU VRAM (by basujindal)

  • I have managed to install and run the optimized Stable Diffusion by basujindal (https://github.com/basujindal/stable-diffusion), and it runs on my 4GB VRAM card, although I have to add "--precision full" to avoid getting a green square. (A rough memory estimate is sketched below the list.)

  • stable-diffusion

  • I wanted to try the hlky version (https://github.com/hlky/stable-diffusion) because of the WebUI and the integration with upscaling models; it is also supposed to have an option for low-VRAM optimization. To avoid getting a green square I have to add the flags "--precision full --no-half", but when I run a prompt, even at the smallest image size, I immediately get a CUDA out-of-memory error. Interestingly, without those flags there is no memory error (but, of course, the result is a green square). A diagnostic sketch for checking VRAM usage follows below.
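Why "--precision full --no-half" triggers the out-of-memory error is mostly arithmetic: Stable Diffusion v1 has roughly a billion parameters, so the weights alone take about 4 GiB in fp32 versus about 2 GiB in fp16, before any activations. The basujindal fork can get away with full precision on a 4 GB card because, as I understand it, it loads the model in parts rather than keeping everything on the GPU at once. A minimal back-of-the-envelope sketch (the parameter count is an approximation, not taken from either repo):

    # Rough VRAM needed just for the Stable Diffusion v1 weights.
    # ~1.07B parameters (UNet + text encoder + VAE) is an approximation.
    N_PARAMS = 1_070_000_000

    for label, bytes_per_param in (
        ("fp16 / autocast", 2),
        ("fp32 (--precision full --no-half)", 4),
    ):
        gib = N_PARAMS * bytes_per_param / 1024**3
        print(f"{label:35s} ~{gib:.1f} GiB for the weights alone")

    # fp16 / autocast                     ~2.0 GiB for the weights alone
    # fp32 (--precision full --no-half)   ~4.0 GiB for the weights alone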

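To confirm where the memory goes, a small diagnostic along these lines can be dropped around the model-loading code. This is a generic PyTorch sketch, not something either repo ships; the report_vram helper is hypothetical:

    import torch

    def report_vram(tag: str, device: int = 0) -> None:
        # Print total card memory vs. what PyTorch has reserved/allocated so far.
        total = torch.cuda.get_device_properties(device).total_memory
        reserved = torch.cuda.memory_reserved(device)
        allocated = torch.cuda.memory_allocated(device)
        print(f"[{tag}] total {total / 2**30:.2f} GiB | "
              f"reserved {reserved / 2**30:.2f} GiB | "
              f"allocated {allocated / 2**30:.2f} GiB")

    report_vram("before model load")
    # ... load the pipeline / run a prompt here ...
    report_vram("after model load")

On a 4 GB card, seeing "allocated" approach 4 GiB right after loading in full precision would confirm that the fp32 weights alone fill the card, which matches the immediate out-of-memory error at even the smallest image size.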