SD fine-tuning methods compared: a benchmark

This page summarizes the projects mentioned and recommended in the original post on /r/StableDiffusion

  • LyCORIS

    Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion.

  • You might want to expand the LoRA entry to include LoCon and LoHa (and also add a column for VRAM requirements). Think of LyCORIS as a more complete LoRA that also adapts the kernels in the convolutional units, rather than just the weights of the feed-forward network. Support is still quite limited, but it's starting to pick up steam: https://github.com/KohakuBlueleaf/LyCORIS
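    The distinction above can be sketched in a few lines of NumPy. This is a simplified illustration, not LyCORIS's actual implementation: plain LoRA factors a low-rank update for a 2-D linear weight, while a LoCon-style update does the same for a 4-D convolution kernel by flattening it to 2-D first. All shapes and names here are made up for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    rank = 4  # the low-rank bottleneck shared by both updates

    # LoRA: low-rank update for a linear (feed-forward) weight of shape (out, in).
    # Delta W = B @ A, with B (out x r) and A (r x in); B starts at zero so the
    # update contributes nothing before training.
    out_f, in_f = 64, 32
    A = rng.standard_normal((rank, in_f))
    B = np.zeros((out_f, rank))
    delta_linear = B @ A                       # shape (64, 32)

    # LoCon-style: apply the same factorization to a conv kernel of shape
    # (out_ch, in_ch, kh, kw) by flattening the last three dims, then reshape back.
    out_ch, in_ch, kh, kw = 16, 8, 3, 3
    A_c = rng.standard_normal((rank, in_ch * kh * kw))
    B_c = rng.standard_normal((out_ch, rank))
    delta_conv = (B_c @ A_c).reshape(out_ch, in_ch, kh, kw)

    print(delta_linear.shape)                  # (64, 32)
    print(delta_conv.shape)                    # (16, 8, 3, 3)
    print(np.linalg.matrix_rank(B_c @ A_c))    # at most `rank`
    ```

    The point is that the convolutional update uses the same rank-r bottleneck as LoRA; only the reshape step differs, which is why LoCon can cover layers plain LoRA skips.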

  • StableTuner

    (Discontinued) Finetuning SD in style.

  • sd-scripts

  • Kohya's GUI has a way to do it, or you can just use the script directly: https://github.com/kohya-ss/sd-scripts/blob/main/networks/extract_lora_from_models.py
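    The core idea behind that extraction script can be sketched as follows. This is a toy illustration of the underlying technique, not the sd-scripts code itself: take the difference between the tuned and base weights for each layer, then keep only the top singular components via a truncated SVD. All names and shapes here are invented for the demo.

    ```python
    import numpy as np

    def extract_lora(w_base, w_tuned, rank):
        """Approximate (w_tuned - w_base) with a rank-`rank` factorization:
        diff the weights, run an SVD, keep the top `rank` components."""
        delta = w_tuned - w_base
        u, s, vt = np.linalg.svd(delta, full_matrices=False)
        up = u[:, :rank] * s[:rank]   # fold singular values into the "up" matrix
        down = vt[:rank]
        return up, down               # (up @ down) approximates delta

    rng = np.random.default_rng(1)
    w_base = rng.standard_normal((64, 32))
    # Simulate a fine-tune whose net effect was a rank-2 change.
    w_tuned = w_base + rng.standard_normal((64, 2)) @ rng.standard_normal((2, 32))

    up, down = extract_lora(w_base, w_tuned, rank=2)
    err = np.linalg.norm((w_tuned - w_base) - up @ down)
    print(err)  # near zero: a rank-2 extraction recovers the whole update
    ```

    Real checkpoints are rarely exactly low-rank, so the script's rank (dim) parameter trades file size against how faithfully the extracted LoRA reproduces the fine-tune.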

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
