sd-extension-system-info alternatives
Similar projects and alternatives to sd-extension-system-info
- ComfyUI: A powerful and modular Stable Diffusion GUI with a graph/nodes interface.
- AITemplate: A Python framework which renders neural networks into high-performance CUDA/HIP C++ code, specialized for FP16 TensorCore (NVIDIA GPU) and MatrixCore (AMD GPU) inference.
- multidiffusion-upscaler-for-automatic1111: Tiled Diffusion and VAE optimization, licensed under CC BY-NC-SA 4.0.
- scribble-diffusion: Turn your rough sketch into a refined image using AI.
- voltaML-fast-stable-diffusion: Beautiful Stable Diffusion API and UI with support for AITemplate acceleration.
- stable-diffusion-webui-two-shot: Latent Couple extension (two-shot diffusion port), by ashen-sensored.
sd-extension-system-info reviews and mentions
- What GPU is everyone running?
  I'm using an RTX 2060 6GB; I can run a batch of 12 at 512x768 without problems. For a GPU I like the RTX 3060 12GB, RTX 4060 Ti 16GB, RTX 3090 24GB, or an RTX 4090 24GB; you can check some benchmarks here.
- Tips on improving it/s on an RTX 3080? Is 3-4 it/s normal for this card?
  Your it/s is low. It/s depends on other things, like sampler, resolution, and batch size... here is a list; you can install the extension and run a benchmark too.
- Self-reported GPUs and iterations/second based on the "vladmandic" data as of today
  It's done using https://github.com/vladmandic/sd-extension-system-info
  That's where the vlad stats can be useful: find your card, sort by it/s, and see what they're running and whether you're doing the same. https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html
- Honest question, how are people getting ~35-40 it/s on a 4090? Mine spits out 20 at most
  You're being given a lot of flat-out wrong information in this thread. The 4090 is a PCIe 4.0 card; PCIe 5 makes no difference at all. You can also see on the webui benchmarks that Windows systems are also hitting 35-40 it/s, so it's not an OS issue either.
- Optimal installation of PyTorch (2.0?) with --xformers or --opt-sdp-attention, on an RTX 4090 build for Automatic1111 that is current (not 1mo+ old)?
  If you check out the scores, ignore the top three right now, which I made yesterday when I hit 64 it/s with the new token merging functionality added a few days ago. I had it ramped up to the maximum throughput possible, well beyond what would be usable in practice, just because I enjoy the big numbers :P. My score in the fourth spot is probably more or less the realistic maximum for practical use at the moment (at least with PyTorch 2.0+ and a single GPU; I haven't tried multi-GPU setups or other libraries like TensorRT), and you can look further down the list to get a sense of the current averages for different GPUs. https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html
  (If you want to compare cards, use this site: https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html)
- Is it true that Stable Diffusion doesn't work well with AMD graphics cards?
  Check the numbers here: https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html
- How much better is the RTX 3080 (12GB) compared to the RTX 3060 (12GB) for Stable Diffusion?
  Note that the benchmark is done using the sd-extension-system-info extension from https://github.com/vladmandic/sd-extension-system-info.git, which tends to give higher numbers but uses a more standardized method of measuring performance.
  Roughly double the iteration speed. Personally, I would use https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html instead to compare configurations and to double-check my settings if I'm not getting the performance I expect.
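
Several of the mentions above quote an "it/s" (iterations per second) figure. As a rough illustration of what that metric measures (not the extension's actual benchmark procedure), here is a minimal Python sketch that times a per-step callable and reports iterations per second; run_step and dummy_step are stand-ins, not part of sd-extension-system-info.

```python
import time

def measure_it_per_s(run_step, steps=20, warmup=3):
    """Time `steps` calls to run_step() and return iterations per second.

    run_step stands in for one sampler/denoising iteration of whatever
    pipeline is being benchmarked; it is not part of the extension.
    """
    for _ in range(warmup):  # warm-up calls are excluded from timing
        run_step()
    start = time.perf_counter()
    for _ in range(steps):
        run_step()
    elapsed = time.perf_counter() - start
    return steps / elapsed

if __name__ == "__main__":
    def dummy_step():
        # Stand-in workload so the script runs anywhere;
        # replace with a real sampler step to get meaningful numbers.
        sum(i * i for i in range(100_000))

    print(f"{measure_it_per_s(dummy_step):.2f} it/s")
```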
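For the "find your card, sort by it/s" workflow described above, something like the following sketch could be used offline. It assumes you have exported the benchmark table to a CSV with hypothetical device and performance columns; the actual page's column names and export format may differ.

```python
import csv

def top_scores_for_device(path, device_substring, limit=10):
    """Return the fastest benchmark rows whose device name contains device_substring.

    Assumes a CSV export with hypothetical 'device' and 'performance' (it/s)
    columns; adjust the column names to match the actual export.
    """
    with open(path, newline="", encoding="utf-8") as f:
        rows = [row for row in csv.DictReader(f)
                if device_substring.lower() in row["device"].lower()]
    rows.sort(key=lambda row: float(row["performance"]), reverse=True)
    return rows[:limit]

# Example: compare two cards from the same export.
# for row in top_scores_for_device("benchmark.csv", "3080"):
#     print(row["device"], row["performance"])
# for row in top_scores_for_device("benchmark.csv", "3060"):
#     print(row["device"], row["performance"])
```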
Stats
vladmandic/sd-extension-system-info is an open-source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of sd-extension-system-info is Python.