You can self-host Stable Diffusion quite easily. It's easiest with Nvidia GPUs, but there is at least some support for AMD GPUs via ROCm and for Apple Silicon (M1) Macs. Stable Diffusion runs on 6-8 GB of VRAM or more, but you won't be able to run it on something like a Raspberry Pi. Two popular front ends are https://github.com/AUTOMATIC1111/stable-diffusion-webui and https://github.com/invoke-ai/InvokeAI. Models can be found on https://huggingface.co and https://civitai.com.