stable-diffusion-nvidia-docker vs dream-textures

| | stable-diffusion-nvidia-docker | dream-textures |
|---|---|---|
| Mentions | 6 | 72 |
| Stars | 348 | 7,620 |
| Growth | - | - |
| Activity | 7.0 | 5.8 |
| Last commit | 7 months ago | 24 days ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
stable-diffusion-nvidia-docker
- Does Stable Diffusion support NVLink?
- The guy behind the viral fake photo of the Pope in a puffy coat says using AI to make images of celebrities 'might be the line' — and calls for greater regulation
- Utilizing Multiple GPUs - Repurposing Mining Rig
- Can we start a list of Stable Diffusion 2.0 compatible UI's?
- Using several computers (GPUs) to speed up Stable Diffusion computation times
- Clustering GPUs for use with SD
  Any update? I just searched the Discord for `dataparallel` and found someone mention a link to a Docker image that appears to support multiple GPUs, but I have no experience with Docker and haven't seen where `dataparallel` is used in any of the files. I'm also searching through all the mentions of the K80.
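The multi-GPU approach the commenter is searching for can be sketched with PyTorch's `torch.nn.DataParallel`, which splits each batch across the visible GPUs. This is only a minimal illustration: `TinyUNetStub` is a hypothetical stand-in module, not the actual model from the Docker image, and the repository may use a different parallelism strategy.

```python
# Hedged sketch of torch.nn.DataParallel; TinyUNetStub is a stand-in for a
# diffusion U-Net, NOT the model shipped by stable-diffusion-nvidia-docker.
import torch
import torch.nn as nn

class TinyUNetStub(nn.Module):
    """Toy module with the same in/out latent shape, for illustration only."""
    def __init__(self):
        super().__init__()
        # 4-channel latents in, 4-channel latents out; spatial size preserved
        self.conv = nn.Conv2d(4, 4, kernel_size=3, padding=1)

    def forward(self, latents):
        return self.conv(latents)

model = TinyUNetStub()
if torch.cuda.device_count() > 1:
    # The batch dimension is split across GPUs; outputs are gathered on GPU 0
    model = nn.DataParallel(model)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

latents = torch.randn(8, 4, 64, 64)  # a batch of 8 latent images
out = model(latents.to(next(model.parameters()).device))
print(out.shape)
```

With two or more GPUs visible, each forward pass would process 4 of the 8 latents per device; on a single device the wrapper is skipped and the code runs unchanged.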
dream-textures
- Donut done with Artificial Intelligence and Blender
- Tell HN: The next generation of videogames will be great with midjourney
- After Diffusion, an After Effects extension integrating the SD web UI seamlessly
  I'm a long-time advanced AE user and would gladly give feedback on how I envision a nice workflow, if you want. I recently got into Dream Textures for Blender, which I think is a great reference for the direction things could be heading. It's still not viable for consistent video, but I love how they expose multiple ControlNets and their weights as animatable, for example. I also suggested exposing (animatable) prompt weights, which the author now plans for a future release. I see you have such things planned for this plugin as well, so big thumbs up!
- Resources for artists interested in using StableDiffusion as a tool?
  Dream Textures (SD for Blender) - https://github.com/carson-katri/dream-textures
- Using AI for 3d Game art
- ControlNet fully integrated with Blender using nodes!
  Yes, and it can also automatically bake the texture onto the original UV map instead of the projected UVs. The guide is here: https://github.com/carson-katri/dream-textures/wiki/Texture-Projection
- Using DALL-E 2 to create brick and water textures in Unity.
- 3D animation attempt using Sketchup screenshots and ControlNet
- Blender 3.5
- Master AI Texture Projection for Blender 3
  Dream Textures latest release: https://github.com/carson-katri/dream-textures/releases
What are some alternatives?
deforum-stable-diffusion
stable-diffusion-webui - Stable Diffusion web UI
nvidia-gpu-scheduler - NVIDIA GPU compute task scheduling utility
stable-diffusion - This version of CompVis/stable-diffusion features an interactive command-line script that combines text2img and img2img functionality in a "dream bot" style interface, a WebGUI, and multiple features and other enhancements. [Moved to: https://github.com/invoke-ai/InvokeAI]
image-super-resolution - 🔎 Super-scale your images and run experiments with Residual Dense and Adversarial Networks.
stable-diffusion - Optimized Stable Diffusion modified to run on lower GPU VRAM
Stable-Diffusion-2.0-CPU-or-GPU-Colab-Gradio - Config files for my GitHub profile.
DeepBump - Normal & height maps generation from single pictures
Mask_RCNN_Pytorch - Mask R-CNN for object detection and instance segmentation on Pytorch
stable-diffusion-webui - Stable Diffusion web UI [Moved to: https://github.com/Sygil-Dev/sygil-webui]
dream-factory - Multi-threaded GUI manager for mass creation of AI-generated art with support for multiple GPUs.
stable-diffusion