BlendArMocap vs dream-textures

| | BlendArMocap | dream-textures |
|---|---|---|
| Mentions | 3 | 72 |
| Stars | 854 | 7,599 |
| Growth | - | - |
| Activity | 4.6 | 5.8 |
| Last commit | 15 days ago | 13 days ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
- Stars - the number of stars that a project has on GitHub.
- Growth - month-over-month growth in stars.
- Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
BlendArMocap
- When you are too poor to buy motion capture and you need to make the animation yourself 🥲
  Maybe it was a Blender plugin like this: https://github.com/cgtinker/BlendArMocap
- Iron man cgi (showreel) made in blender
- How do I install Mediapipe and OpenCV?
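The Mediapipe/OpenCV question above usually comes down to installing the packages into Blender's bundled Python rather than the system interpreter, since that is the interpreter BlendArMocap runs under. A minimal sketch (the `install_dependencies` helper is an assumption for illustration, not part of the add-on; in recent Blender versions `sys.executable` points at the bundled interpreter when run from Blender's Scripting workspace):

```python
# Run from Blender's Python console or a text block in the
# Scripting workspace, so sys.executable is Blender's bundled Python.
import subprocess
import sys

def pip_install_cmd(packages):
    """Build a pip install command for the interpreter running this script."""
    return [sys.executable, "-m", "pip", "install", *packages]

def install_dependencies(packages=("mediapipe", "opencv-python")):
    """Bootstrap pip if needed, then install the given packages."""
    subprocess.check_call([sys.executable, "-m", "ensurepip", "--upgrade"])
    subprocess.check_call(pip_install_cmd(packages))
```

On some Linux/macOS installs Blender's Python directory is not user-writable, in which case running Blender with elevated permissions or adding `--user` to the pip command may be necessary.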
dream-textures
- Donut done with Artificial Intelligence and Blender
- Tell HN: The next generation of videogames will be great with midjourney
- After Diffusion, an After Effects extension integrating the SD web UI seamlessly.
  I'm a long-time advanced AE user and would gladly give feedback on how I envision a good workflow, if you want. I recently got into Dream Textures for Blender, which I think is a great reference for the direction things could be heading. It's still not viable for consistent video, but I love how it exposes multiple ControlNets and their weights as animatable parameters, for example. I also suggested exposing animatable prompt weights, which the author now plans for a future release. I see you have such things planned for this plugin as well, so big thumbs up!
- Resources for artists interested in using Stable Diffusion as a tool?
  Dream Textures (SD for Blender) - https://github.com/carson-katri/dream-textures
- Using AI for 3d Game art
- ControlNet fully integrated with Blender using nodes!
  Yes, and it can also automatically bake the texture onto the original UV map instead of the projected UVs. The guide is here: https://github.com/carson-katri/dream-textures/wiki/Texture-Projection
- Using DALL-E 2 to create brick and water textures in Unity.
- 3D animation attempt using Sketchup screenshots and ControlNet
- Blender 3.5
- Master AI Texture Projection for Blender 3
  Dream AI latest release: https://github.com/carson-katri/dream-textures/releases
What are some alternatives?
- fSpy-Blender - Official fSpy importer for Blender
- stable-diffusion-webui - Stable Diffusion web UI
- mixer - Add-on for real-time collaboration in Blender.
- stable-diffusion - This version of CompVis/stable-diffusion features an interactive command-line script that combines text2img and img2img functionality in a "dream bot" style interface, a WebGUI, and multiple features and other enhancements. [Moved to: https://github.com/invoke-ai/InvokeAI]
- MB-Lab - MB-Lab is a character creation tool for Blender 4.0 and above, based on ManuelBastioniLAB
- stable-diffusion - Optimized Stable Diffusion modified to run on lower GPU VRAM
- FreeFaceMoCap - Free Face Tracking Module for facial motion capture in Blender
- stable-diffusion-nvidia-docker - GPU-ready Dockerfile to run Stability.AI stable-diffusion model v2 with a simple web interface. Includes multi-GPU support.
- Magic-UV - Blender Add-on: Magic UV
- DeepBump - Normal & height maps generation from single pictures
- stable-diffusion-webui - Stable Diffusion web UI [Moved to: https://github.com/Sygil-Dev/sygil-webui]