DeepBump vs Material-Map-Generator

| | DeepBump | Material-Map-Generator |
|---|---|---|
| Mentions | 5 | 3 |
| Stars | 934 | 265 |
| Growth | - | - |
| Activity | 2.7 | 4.0 |
| Latest commit | about 1 year ago | 10 months ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DeepBump
-
Normal (Height) Maps
I stumbled upon this repository and want to try to improve the results. But I would need thousands of good pairs, and downloading a dozen packs here and there and drawing normals myself won't cut it.
-
Making a trailer for my book with midjourney. What do you think?
If you already have Photoshop, you can also generate a depth map in there by using the Depth Blur neural filter and enabling "output depth map only". But there are also lots of other free tools for creating depth maps, some of which you can use within Blender, like https://github.com/HugoTini/DeepBump
-
Stable Diffusion textures with Deepbump
Workflow: The Stable Diffusion base model works best in my experience. Use the prompt "____ texture". Once you find an image you are happy with:

1. Import it into Blender and install the DeepBump add-on: https://github.com/HugoTini/DeepBump.
2. In the Shader Editor, connect your texture to the Base Color input of the Principled BSDF, then open the sidebar next to the node graph with "N".
3. Open the DeepBump tab and click "Generate Normal Map". Once that is completed, select the new image node and click "Generate Height Map".
4. Connect the height map to a Displacement node, wire that into the Material Output, and make sure you are using Cycles.
5. In the material properties panel, set Settings > Surface > Displacement to "Displacement Only". You may need to subdivide your mesh.
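The node wiring in the steps above can also be scripted. The following is a minimal sketch using Blender's Python API (bpy), not something from either repository; it only runs inside Blender, and the `displacement_method` property location assumes Blender 3.x (in 4.1+ it moved to the material itself).

```python
# Hypothetical sketch of the wiring described above; runs only inside Blender.
import bpy

mat = bpy.data.materials.new("SD_Texture")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Stable Diffusion texture -> Principled BSDF base color
tex = nodes.new("ShaderNodeTexImage")
bsdf = nodes["Principled BSDF"]
links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])

# Height map (e.g. generated by DeepBump) -> Displacement -> Material Output
height = nodes.new("ShaderNodeTexImage")
disp = nodes.new("ShaderNodeDisplacement")
out = nodes["Material Output"]
links.new(height.outputs["Color"], disp.inputs["Height"])
links.new(disp.outputs["Displacement"], out.inputs["Displacement"])

# True displacement needs Cycles and the "Displacement Only" setting
bpy.context.scene.render.engine = 'CYCLES'
mat.cycles.displacement_method = 'DISPLACEMENT'  # Blender 3.x location
```

Assigning the two image textures (`tex.image` and `height.image`) is left out, since DeepBump creates the height-map image node for you when run from the sidebar.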
-
A Guide and Resources to Death Games - Made by the Community - Resources
-
AI Seamless Texture Generator Built-In to Blender
Material-Map-Generator
-
What do you folks think about using AI to generate normal/roughness/disp maps?
I've recently stumbled upon Joey Ballentine's Material-Map-Generator which does produce surprisingly good results.
-
With StableDiffusion I generated 1000 wall seamless textures 2k resolution with 2k normals + displacement maps, and that's the full guide of how I did it
THIS seems particularly interesting - it's a batch CLI tool, but it's also based on a GAN model running in Python, which means it could potentially be integrated end-to-end into a web UI for texturing workflows.
-
AI Seamless Texture Generator Built-In to Blender
I saw you were working on adding displacement and material maps to it, you may want to look into this instead of trying to re-train the model. https://github.com/JoeyBallentine/Material-Map-Generator
What are some alternatives?
dream-textures - Stable Diffusion built-in to Blender
Real-ESRGAN - Real-ESRGAN aims at developing Practical Algorithms for General Image/Video Restoration.
3d-ken-burns - an implementation of 3D Ken Burns Effect from a Single Image using PyTorch
stable-diffusion - This version of CompVis/stable-diffusion features an interactive command-line script that combines text2img and img2img functionality in a "dream bot" style interface, a WebGUI, and multiple features and other enhancements. [Moved to: https://github.com/invoke-ai/InvokeAI]
zpy - Synthetic data for computer vision. An open source toolkit using Blender and Python.
Cozy-Auto-Texture - A Blender add-on for generating free textures using the Stable Diffusion AI text to image model.