DeepBump vs zpy

| | DeepBump | zpy |
|---|---|---|
| Mentions | 5 | 9 |
| Stars | 934 | 288 |
| Growth | - | 0.0% |
| Activity | 2.7 | 0.0 |
| Last commit | about 1 year ago | over 2 years ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DeepBump
- Normal (Height) Maps
I stumbled upon this repository and want to try to improve the results. But I would need thousands of good pairs, and downloading a dozen packs here and there and drawing normal maps myself won't cut it.
- Making a trailer for my book with midjourney. What do you think?
If you already have Photoshop, you can also generate a depth map in there by using the Depth Blur neural filter and enabling "output depth map only". But there are also lots of other free tools for creating depth maps, some of which you can use within Blender, like https://github.com/HugoTini/DeepBump
- Stable Diffusion textures with Deepbump
Workflow: the Stable Diffusion base model works best in my experience. Use the prompt "____ texture". Once you find an image you are happy with:

1. Import it into Blender and install the DeepBump add-on: https://github.com/HugoTini/DeepBump.
2. Go to the shader view, add your texture to the Base Color input of the Principled BSDF, and open the panel on the side of the node graph with "n".
3. Open the DeepBump tab and click "generate normal map". Once that is complete, select the new image node and click "generate height map".
4. Connect the height map to the Displacement node and the Material Output, and make sure you are using Cycles.
5. In the material properties panel, set Settings -> Surface -> Displacement to "Displacement Only". You may need to subdivide your mesh.
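DeepBump itself infers normal maps with a neural network, but the underlying idea of deriving normals from a height field can be illustrated with the classic finite-difference approach. This is a minimal NumPy sketch for intuition only, not DeepBump's method; the function name and `strength` parameter are hypothetical.

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a height map (H, W) in [0, 1] to a tangent-space
    normal map (H, W, 3) in [0, 1], using finite differences.
    Illustrative sketch only -- not DeepBump's neural approach."""
    # Gradients of the height field (np.gradient returns d/dy, d/dx for 2D)
    dy, dx = np.gradient(height.astype(np.float64))
    # Surface normal points against the slope; scale XY by `strength`
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(nx)
    n = np.stack([nx, ny, nz], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Remap from [-1, 1] to [0, 1] for storage as an RGB image
    return (n + 1.0) * 0.5

# A flat height map yields the uniform "straight up" normal (0.5, 0.5, 1.0)
flat = np.zeros((4, 4))
print(height_to_normal(flat)[0, 0])
```

Slopes in the height map tilt the normal away from straight-up, which is exactly what the shader's displacement and bump machinery consumes.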
- A Guide and Resources to Death Games - Made by the Community - Resources
- AI Seamless Texture Generator Built-In to Blender
zpy
- Help finding a model that can identify small cubes
This is actually a great problem for synthetic data. You can make a synthetic dataset of colored cubes on different backgrounds using something like Unity or zpy (shameless plug). The synthetic dataset will be much larger and more varied than anything you could create and label by hand.
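The appeal of the approach described above is that labels come for free with the render. Neither Unity nor zpy is shown here; this is a minimal 2D NumPy sketch of the idea, with all names (`make_sample`, the color classes, the sizes) hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
COLORS = {0: (255, 0, 0), 1: (0, 255, 0), 2: (0, 0, 255)}  # red/green/blue classes

def make_sample(size=64, cube=12):
    """Render one synthetic image: a colored square on a random
    noise background. Returns (image, class_id, bbox)."""
    img = rng.integers(0, 256, (size, size, 3), dtype=np.uint8)  # noise background
    cls = int(rng.integers(0, len(COLORS)))
    x = int(rng.integers(0, size - cube))
    y = int(rng.integers(0, size - cube))
    img[y:y + cube, x:x + cube] = COLORS[cls]
    # The class and bounding box are known by construction -- no hand labeling
    return img, cls, (x, y, cube, cube)

# A tiny perfectly-labeled dataset; scale the count, sizes, and
# backgrounds freely, which is the point of synthetic data
dataset = [make_sample() for _ in range(100)]
```

A real pipeline would render 3D scenes with varied lighting and camera poses, but the principle is the same: because you generate the scene, every annotation is exact.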
- Why isn't this technology used more for AI projects?
I'm sort of in the same boat. I just discovered this very interesting package called zpy that's meant to help automate turning synthetic conditions into training data. I have some experience with Python and am fairly new to Blender, but I could probably get as far as making a good segmented dataset.
- Why Blender Is the Best Software for the 3D Workflow
Thanks for getting this far! If you’re interested in 3D and what it can do for synthetic data, check out our open-source data development toolkit zpy. Everything you need to generate and iterate synthetic data for computer vision is available for free. Your feedback, commits, and feature requests are invaluable as we continue to build a more robust set of tools for generating synthetic data. In the meantime, if you need our support with a particularly tricky problem, please reach out.
- Discussion on Medium
I work at Zumo Labs, where we create synthetic data for computer vision using zpy (github.com/ZumoLabs/zpy).
- How to 3D Scan an Object for Synthetic Data
The easiest way to get started with zpy (available on GitHub) is to follow the steps outlined in this short video tutorial series.
- Use Python and Blender to Make More Dynamic Training Data
Tools that make synthetic data generation easy are fundamentally changing the way machine learning work is done. Iterating and improving the dataset over the course of a project is more important to project success than iterating the model architecture. That's why we are releasing zpy, an open source synthetic data toolkit. All developers should have the option of working with dynamic data rather than static data.
- [P] Synthetic Data for CV with Python and Blender
https://github.com/ZumoLabs/zpy We just released our open source synthetic data toolkit built on top of Blender. Our package makes it easy to design and generate synthetic data for computer vision projects. Let us know what you think and what features you want us to focus on next!
- Using Blender for Computer Vision
What are some alternatives?
dream-textures - Stable Diffusion built-in to Blender
BlenderProc - A procedural Blender pipeline for photorealistic training image generation
Material-Map-Generator - Easily create AI generated Normal maps, Displacement maps, and Roughness maps.
bpycv - Computer vision utils for Blender (generate instance annotation, depth and 6D pose with one line of code)
3d-ken-burns - an implementation of 3D Ken Burns Effect from a Single Image using PyTorch
retopoflow - A suite of retopology tools for Blender
Cozy-Auto-Texture - A Blender add-on for generating free textures using the Stable Diffusion AI text to image model.
Meshroom - 3D Reconstruction Software
stable-diffusion - This version of CompVis/stable-diffusion features an interactive command-line script that combines text2img and img2img functionality in a "dream bot" style interface, a WebGUI, and multiple features and other enhancements. [Moved to: https://github.com/invoke-ai/InvokeAI]
metaflow - :rocket: Build and manage real-life ML, AI, and data science projects with ease!
TextRecognitionDataGenerator - A synthetic data generator for text recognition
BlenderNeRF - Easy NeRF synthetic dataset creation within Blender