Convert AI-generated 2D images to 3D models for use in Blender and Gravity Sketch.

Published 2023-06-07
This is an overview of how I use AI-generated images from Midjourney and Stable Diffusion to create 3D models that can be used for design reviews and refinement. I cover the 3D conversion process and how I manipulate and use that 3D data in Blender and Gravity Sketch to help create 3D forms for further design maturation.

Use this tool to estimate a depth map from a 2D image:

huggingface.co/spaces/shariqfarooq/ZoeDepth
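ZoeDepth gives you a per-pixel depth map; turning that into geometry you can edit in Blender is a separate back-projection step. A minimal sketch of that step (not from the video, and assuming a simple pinhole camera with a guessed field of view, since the true focal length of an AI-generated image is unknown):

```python
import numpy as np

def depth_to_points(depth, fov_deg=60.0):
    """Back-project a depth map (H x W, in metres) into an N x 3 point cloud
    using a pinhole-camera model. fov_deg is an assumed horizontal field of
    view; tweak it until the relief looks right for your image."""
    h, w = depth.shape
    f = (w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length in pixels
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - w / 2.0) * depth / f
    y = (v - h / 2.0) * depth / f
    # Flip Y and Z so +Y is up and the camera looks down -Z (Blender-style).
    return np.stack([x, -y, -depth], axis=-1).reshape(-1, 3)

# Sanity check: a constant depth map should give a flat plane of points.
pts = depth_to_points(np.full((4, 4), 2.0))
print(pts.shape)  # (16, 3)
```

The resulting points (or a grid mesh built over them) can be exported to OBJ/PLY and cleaned up in Blender, which is where the warped edges and missing back of the model get addressed.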

All Comments (21)
  • @shawnmoghadam6346
    You're good at figuring this stuff out. You're on the forefront! Please keep sharing.
  • @DevilSpawn1
pro tip: in Blender, if you set the FBX "Path Mode" to Copy and then click the little button next to it to embed textures, they should arrive properly in GS
  • @Yuki-rh1ie
    FINALLY somebody using gravity sketch!! thank you this is a game changer!!
  • @PRodi_
My new favourite channel - Looking forward to more Gravity Sketch♥ & AI smart optimisation solutions!
  • @rexrip1080
5:00 Adding an up-to-scale character model or a face to the scene would make adding depth and fixing the size much easier. There are a bunch of free character models; just check that the character size is around 185 cm (or the equivalent of the default character collision in the engine of your choice). Simply put the cut side of the helmet over the character's head so the eyes and eye holes meet, then mirror and scale what is needed. This way, you will use a single character model (size) for all of your projects, which would make it easier in the long run as well...
  • @Glowbox3D
    Very cool workflow. I think the clean up in regards to the warped texture bits and the missing back and bottom are really tough bits to work out. Using depth maps like this is a fun technology and tool. Thanks for the vid!
  • @Ytsssss364
jeez 4:42 a select lasso for vertices, never knew that, thanks - and thanks for the rest too, wow! amazing man!
  • @vitalis
    Keep it up. It’s awesome how you tried to implement new technologies. Don’t let negative comments hinder you. Subbed!
  • @LETHALxxTiTAN
This is amazing, never knew you could use this shortcut.
  • @pedroavelino6379
Thanks for the video mate! I am having trouble enabling the texture: when I click the top-right drop-down menu, I see fewer options than you do (Blender 3.5, btw). Is there a way to make all these options show up, or another way to turn the textures on? Thanks a lot!
  • @nicko_3d_art
this is actually the FIRST AI workflow I've seen that uses AI the way it's intended to be used. we need more content like this. I just wish the model's geo was a lot better
  • @jinxxpwnage
You can take this and import it into ZBrush, add some guides, and then ZRemesh a duplicate. This will clean it up in quads, but you'll lose your UVs. Now take the other duplicate, flip the UV map vertically, assign the texture from the model, and divide in order to convert to polypaint; you will be able to store the diffuse map in vertex color info instead. Bake that out of Substance Painter and work from there, overlapping more maps and normals as well as perfecting the geometry, all of this in Blender. I like this workflow, very cool
  • @shimonkey7800
Really interesting stuff, thx, I'll surely try this workflow
  • @iFaFo_0
    Thank you, this was incredible.
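The FBX tip from @DevilSpawn1 above corresponds to two settings on Blender's FBX exporter. As a config-style sketch in Blender's Python form (this only runs inside Blender, and the filepath is illustrative), the equivalent call would be roughly:

```python
import bpy  # Blender's built-in API; only available inside Blender

# Export with textures copied and packed into the .fbx so Gravity Sketch
# can find them ("Path Mode" = Copy plus the embed-textures toggle).
bpy.ops.export_scene.fbx(
    filepath="//helmet.fbx",   # illustrative path, relative to the .blend
    path_mode='COPY',
    embed_textures=True,
)
```

The same options are in the File > Export > FBX sidebar, so no scripting is required; this just shows which parameters the tip is toggling.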