JohnTigue opened 1 year ago
Inpainting of textures with Stable Diffusion in Blender 3D shows that SD-based texturing is already available via a Blender plug-in.
How to use Stable Diffusion in Blender
Turns out Blender's scripting language is Python, which explains how folks got Stable Diffusion into Blender so quickly. I'm guessing that ML tools packaged as Blender add-ons can probably be integrated with other tools as well.
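To make that concrete, here's a minimal sketch of what a Blender add-on operator calling an SD backend over HTTP could look like. It assumes an AUTOMATIC1111-style `/sdapi/v1/txt2img` route on localhost; the URL, payload shape, and class names are illustrative, not any particular plug-in's actual code.

```python
# Minimal sketch: a Blender operator that asks a Stable Diffusion HTTP API for a
# texture. Endpoint and payload are assumptions (AUTOMATIC1111-style web UI with --api).
import base64
import json
import urllib.request

import bpy

SD_URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"  # assumed local web UI endpoint


class SDTextureOperator(bpy.types.Operator):
    """Generate a texture from a text prompt via a Stable Diffusion HTTP API."""
    bl_idname = "texture.sd_generate"
    bl_label = "Generate SD Texture"

    prompt: bpy.props.StringProperty(name="Prompt", default="seamless rusted metal texture")

    def execute(self, context):
        payload = json.dumps({"prompt": self.prompt, "width": 512, "height": 512}).encode()
        req = urllib.request.Request(SD_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            result = json.loads(resp.read())
        # The web UI returns base64-encoded PNGs; write one out and load it as a Blender image.
        png_bytes = base64.b64decode(result["images"][0])
        path = bpy.path.abspath("//sd_texture.png")
        with open(path, "wb") as f:
            f.write(png_bytes)
        bpy.data.images.load(path, check_existing=True)
        return {"FINISHED"}


def register():
    bpy.utils.register_class(SDTextureOperator)


def unregister():
    bpy.utils.unregister_class(SDTextureOperator)
```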
Depth2Image: another EXTREMELY promising path is to use Blender to render a good first-pass 3D scene, then use the depth map from that render to steer SD. The following image is from a tweet which claims to illustrate the concept.
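A hedged sketch of that workflow, assuming the diffusers library and SD 2's depth-conditioned checkpoint; the file names (a Blender color pass and a Z pass saved as grayscale) are placeholders:

```python
# Sketch: feed a Blender-rendered color pass and its depth (Z) pass to Stable
# Diffusion's depth-conditioned model via diffusers. File names are assumptions.
import numpy as np
import torch
from PIL import Image
from diffusers import StableDiffusionDepth2ImgPipeline

pipe = StableDiffusionDepth2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-depth", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("blender_render.png").convert("RGB")                  # Blender color pass
depth = np.array(Image.open("blender_depth.png").convert("L"), np.float32)    # Z pass, grayscale
depth_map = torch.from_numpy(depth / 255.0)[None, ...]                        # (1, H, W)

result = pipe(
    prompt="ancient stone temple, moss, volumetric light",
    image=init_image,
    depth_map=depth_map,   # use Blender's depth instead of the pipeline's own estimate
    strength=0.7,
).images[0]
result.save("sd_depth_guided.png")
```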
This Bake Master Blender plug-in is not generative AI based (as far as I can tell) but it is relevant because it illustrates other ways folks are using software to accelerate texture creation: Blender Addon for Baking | Bake Master.
This too is not generative AI based, but it illustrates how to reduce generated asset quality in order to deal with the hardware limitations of a dev station: Blender Tutorial : Control Textures for Beginners.
Here's an example of using SD and Blender together: Using AI Art to make games (part 1 character creation).
Another Blender and SD plug-in example: "Dream Textures is an all-new installment of AI tools in blender that allows users to create endlessly, from textures to awesome artworks": Dream Textures - New Blender A.I Tool For All!.
This headline is clickbait-y but it is worth a read: Stable Diffusion can texture your entire scene automatically.
"Face Builder" is a Blend plug-in that maps a 2D image to a 3D mesh of a face. https://youtu.be/FKoy7bncHLs?t=75
Just a YouTube show-and-tell in which the speaker narrates what he thinks are 8 cool Blender projects: https://www.youtube.com/watch?v=4fJMhWSXaYk&ab_channel=InspirationTuts
Another DreamTextures intro: A.I In Blender is Awesome!
Stable Diffusion for rendering in Blender:
AI Render is an addon that allows you to use Stable Diffusion in Blender to generate images or animations using the combined influence of text prompts and your 3D scene. To add to its allure, it’s a no-brainer to set up, but it can also be used with a local installation of Stable Diffusion, which allows for unlimited generations at the cost of using your own hardware to process your AI images.
So, perhaps we can point it at our SD --api interface on AWS.
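Quick smoke test, as a sketch: assuming the AUTOMATIC1111-style web UI running with --api on the AWS box, something like this would confirm the endpoint is reachable before pointing AI Render's preferences at the same base URL. The hostname is a placeholder.

```python
# Sketch: verify a remote Stable Diffusion web UI (started with --api) answers,
# before configuring AI Render against it. Hostname is a placeholder.
import json
import urllib.request

BASE_URL = "http://ec2-xx-xx-xx-xx.compute.amazonaws.com:7860"  # placeholder AWS host


def api_is_up(base_url: str) -> bool:
    """Return True if the web UI's REST API answers on /sdapi/v1/sd-models."""
    try:
        with urllib.request.urlopen(f"{base_url}/sdapi/v1/sd-models", timeout=5) as resp:
            models = json.loads(resp.read())
            print(f"API reachable, {len(models)} checkpoint(s) available")
            return True
    except OSError as exc:
        print(f"API not reachable: {exc}")
        return False


if __name__ == "__main__":
    api_is_up(BASE_URL)
```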
Via Reddit:
I've been working on a library for Stable Diffusion seamless textures to use in games. I made some updates to the site like 3D texture preview, faster searching, and login support :) Check it out at https://pixela.ai/
2D => 3D Blender plug-in shows how to take an SD image, have software estimate the 3D environment, and bring the result into Blender.
Presenter seems to be an experienced VFX dev: FREE Lifechanging VFX Plugin Comes to BLENDER! Stable Diffusion Ai Render. Interesting in that he shows how SD can be used in the Blender workflow alongside other existing tools.
In Dream Textures, this looks to be the place where I need to hack in: https://github.com/carson-katri/dream-textures/blob/0.1.0/preferences.py#L253.
License is GPL-3.0, so we'd HAVE to publish the hack for interfacing to Hypnowerk :)
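Purely hypothetical sketch of the kind of shim a GPL-3.0 fork might route generation through to reach Hypnowerk instead of a local model. The endpoint, payload, and response format are all invented placeholders and don't reflect Dream Textures' actual internals.

```python
# Hypothetical shim for routing generation requests to a remote Hypnowerk backend.
# Endpoint, payload, and response shape are invented placeholders.
import base64
import json
import urllib.request

HYPNOWERK_URL = "https://hypnowerk.example.com/generate"  # hypothetical endpoint


def remote_generate(prompt: str, width: int = 512, height: int = 512) -> bytes:
    """Send a prompt to the remote backend and return PNG bytes."""
    payload = json.dumps({"prompt": prompt, "width": width, "height": height}).encode()
    req = urllib.request.Request(HYPNOWERK_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return base64.b64decode(body["image"])  # assumed base64 PNG in the response
```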
Blender and SD (especially with ControlNet) are a great tool pairing; they work together synergistically. Traditional 3D character rigging (especially hands and feet) can be done in Blender. ControlNet then works with the exported images and depth map to "grok" the character's pose and give it the SD treatment. The rainbow stick-figure pose image that ControlNet reads becomes the bridge between traditional 3D character rigging and SD.
We are tracking ControlNet in #77.
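A hedged sketch of that hand-off, assuming the diffusers library and the OpenPose-conditioned ControlNet checkpoint; the pose image exported from the Blender rig and the file names are placeholders:

```python
# Sketch of the Blender -> ControlNet hand-off: an OpenPose-style pose image
# (the "rainbow stick figure") exported from the rigged character conditions
# Stable Diffusion via a ControlNet checkpoint. File names are assumptions.
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

pose_image = Image.open("blender_pose_openpose.png")  # pose skeleton rendered from the rig

result = pipe(
    prompt="armored knight, cinematic lighting, full body",
    image=pose_image,      # ControlNet conditioning image
    num_inference_steps=30,
).images[0]
result.save("controlnet_character.png")
```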
An interesting example of Blender as tech driving back-office and outsourcing strategy, with business-strategy implications.
Barnstorm VFX is about a decade old. They have employees and a network of freelancers. Over time they came to use Blender as their main tool for interop between geographically distributed creatives. Free is nice all around, and it has led to lots of folks knowing the tool. Sounds like Barnstorm is a dominant player in the VFX world: How This Big VFX Studio Started Using Blender | Barnstorm VFX.
This is a hub issue for all things related to Blender.
Blender seems to be a tool that folks like Karma use to generate 3D assets. Blender is a great example of an open source project and community, so I feel comfortable adopting it, especially given the evidence that it is already in use by at least one of the creatives currently involved with Verses production.
See also DMT Meshes (#48), which is a 3D generation plug-in for Blender.