lllyasviel / ControlNet

TEXTurePaper + ControlNET (?) #413

Open yosun opened 1 year ago

yosun commented 1 year ago
  1. Draw some rough lines on a base texture image for the 3D model submitted to https://github.com/TEXTurePaper
  2. ControlNet then controls the texture generation based on the prompt and the base texture?

geroldmeisinger commented 12 months ago

look here: https://github.com/carson-katri/dream-textures

yosun commented 12 months ago

It's not quite dream-textures. Is there a way to use prompting + an image/concept sketch (similar to ControlNet user inputs) to control the texture generated by TEXTurePaper?

geroldmeisinger commented 12 months ago

I don't know TEXTurePaper. If it is based on Stable Diffusion, I guess you could intercept the generation by adding a ControlNet pipeline yourself. If it uses its own diffusion architecture, I guess you have to re-implement ControlNet (and potentially retrain the sketch model). Maybe you have more luck with texture generators which are already based on Stable Diffusion: https://www.reddit.com/r/StableDiffusion/comments/115q7bq/workflow_uv_texture_map_generation_with or maybe some sub-project here: https://github.com/threestudio-project/threestudio
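
For reference, here is a minimal sketch of what "adding a ControlNet pipeline yourself" could look like with diffusers and the scribble ControlNet. The model IDs are the standard SD 1.5 ones; the sketch file name and the prompt are just placeholders:

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline, UniPCMultistepScheduler
from diffusers.utils import load_image

# Scribble-conditioned ControlNet on top of a standard SD 1.5 checkpoint
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-scribble", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)

# Placeholder: your rough lines drawn over the model's UV layout,
# as white strokes on a black background (the format the scribble model expects)
sketch = load_image("rough_uv_sketch.png")

texture = pipe(
    "weathered bronze armor plates, seamless texture, highly detailed",  # placeholder prompt
    image=sketch,
    num_inference_steps=20,
).images[0]
texture.save("generated_texture.png")
```

This only covers the plain 2D generation step though; hooking that conditioning into TEXTurePaper's own pipeline is the part you would have to figure out yourself.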