Open yosun opened 1 year ago
It's not quite dream-textures. Is there a way to use prompting + an image/concept sketch (similar to ControlNet user inputs) to control the texture generated by TEXTurePaper?
I don't know TEXTurePaper. If it is based on Stable Diffusion, I guess you could intercept the generation by adding a ControlNet pipeline yourself. If it uses its own diffusion architecture, I guess you would have to re-implement ControlNet (and potentially retrain the sketch model). Maybe you'll have more luck with texture generators that are already based on Stable Diffusion: https://www.reddit.com/r/StableDiffusion/comments/115q7bq/workflow_uv_texture_map_generation_with — or maybe some sub-project here: https://github.com/threestudio-project/threestudio
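For the Stable-Diffusion case, here is a minimal sketch of what "adding a ControlNet pipeline yourself" could look like with the `diffusers` library. Assumptions: the texture generator exposes (or can be replaced by) a standard SD pipeline, the checkpoint names below are just common public ones, and `concept_sketch.png` is a hypothetical input file. The small `sketch_to_control` helper is my own illustration of turning a grayscale concept sketch into a scribble-style conditioning image; it is not part of TEXTurePaper.

```python
import numpy as np


def sketch_to_control(gray, threshold=0.1):
    """Turn a grayscale sketch (2-D array, values in [0, 1]) into a binary
    edge map usable as a scribble-style ControlNet conditioning image.
    This is a simple gradient-magnitude stand-in for a real edge detector."""
    gy, gx = np.gradient(gray.astype(np.float32))
    magnitude = np.hypot(gx, gy)
    return (magnitude > threshold).astype(np.float32)


if __name__ == "__main__":
    # Heavy part: needs `pip install torch diffusers transformers` and
    # downloads several GB of weights on first run. Checkpoint names are
    # common public ones, not anything specific to TEXTurePaper.
    import torch
    from PIL import Image
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    controlnet = ControlNetModel.from_pretrained(
        "lllyasviel/sd-controlnet-scribble", torch_dtype=torch.float16
    )
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        controlnet=controlnet,
        torch_dtype=torch.float16,
    ).to("cuda")

    # Hypothetical input sketch; convert to a conditioning image.
    gray = np.asarray(Image.open("concept_sketch.png").convert("L")) / 255.0
    control = Image.fromarray(
        (sketch_to_control(gray) * 255).astype(np.uint8)
    ).convert("RGB")

    image = pipe("weathered bronze texture, seamless", image=control).images[0]
    image.save("texture.png")
```

The idea is that the prompt steers content while the sketch-derived edge map constrains layout; whether this can be wired into TEXTurePaper's own rendering loop depends on how its pipeline is structured.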