wty-ustc / HairCLIP

[CVPR 2022] HairCLIP: Design Your Hair by Text and Reference Image
GNU Lesser General Public License v2.1

How to run predict.py #19

Closed: panghongwei17 closed this issue 2 years ago

panghongwei17 commented 2 years ago

Hi, thank you for your work.

In line 11, what does "from cog import BasePredictor, Path, Input" mean? My editor shows a red squiggly line under it.

chenxwh commented 2 years ago

Hi, Cog is the open-source tool used to build the Replicate demo; please check the documentation for installation details or this PR for reference. With cog.yaml and predict.py, you can use Cog to build your own web demo and Docker image. However, for HairCLIP we have already implemented it here with customised image input, so you can try the web demo directly without needing to run predict.py :) Hope it helps.
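For reference, a Cog predictor roughly looks like the sketch below. This is not HairCLIP's actual predict.py: the input names are illustrative and the "editing" step is faked by copying the input image, so it only shows the structure Cog expects.

```python
# Minimal Cog predictor sketch (not HairCLIP's real predict.py).
import shutil

from cog import BasePredictor, Input, Path


class Predictor(BasePredictor):
    def setup(self):
        # A real predictor would load the model weights here, once per
        # container start, so predict() stays fast.
        pass

    def predict(
        self,
        image: Path = Input(description="Input face image"),
        hairstyle_description: str = Input(
            description="Text describing the target hairstyle",
            default="a bob cut hairstyle",
        ),
    ) -> Path:
        # A real predictor would run the model on `image` conditioned on
        # `hairstyle_description` and save the result; here we just copy
        # the input so the sketch stays self-contained.
        out_path = Path("/tmp/output.png")
        shutil.copy(str(image), str(out_path))
        return out_path
```

With a cog.yaml listing the dependencies next to this file, `cog predict -i image=@face.png` runs it locally and `cog build` produces the Docker image.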

panghongwei17 commented 2 years ago

Hi, thank you for your answer, I see. I also want to ask: does this work support taking an original hairstyle image and a reference hairstyle image as input instead of text, and then returning the edited result? If so, where should I modify this part? In inference.py I see that the test data is encoded into latents. Thank you again for your work!

chenxwh commented 2 years ago

The predict function in predict.py accepts any inputs and any Output object you define (see this doc for more details); feel free to play with it :)
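For example, the signature could be changed so predict takes a reference hairstyle image instead of a text prompt. The sketch below is only a hypothetical outline: the body is left as comments because the concrete latent encoding and mapper calls depend on the HairCLIP code.

```python
# Hypothetical predict() taking a reference hairstyle image instead of text.
from cog import BasePredictor, Input, Path


class Predictor(BasePredictor):
    def predict(
        self,
        image: Path = Input(description="Original face image"),
        reference: Path = Input(
            description="Reference image providing the target hairstyle"
        ),
    ) -> Path:
        # 1. Invert both images into latent codes (inference.py already does
        #    this for the test data).
        # 2. Condition the hairstyle mapper on the reference latent instead
        #    of a text embedding.
        # 3. Decode the edited latent, save the image, and return its path.
        raise NotImplementedError("plug the HairCLIP encoding/mapping calls in here")
```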

panghongwei17 commented 2 years ago

OK, thank you for your patience and guidance. Have a nice day!