Tencent / HunyuanDiT

Hunyuan-DiT : A Powerful Multi-Resolution Diffusion Transformer with Fine-Grained Chinese Understanding
https://dit.hunyuan.tencent.com/

Error! Cannot run hydit_app.py #147

Open PiPiNam opened 4 months ago

PiPiNam commented 4 months ago

I just downloaded the Docker image you provided (the CUDA 11 version), and I found that the torch installed inside it is the CPU-only build.
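For reference, a quick way to confirm whether the container ships a CPU-only torch build (a minimal diagnostic sketch; the version strings shown in the comments are only examples, not the actual output):

```python
import torch

# A "+cpu" suffix in the version string, or torch.version.cuda being None,
# indicates a CPU-only build of torch.
print(torch.__version__)          # e.g. "2.0.1+cpu" vs. "2.0.1+cu118"
print(torch.version.cuda)         # None for CPU-only builds
print(torch.cuda.is_available())  # False if no usable CUDA device/build
```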

Running `python app/hydit_app.py` inside the container produces:

```
root@docker-desktop:/workspace/HunyuanDiT# python app/hydit_app.py
2024-07-07 10:43:54.296 | INFO | hydit.inference:init:160 - Got text-to-image model root path: ckpts/t2i
2024-07-07 10:43:54.297 | INFO | hydit.inference:init:169 - Loading CLIP Text Encoder...
2024-07-07 10:43:57.131 | INFO | hydit.inference:init:172 - Loading CLIP Text Encoder finished
2024-07-07 10:43:57.131 | INFO | hydit.inference:init:175 - Loading CLIP Tokenizer...
2024-07-07 10:43:57.284 | INFO | hydit.inference:init:178 - Loading CLIP Tokenizer finished
2024-07-07 10:43:57.284 | INFO | hydit.inference:init:181 - Loading T5 Text Encoder and T5 Tokenizer...
You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the legacy (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set legacy=False. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
/opt/conda/lib/python3.8/site-packages/transformers/convert_slow_tokenizer.py:550: UserWarning: The sentencepiece tokenizer that you are converting to a fast tokenizer uses the byte fallback option which is not implemented in the fast tokenizers. In practice this means that the fast version of the tokenizer can produce unknown tokens whereas the sentencepiece version would have converted these unknown tokens into a sequence of byte tokens matching the original piece of text.
  warnings.warn(
You are using a model of type mt5 to instantiate a model of type t5. This is not supported for all configurations of models and can yield errors.
Killed
```

I don't know how to resolve this; could you help me with the problem? Thanks!
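Note that the log ends with `Killed` rather than a Python traceback, which on Linux usually means the kernel's OOM killer terminated the process, here apparently while the T5 text encoder was being loaded. A minimal pre-flight sketch (assuming `psutil` is available in the container; the check itself is not part of the original report) that prints available system RAM and visible GPUs before launching the app:

```python
import psutil
import torch

# Rough pre-flight check before running `python app/hydit_app.py`.
# It only reports resources; any memory thresholds you compare against
# are your own assumptions, not official requirements.
ram_gb = psutil.virtual_memory().available / 1024**3
print(f"Available system RAM: {ram_gb:.1f} GiB")

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device visible; the installed torch build may be CPU-only.")
```

If the CPU-only torch build is confirmed, everything (including the T5 encoder) is loaded into system RAM, which makes an OOM kill much more likely; with Docker Desktop in particular, the memory limit assigned to the VM may also need to be raised.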

zobinimm commented 2 months ago

Has this problem been resolved?