Tencent / HunyuanDiT

Hunyuan-DiT: A Powerful Multi-Resolution Diffusion Transformer with Fine-Grained Chinese Understanding
https://dit.hunyuan.tencent.com/

Why does enabling flash acceleration produce "The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results. Setting `pad_token_id` to `eos_token_id`:2 for open-end generation."? #29

Open · White-Friday opened 1 month ago

White-Friday commented 1 month ago

Thanks for your error report and we appreciate it a lot.

Checklist

  1. I have searched related issues but cannot get the expected help.
  2. The bug has not been fixed in the latest version.

Describe the bug

A clear and concise description of what the bug is.

Reproduction

  1. What command or script did you run?
A placeholder for the command.
  2. Did you make any modifications to the code or config? Did you understand what you modified?
  3. What dataset did you use?

Environment

  1. Please run `python utils/collect_env.py` to collect the necessary environment information and paste it here.
  2. You may add additional information that may be helpful for locating the problem, such as
    • How you installed PyTorch [e.g., pip, conda, source]
    • Other environment variables that may be related (such as $PATH, $LD_LIBRARY_PATH, $PYTHONPATH, etc.)

Error traceback

If applicable, paste the error traceback here.

A placeholder for the traceback.

Bug fix

If you have already identified the reason, you can provide the information here. If you are willing to create a PR to fix it, please also leave a comment here and that would be much appreciated!

xhinker commented 1 month ago

Seeing the same error.

Jarvis73 commented 1 month ago

Hi @White-Friday, @xhinker, can you run the environment detection script provided in the repo to give us some basic information about your environment?

python utils/collect_env.py
python -m pip list

This information can help us better troubleshoot the problem. Thanks.

ziyaxuanyi commented 1 month ago

> Hi @White-Friday, @xhinker, can you run the environment detection script provided in the repo to give us some basic information about your environment?
>
> python utils/collect_env.py
> python -m pip list
>
> This information can help us better troubleshoot the problem. Thanks.

Even without flash attention acceleration this message is printed, yet the image is generated normally and looks fine. It seems the message is just a warning and has no actual impact?