ShihaoZhaoZSH / LaVi-Bridge

[ECCV 2024] Bridging Different Language Models and Generative Vision Models for Text-to-Image Generation
MIT License
308 stars · 20 forks

Error run inference t5_unet #12

Open NguyenNhoTrung opened 5 months ago

NguyenNhoTrung commented 5 months ago

I get an error when I run the code. How can I fix it?

(lavi-bridge) user@hg-ai-02:/hdd/trungnn/LaVi-Bridge/test$ bash run.sh
/home/user/miniconda3/envs/lavi-bridge/lib/python3.10/site-packages/transformers/models/t5/tokenization_t5_fast.py:160: FutureWarning: This tokenizer was incorrectly instantiated with a model max length of 512 which will be corrected in Transformers v5. For now, this behavior is kept to avoid breaking backwards compatibility when padding/encoding with truncation is True.

Also, I get an error when creating the conda environment from LaVi-Bridge/environment.yaml: "The conflict is caused by: The user requested huggingface-hub==0.17.3; diffusers 0.24.0 depends on huggingface-hub>=0.19.4"
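The conda/pip conflict above is a version-pin mismatch: environment.yaml pins huggingface-hub to 0.17.3, while diffusers 0.24.0 requires at least 0.19.4. A minimal sketch (not pip's actual resolver) of why the check fails:

```python
# Sketch of the version check pip performs; the parse() helper here is
# illustrative, not part of pip or the LaVi-Bridge codebase.
def parse(version):
    """Turn '0.17.3' into a comparable tuple (0, 17, 3)."""
    return tuple(int(part) for part in version.split("."))

pinned = "0.17.3"        # pin from environment.yaml
required_min = "0.19.4"  # floor required by diffusers 0.24.0

satisfied = parse(pinned) >= parse(required_min)
print(satisfied)  # False -> pip reports the conflict
```

Raising the pin in environment.yaml to huggingface-hub>=0.19.4 (or dropping the pin so pip can resolve it) should let the environment build, assuming no other package in the file requires the older version.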

ShihaoZhaoZSH commented 5 months ago

The error about LoRA may be due to the _find_modules function in lora.py not finding target classes like nn.Linear or nn.Conv2d. This could be caused by the classes of the linear or convolutional layers defined in T5 or U-Net. For example, if the peft package is not installed, the convolutional layers in U-Net will be of the LoRACompatibleConv class instead of nn.Conv2d. As a result, _find_modules fails to find the nn.Conv2d class for LoRA integration in U-Net. The code at line 724 or line 749 in lora.py is then never executed, leading to the error "UnboundLocalError: local variable '_tmp' referenced before assignment". You can check whether the peft package is installed, or debug based on the reasons mentioned above.

NguyenNhoTrung commented 5 months ago

I have already fixed the code. Thank you so much.

Minhluu2911 commented 4 months ago

@NguyenNhoTrung how did you fix the "UnboundLocalError: local variable '_tmp' referenced before assignment" problem?

bo-feng commented 3 months ago

> The error about LoRA may be due to the _find_modules function in the lora.py not finding target classes like nn.Linear or nn.Conv2d. […]

Thanks for your work. I have the same issue, but the problem persists even after installing the peft package. I found that the peft package is not being called.