Pine-sha closed this issue 1 week ago.
I fixed the bug by uninstalling PEFT:

pip uninstall peft

because of this dispatch in /root/.conda/envs/xxx/lib/python3.8/site-packages/diffusers/models/transformer_2d.py:
conv_cls = nn.Conv2d if USE_PEFT_BACKEND else LoRACompatibleConv
linear_cls = nn.Linear if USE_PEFT_BACKEND else LoRACompatibleLinear
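When peft is installed, diffusers sets USE_PEFT_BACKEND to True and picks plain nn.Linear, which has no set_lora_layer method; the training script then crashes with the AttributeError shown in the traceback. Uninstalling peft flips the flag and restores the LoRACompatible* wrappers. The dispatch can be sketched without diffusers installed; the class bodies below are stand-ins, not the real torch/diffusers types:

```python
class Linear:
    """Stand-in for torch.nn.Linear: no set_lora_layer method."""
    pass

class LoRACompatibleLinear:
    """Stand-in for diffusers' wrapper, which does expose set_lora_layer."""
    def set_lora_layer(self, lora_layer):
        self.lora_layer = lora_layer

def pick_linear_cls(use_peft_backend):
    # Mirrors the dispatch line in transformer_2d.py.
    return Linear if use_peft_backend else LoRACompatibleLinear

# peft installed -> USE_PEFT_BACKEND is True -> plain Linear, no set_lora_layer
assert not hasattr(pick_linear_cls(True)(), "set_lora_layer")
# peft uninstalled -> LoRACompatibleLinear, which the training script expects
assert hasattr(pick_linear_cls(False)(), "set_lora_layer")
```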
# 2. novel content personalization
export MODEL_NAME="./res_gaussion/colmap_doll/scene_personalization/checkpoint-1000"
export OUTPUT_DIR="./res_gaussion/colmap_doll/content_personalization"
export image_root=./res_gaussion/colmap_doll/sample_views/rgb
python personalization/content_personalization.py \
  --pretrained_model_name_or_path $MODEL_NAME \
  --enable_xformers_memory_efficient_attention \
  --instance_data_dir $image_root \
  --instance_data_dir "./data/object/sunglasses1" \
  --class_data_dir './res_gaussion/colmap_doll/class_samples' \
  --instance_prompt 'a photo of a plush toy' \
  --instance_prompt 'a photo of a sunglasses' \
  --class_prompt 'a photo of a plush toy' \
  --validation_prompt "a photo of a plush toy wearing a sunglasses" \
  --output_dir $OUTPUT_DIR \
  --scene_frequency 200 \
  --validation_images $image_root/1.375-30.png \
    $image_root/1.3_75_0.png \
    $image_root/1.3_75_30.png \
  --max_train_steps=500
You are using a model of type clip_text_model to instantiate a model of type . This is not supported for all configurations of models and can yield errors.
{'variance_type'} was not found in config. Values will be initialized to default values.
Some weights of the model checkpoint were not used when initializing UNet2DConditionModel: ['class_embedding.module.linear_1.bias', 'class_embedding.module.linear_1.weight', 'class_embedding.module.linear_2.bias', 'class_embedding.module.linear_2.weight']
-- unet: xFormers memory efficient attention is enabled.
-- unet attn_processor_name = down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor
Traceback (most recent call last):
  File "personalization/content_personalization.py", line 1594, in <module>
    main(args)
  File "personalization/content_personalization.py", line 1064, in main
    attn_module.to_q.set_lora_layer(
  File "/home/sha/miniforge3/envs/TIP-E/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1695, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'Linear' object has no attribute 'set_lora_layer'
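If uninstalling peft is not an option, another way out is to guard the call site in content_personalization.py. Only the attribute name set_lora_layer comes from the traceback; the helper and the stand-in classes below are a hypothetical sketch, not code from the repository:

```python
def set_lora_if_supported(module, lora_layer):
    """Attach a LoRA layer only when the module exposes set_lora_layer;
    plain nn.Linear (used when diffusers' PEFT backend is active) does not."""
    if hasattr(module, "set_lora_layer"):
        module.set_lora_layer(lora_layer)
        return True
    return False

# Demonstration with stand-in classes (not the real torch/diffusers types):
class PlainLinear:
    pass

class LoRACompatibleLinear:
    def set_lora_layer(self, lora_layer):
        self.lora_layer = lora_layer

assert set_lora_if_supported(LoRACompatibleLinear(), "dummy") is True
assert set_lora_if_supported(PlainLinear(), "dummy") is False
```

With such a guard the script would skip (or explicitly handle) modules that lack the wrapper instead of crashing mid-setup.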