[Closed] EladDvash closed this issue 3 months ago.
When looking at the code, I noticed that the LoRA config is there, but you never initialize the PEFT model: line 368 in model.py is commented out:
```python
self.text_lora_config = peft.LoraConfig(target_modules=r".*\.cross_attention\.(q_proj|v_proj)", r=32, lora_alpha=64)
# self.peft_model.lm.transformer = peft.get_peft_model(self.peft_model.lm.transformer, self.text_lora_config)
```
When you print out the trainable layers, no LoRA layers appear. Is this by design or a mistake?
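For reference, one quick way to confirm this (a hedged sketch; `model` stands in for whatever model.py actually constructs) is to list the trainable parameter names and look for LoRA modules:

```python
# Hypothetical check; `model` is a stand-in for the model built in model.py.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(any("lora" in n for n in trainable))  # False: no LoRA adapters were attached
```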
Thanks for this issue. The line was commented out for an ablation study, and I believe it should be uncommented. I have fixed this problem.
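For completeness, here is a minimal standalone sketch of what the uncommented path does. The OPT backbone and target module names below are illustrative stand-ins, not the repo's actual model, which wraps `self.peft_model.lm.transformer` and targets the cross-attention regex shown above:

```python
import peft
from transformers import AutoModelForCausalLM

# Illustrative backbone; the repo wraps `self.peft_model.lm.transformer` instead.
lm = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Same r/lora_alpha as the repo's config; the target modules here match OPT's
# attention projections rather than the repo's cross-attention regex.
lora_config = peft.LoraConfig(target_modules=["q_proj", "v_proj"], r=32, lora_alpha=64)

# This is the call that was commented out: without it, the config object exists
# but no LoRA adapters are ever injected into the model.
lm = peft.get_peft_model(lm, lora_config)

# LoRA layers now appear among the trainable parameters.
lm.print_trainable_parameters()
```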