Kurt232 opened this issue 2 months ago
Can you be more specific about the question?
If I understand the question correctly, the freeze happens here: https://github.com/YuanGongND/ltu/blob/2002aad8305ee5579a2237a85a6e792c1174cda7/src/ltu_as/peft-main/src/peft/tuners/lora.py#L363-L377, and it is triggered when the base model is wrapped with PEFT/LoRA.
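A minimal sketch of that wrapping step, assuming the standard PEFT entry point `get_peft_model()` with a `LoraConfig` (the checkpoint path, rank, and target modules below are placeholders, not the exact arguments used in the LTU training script):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Placeholder base checkpoint, not the LTU one.
base_model = AutoModelForCausalLM.from_pretrained("path/to/base_llm")

lora_config = LoraConfig(
    r=8,                                  # illustrative rank
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # illustrative attention projections
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

# get_peft_model() builds the LoRA tuner; during construction it marks only
# the LoRA parameters as trainable (requires_grad=False on everything else),
# which is the freeze the lora.py#L363-L377 link above points to. The injected
# LoRA A/B matrices are the trainable parameters that get "added back".
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights remain trainable
```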
So we are actually adding trainable parameters back rather than explicitly freezing some parameters.
-Yuan
Note that in the LTU code, only the LLM has already been frozen.
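If it helps, a generic PyTorch check (not LTU-specific) to see which parameters end up frozen versus trainable after the model has been wrapped:

```python
# List trainable vs. frozen parameters; anything whose parameters still have
# requires_grad=True (e.g. the injected LoRA weights) shows up as trainable.
for name, param in model.named_parameters():
    status = "trainable" if param.requires_grad else "frozen"
    print(f"{status}: {name}")
```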