Closed: Free-Dreamer closed this issue 1 month ago
Another error after I fix the one above:

UnboundLocalError: local variable 'in_features' referenced before assignment

in local_lorasym_all.py:583. I believe the author did not test their code before publishing it...
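For context, an UnboundLocalError like this usually means the variable is only assigned inside one branch of an if/elif chain, and execution falls through for an unhandled layer type. The sketch below reproduces the pattern with hypothetical stand-in classes (they are illustrative only, not the repo's actual code):

```python
class Linear:
    """Stand-in for torch.nn.Linear (illustrative only)."""
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features

class Conv1D:
    """Stand-in for a layer type the code does not handle."""
    pass

def get_shape(layer):
    if isinstance(layer, Linear):
        in_features = layer.in_features
        out_features = layer.out_features
    # No fallback branch: for any other layer type, in_features is
    # never assigned, so the return below raises UnboundLocalError.
    return in_features, out_features

print(get_shape(Linear(4, 8)))  # (4, 8)
try:
    get_shape(Conv1D())
except UnboundLocalError:
    print("UnboundLocalError reproduced")
```

This is consistent with the error appearing only under some library versions: a newer transformers release can hand the wrapping code a layer class it never branches on.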
@Free-Dreamer I can run the code fine. Did you make sure the PEFT version is alright?
The required peft version is 0.5.0, but you installed 0.12.0, so of course you will get errors. Run `pip install peft==0.5.0` or `pip install -r requirements.txt` and let me know if that fixes your issue.
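One pitfall when checking this by hand: 0.12.0 is a *later* release than 0.5.0, even though the string "0.12.0" sorts before "0.5.0" lexicographically. A minimal pure-Python sketch of a correct numeric comparison (no peft import needed):

```python
def parse_version(v):
    """Split an 'X.Y.Z' version string into an integer tuple."""
    return tuple(int(part) for part in v.split("."))

required = parse_version("0.5.0")
installed = parse_version("0.12.0")

# Tuples compare component-wise, so this is a correct numeric comparison:
print(installed > required)   # True: 0.12.0 is newer than 0.5.0
print("0.12.0" > "0.5.0")     # False: plain string comparison is misleading
```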
Hi @hammoudhasan, thanks for your comments. I believe I have tried the correct version of peft. As I said, I followed the instructions to install the environment and tried different versions of peft and transformers.

Now I have followed the instructions exactly and made sure peft==0.5.0. I tried to run the demo with two scripts.
One is the script provided by the repo:
```python
from transformers import AutoModelForSequenceClassification
from LoRASYM_peft.local_peft_model_all import PeftModelForCausalLM_local
from LoRASYM_peft.local_lorasym_all import LoRASYMConfig

model = AutoModelForSequenceClassification.from_pretrained(
    'FacebookAI/roberta-large',
)

update_rule_dict = para_dict = {"update_A": False, "update_B": True,
                                "A_init": "rand", "B_init": "zero"}

lorasym_config = LoRASYMConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    modules_to_save=["classifier"],
    update_rule=update_rule_dict,
    task_type="SEQ_CLS",
)

lora_model = PeftModelForCausalLM_local(model, lorasym_config)
```
The other follows the configuration in run_glue_origin_ft.py:
```python
from transformers import AutoModelForSequenceClassification, AutoConfig
from LoRASYM_peft.local_peft_model_all import PeftModelForCausalLM_local
from LoRASYM_peft.local_lorasym_all import LoRASYMConfig

update_rule_dict = para_dict = {"update_A": False, "update_B": True,
                                "A_init": "rand", "B_init": "zero"}

config = AutoConfig.from_pretrained(
    'FacebookAI/roberta-large',
    num_labels=2,
    finetuning_task='rte',
    cache_dir='./',
    revision='main',
    token=None,
    trust_remote_code=False,
)

model = AutoModelForSequenceClassification.from_pretrained(
    'FacebookAI/roberta-large',
    config=config,
    cache_dir='./',
    revision='main',
    token=None,
    trust_remote_code=False,
    ignore_mismatched_sizes=False,
)

lorasym_config = LoRASYMConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    modules_to_save=["classifier"],
    update_rule=update_rule_dict,
    task_type="SEQ_CLS",
)

lora_model = PeftModelForCausalLM_local(model, lorasym_config)
```
Unfortunately, I still cannot run either of them successfully. You may want to try it yourself, but I will close this issue since I have found other repos that work for me now. Thanks!
Sorry, but I think this repo may not be ready to be published yet. I cannot even run the demo in the usage section or run_glue_origin_ft.py for the GLUE benchmark. When I run the demo in the usage, it raises an error in which XXX is a class of sequence classification model based on either GPT2 or roberta. When I run the scripts for the GLUE benchmark, it raises a similar error.

I followed the instructions to install the environment and tried different versions of peft and transformers. I think it is a bug in the code itself. Could you help address those issues? @Jiacheng-Zhu-AIML

For lib info: