[Open] HarryK4673 opened this issue 1 month ago
I have the same problem. Have you fixed it?
@czczup Could you please have a look at this problem? After fine-tuning with my custom data, I hit the same error at inference time, and if I add the logits_processor the model outputs garbled text.
```python
from transformers.generation.logits_process import (
    LogitsProcessorList,
    InfNanRemoveLogitsProcessor,
    MinLengthLogitsProcessor,
)

logits_processor = LogitsProcessorList()
logits_processor.append(MinLengthLogitsProcessor(15, eos_token_id=tokenizer.eos_token_id))
logits_processor.append(InfNanRemoveLogitsProcessor())
```
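For reference, here is a minimal, self-contained sketch of what this processor list does to one step of scores. The `eos_token_id=2`, the toy vocabulary of size 4, and the score values are made-up placeholders (use `tokenizer.eos_token_id` for a real model), not taken from the issue:

```python
import torch
from transformers.generation.logits_process import (
    LogitsProcessorList,
    InfNanRemoveLogitsProcessor,
    MinLengthLogitsProcessor,
)

# Same processor list as above; eos_token_id=2 is a placeholder.
logits_processor = LogitsProcessorList()
logits_processor.append(MinLengthLogitsProcessor(15, eos_token_id=2))
logits_processor.append(InfNanRemoveLogitsProcessor())

# 3 tokens generated so far (< min_length of 15), vocab of size 4.
input_ids = torch.tensor([[0, 1, 3]])
scores = torch.tensor([[0.5, float("nan"), 0.2, float("inf")]])

out = logits_processor(input_ids, scores)
# MinLengthLogitsProcessor suppresses the EOS score (index 2) so the sequence
# cannot end before 15 tokens; InfNanRemoveLogitsProcessor then replaces the
# nan and inf entries with finite values so sampling cannot crash on them.
```

This is why adding the processor list avoids the nan/inf crash; it does not by itself explain the garbled output after fine-tuning.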
This issue should solve your problem: after LoRA fine-tuning, you need to merge the adapter weights back into the base model.
Checklist
Describe the bug
Cannot use `AutoModelForCausalLM.from_pretrained` to load the model: some parameters are not loaded correctly. Cannot use `peft` to load it either, since there is no `adapter_config.json` in the folder after fine-tuning.

Reproduction
Environment
Error traceback