bigcode-project / bigcode-evaluation-harness

A framework for the evaluation of autoregressive code generation language models.
Apache License 2.0

Support for unmerged peft adapters #114

Closed cassanof closed 11 months ago

cassanof commented 1 year ago

I added support for loading PEFT adapters without merging them into the base model beforehand; instead, the merging happens in the harness. The imports are dynamic, so people won't need to install peft==0.3 unless they want to use this feature. It adds the `--peft_model` argument, which points to the PEFT adapter, while the `--model` argument is used as the base model.
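A minimal sketch of how this kind of loading path can look, assuming argument names from the PR description (`args.model`, `args.peft_model`); the surrounding function is illustrative and not the actual `main.py` code:

```python
from transformers import AutoModelForCausalLM


def load_model(args):
    # Load the base model named by --model.
    model = AutoModelForCausalLM.from_pretrained(args.model)

    if args.peft_model:
        # Import lazily so peft is only required when --peft_model is passed.
        from peft import PeftModel

        # Wrap the base model with the adapter weights from --peft_model.
        model = PeftModel.from_pretrained(model, args.peft_model)

        # Optionally fold the adapter weights into the base model so the
        # merged model behaves like a plain transformers model downstream.
        model = model.merge_and_unload()

    return model
```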

I also cleaned up the if statement in `main.py` and fixed a small bug with loading 8-bit models that was caused by a typo: it checked `args.load_in_9bit` instead of `args.load_in_8bit`.
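For context, a minimal sketch of what a corrected 8-bit check can look like, assuming the flag is forwarded to `transformers` at load time; the real `main.py` may structure this differently:

```python
from transformers import AutoModelForCausalLM


def load_base_model(args):
    kwargs = {}
    if args.load_in_8bit:  # the mistyped attribute meant this branch was never taken
        kwargs["load_in_8bit"] = True
        kwargs["device_map"] = "auto"  # 8-bit loading needs a device map
    return AutoModelForCausalLM.from_pretrained(args.model, **kwargs)
```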

I tested it with a LoRA adapter for StarCoder, and it runs just fine.

cassanof commented 1 year ago

This should also partially solve #91.