Closed — virt9 closed this 8 months ago
Hi, you can provide the path to your model in --model
instead of the HF repo name:
accelerate launch main.py \
--model PATH \
...
Thank you for your help! I have another question: may I modify the "get_prompt" function to implement few-shot evaluation on the MBPP dataset? I wonder if there is something else I need to pay attention to.
Yes, you can just change get_prompt. You can see how it's done for GSM8K, for example: https://github.com/bigcode-project/bigcode-evaluation-harness/blob/e54f33d093f342ffc0c5c057910c00aa2081515d/bigcode_eval/tasks/gsm.py#L125
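To make the idea concrete, here is a minimal sketch of what a few-shot get_prompt override could look like, in the spirit of the GSM8K task linked above. The example shots and the `"text"` field name are assumptions for illustration, not code taken from the harness or from MBPP itself:

```python
# Hypothetical few-shot examples -- these are made up for illustration,
# not drawn from the actual MBPP dataset.
FEW_SHOT_EXAMPLES = [
    {
        "text": "Write a function to add two numbers.",
        "code": "def add(a, b):\n    return a + b",
    },
    {
        "text": "Write a function to reverse a string.",
        "code": "def reverse(s):\n    return s[::-1]",
    },
]

def get_prompt(doc):
    """Prepend few-shot examples before the current problem's description.

    `doc` is assumed to be a dict with a "text" field holding the
    problem statement, as in MBPP-style records.
    """
    prompt = ""
    for shot in FEW_SHOT_EXAMPLES:
        prompt += f"{shot['text']}\n{shot['code']}\n\n"
    prompt += f"{doc['text']}\n"
    return prompt
```

Besides the prompt itself, check that the task's postprocessing (stop words / answer extraction) still matches the new prompt format, since the model's completion will follow the few-shot pattern.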
Thank you very much! I get it!
Thanks for the work! I just want to use the model I downloaded from Hugging Face (codellama-13B-python), and I don't know how to do this.