bigcode-project / bigcode-evaluation-harness

A framework for the evaluation of autoregressive code generation language models.

If I want to add my own custom prompts before each question, how should I modify the code? #230

Open ALLISWELL8 opened 7 months ago

loubnabnl commented 6 months ago

You can try using the `--prefix` argument, which adds a prefix to each prompt. Be careful, though: it might influence the generation style if you're using instruct models (e.g. the model may generate some text before the actual code, which will cause the tests to fail with the default postprocessing).
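
As a minimal sketch, a run with a custom prefix might look like this; the model name, prefix text, and generation settings are placeholders, while the other flags follow the harness README:

```bash
# Prepend a custom instruction to every HumanEval prompt.
# $'...' lets bash interpret the trailing newline in the prefix.
accelerate launch main.py \
  --model bigcode/starcoder \
  --tasks humaneval \
  --prefix $'Please complete the following function.\n' \
  --allow_code_execution
```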

We have an "instruction-tuning" version of HumanEval under HumanEvalPack, which supports different prompt templates depending on the model; see the example here, and the sketch below.
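
For illustration, a HumanEvalPack run that selects a model-specific prompt template might look like the following; the model and `--prompt` value here are examples (OctoCoder with its own template), so check the task's documentation for the templates your model supports:

```bash
# Synthesis task from HumanEvalPack with the OctoCoder prompt template.
accelerate launch main.py \
  --model bigcode/octocoder \
  --tasks humanevalsynthesize-python \
  --prompt octocoder \
  --allow_code_execution
```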