bigcode-project / bigcode-evaluation-harness

A framework for the evaluation of autoregressive code generation language models.
Apache License 2.0

add HumanEval-X metric to the HF hub and the task to the harness #6

Closed: loubnabnl closed this issue 1 year ago

loubnabnl commented 2 years ago

HumanEval-X from CodeGeeX is a multilingual version of HumanEval covering Java, JS, C++ and Go. In addition to code generation, it can also be used for code translation. We want to:

- add the HumanEval-X metric to the HF hub
- add the HumanEval-X task to the harness
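A metric on the hub would presumably expose the same interface as the existing `code_eval` metric from the `evaluate` library (pass@k over executed test cases); the snippet below is a minimal sketch of that interface, not the HumanEval-X metric itself, and the toy problem and candidates are purely illustrative.

```python
import os

# code_eval executes model-generated code, so this explicit opt-in is required.
os.environ["HF_ALLOW_CODE_EVAL"] = "1"

from evaluate import load

code_eval = load("code_eval")

# One problem with two candidate completions; references hold the test code
# that is appended to each candidate and executed.
test_cases = ["assert add(2, 3) == 5"]
candidates = [[
    "def add(a, b):\n    return a + b",   # passes
    "def add(a, b):\n    return a * b",   # fails
]]

pass_at_k, results = code_eval.compute(
    references=test_cases, predictions=candidates, k=[1, 2]
)
print(pass_at_k)  # {'pass@1': 0.5, 'pass@2': 1.0}
```

A HumanEval-X version would follow the same pattern per language, with the execution backend swapped for the corresponding language runtime.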

pacman100 commented 2 years ago

Hello, I'm interested in taking this up.

loubnabnl commented 2 years ago

Great, thanks! Let me know if I can help.

loubnabnl commented 1 year ago

@pacman100 if you haven't started working on this task, I think we can close this issue, since we are now integrating MultiPL-E instead (https://github.com/bigcode-project/bigcode-evaluation-harness/issues/12), which is similar but covers more programming languages.

pacman100 commented 1 year ago

Hello @loubnabnl, sure we can close this as I haven't started on it. Thank you.