Evaluation, benchmark, and scorecard suite targeting performance (throughput and latency), accuracy on popular evaluation harnesses, safety, and hallucination.
Failed to install dependencies of OPEA benchmark tool if Python version is higher than 3.10 #99
commit id: 71637c0
evals/benchmark/README.MD states that the supported Python version is 3.x, but installation fails on Python 3.11.
Error:

```
Collecting pyext==0.7 (from bigcode-eval@ git+https://github.com/bigcode-project/bigcode-evaluation-harness.git@e5c2f31625223431d7987f43b70b75b9d26ba118->-r /root/josh/GenAIEval/requirements.txt (line 1))
  Using cached pyext-0.7.tar.gz (7.8 kB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-7t4xdpv1/pyext_419332a2abb045ec82db85220394d51e/setup.py", line 6, in <module>
          import pyext
        File "/tmp/pip-install-7t4xdpv1/pyext_419332a2abb045ec82db85220394d51e/pyext.py", line 117, in <module>
          oargspec = inspect.getargspec
                     ^^^^^^^^^^^^^^^^^^
      AttributeError: module 'inspect' has no attribute 'getargspec'. Did you mean: 'getargs'?
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
```
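The root cause is in pyext, not pip: `inspect.getargspec` was deprecated for years and removed entirely in Python 3.11, yet `pyext.py` binds it at module import time, so the `import pyext` in its own `setup.py` fails. A minimal sketch of the incompatibility and the supported replacement, `inspect.getfullargspec` (the `example` function below is illustrative, not from pyext):

```python
import inspect

def example(a, b=1, *args, **kwargs):
    """Illustrative function; not part of pyext."""
    return a

# pyext.py does `oargspec = inspect.getargspec` at import time; that
# attribute was removed in Python 3.11, so the import raises AttributeError.
if hasattr(inspect, "getargspec"):       # Python <= 3.10
    spec = inspect.getargspec(example)
else:                                    # Python 3.11+
    spec = inspect.getfullargspec(example)

print(spec.args)  # ['a', 'b'] on either branch
```

Since pyext 0.7 is pinned by the bigcode-evaluation-harness dependency chain, the fix has to land upstream; until then the environment has to avoid Python 3.11+.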
Reproduce the issue: ensure the Python version is 3.11.
Workaround: downgrade Python to 3.10 or lower.
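Until the pyext pin is fixed upstream, the benchmark tool could fail fast with a clear message instead of the opaque `setup.py egg_info` traceback. A sketch of such a guard; `check_python_version` is a hypothetical helper, not part of GenAIEval:

```python
import sys

def check_python_version(version_info=None):
    """Hypothetical fail-fast guard: the pyext==0.7 pin breaks on 3.11+."""
    version_info = version_info or sys.version_info
    if tuple(version_info[:2]) >= (3, 11):
        raise SystemExit(
            "Python %d.%d detected: use Python 3.10 or lower until the "
            "pyext dependency is fixed." % (version_info[0], version_info[1])
        )

check_python_version((3, 10, 0))  # passes silently on 3.10
```

Running this at the top of the install or benchmark entry point would turn the AttributeError deep inside pip's subprocess into a one-line actionable message.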