pytorch / benchmark

TorchBench is a collection of open source benchmarks used to evaluate PyTorch performance.
BSD 3-Clause "New" or "Revised" License

Undocumented LLVM dependency #498

Closed · jamesr66a closed this 2 years ago

jamesr66a commented 3 years ago

When I try to run install.py, the build for tacotron2 fails because it can't find llvm-config:

Error for /data/users/jamesreed/benchmark/torchbenchmark/models/tacotron2:
---------------------------------------------------------------------------
  ERROR: Command errored out with exit status 1:
   command: /home/jamesreed/local/miniconda3/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/setup.py'"'"'; __file__='"'"'/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-6s9w112l
       cwd: /tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/
  Complete output (26 lines):
  running bdist_wheel
  /home/jamesreed/local/miniconda3/bin/python /tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/ffi/build.py
  LLVM version... Traceback (most recent call last):
    File "/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/ffi/build.py", line 105, in main_posix
      out = subprocess.check_output([llvm_config, '--version'])
    File "/home/jamesreed/local/miniconda3/lib/python3.9/subprocess.py", line 424, in check_output
      return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
    File "/home/jamesreed/local/miniconda3/lib/python3.9/subprocess.py", line 505, in run
      with Popen(*popenargs, **kwargs) as process:
    File "/home/jamesreed/local/miniconda3/lib/python3.9/subprocess.py", line 951, in __init__
      self._execute_child(args, executable, preexec_fn, close_fds,
    File "/home/jamesreed/local/miniconda3/lib/python3.9/subprocess.py", line 1821, in _execute_child
      raise child_exception_type(errno_num, err_msg, err_filename)
  FileNotFoundError: [Errno 2] No such file or directory: 'llvm-config'

  During handling of the above exception, another exception occurred:
...

    Traceback (most recent call last):
      File "/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/ffi/build.py", line 168, in <module>
        main()
      File "/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/ffi/build.py", line 158, in main
        main_posix('linux', '.so')
      File "/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/ffi/build.py", line 107, in main_posix
        raise RuntimeError("%s failed executing, please point LLVM_CONFIG "
    RuntimeError: llvm-config failed executing, please point LLVM_CONFIG to the path for llvm-config
    error: command '/home/jamesreed/local/miniconda3/bin/python' failed with exit code 1
    ----------------------------------------
ERROR: Command errored out with exit status 1: /home/jamesreed/local/miniconda3/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/setup.py'"'"'; __file__='"'"'/tmp/pip-install-gz4dfgi5/llvmlite_c211fe64d78d402eb45e8ea8aa1d1d8e/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-0764bd1d/install-record.txt --single-version-externally-managed --compile --install-headers /home/jamesreed/local/miniconda3/include/python3.9/llvmlite Check the logs for full command output.
Traceback (most recent call last):
  File "/data/users/jamesreed/benchmark/torchbenchmark/models/tacotron2/install.py", line 9, in <module>
    pip_install_requirements()
  File "/data/users/jamesreed/benchmark/torchbenchmark/models/tacotron2/install.py", line 6, in pip_install_requirements
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', '-q', '-r', 'requirements.txt'])
  File "/home/jamesreed/local/miniconda3/lib/python3.9/subprocess.py", line 373, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/home/jamesreed/local/miniconda3/bin/python', '-m', 'pip', 'install', '-q', '-r', 'requirements.txt']' returned non-zero exit status 1.

This dependency isn't documented anywhere, and it probably should be.
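For anyone hitting this before the docs are updated: the build fails inside llvmlite, which honors an `LLVM_CONFIG` environment variable before falling back to searching `PATH`. The sketch below is a simplified reconstruction of that lookup (the function name and structure are mine, not llvmlite's actual code); it shows why exporting `LLVM_CONFIG=/path/to/llvm-config`, or installing LLVM so that `llvm-config` is on `PATH`, unblocks the build.

```python
import os
import shutil
import subprocess

def llvm_version(default="llvm-config"):
    """Simplified sketch of how llvmlite's ffi/build.py locates llvm-config:
    the LLVM_CONFIG environment variable wins; otherwise fall back to PATH."""
    llvm_config = os.environ.get("LLVM_CONFIG", default)
    try:
        # This is the subprocess.check_output call visible in the traceback.
        out = subprocess.check_output([llvm_config, "--version"])
        return out.decode().strip()
    except FileNotFoundError:
        # The failure in the log above: no llvm-config found anywhere.
        return None

if __name__ == "__main__":
    print(llvm_version() or "llvm-config not found; set LLVM_CONFIG")
```

So a likely workaround (untested here) is to install an LLVM toolchain, e.g. via conda, and point `LLVM_CONFIG` at the resulting binary before re-running install.py.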

xuzhao9 commented 2 years ago

This error occurs because you are using Python 3.9. Currently, the repository only supports Python 3.7 and Python 3.8. Can you try again with Python 3.8?

jamesr66a commented 2 years ago

I don't think the Python version has anything to do with LLVM.

What you might be seeing is that LLVM is installed in the virtual env/conda environment you use for 3.8, but after switching to 3.9 that dependency is not installed. The Python version is a red herring in this scenario.

xuzhao9 commented 2 years ago

> I don't think the Python version has anything to do with LLVM.
>
> What you might be seeing is that LLVM is installed in the virtual env/conda environment you use for 3.8, but after switching to 3.9 that dependency is not installed. The Python version is a red herring in this scenario.

@jamesr66a It is because some dependencies are frozen by version locks. For example, tacotron2 pins numba==0.48. PyPI provides a pre-built numba 0.48 wheel for Python 3.8, so pip just downloads and installs the binary. On Python 3.9, however, there is no pre-built numba 0.48 wheel (the release predates Python 3.9), so pip falls back to compiling it from source, which requires LLVM.
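To make the mechanism concrete: pip matches wheels by compatibility tags such as `cp38-cp38-manylinux1_x86_64`; when no tag matches the running interpreter, it builds from the sdist. The sketch below illustrates this matching. The set of tags for numba 0.48 is my assumption based on the comment above (a wheel exists for 3.8 but not 3.9), not a verified list from PyPI.

```python
import sys

def interpreter_cp_tag() -> str:
    """Return the CPython tag pip uses to match wheels, e.g. 'cp39'."""
    return f"cp{sys.version_info.major}{sys.version_info.minor}"

# Assumed wheel tags for numba 0.48 (illustrative; see lead-in caveat).
NUMBA_048_WHEEL_TAGS = {"cp36", "cp37", "cp38"}

def has_prebuilt_wheel(tag: str) -> bool:
    """No matching tag means pip compiles from source, needing llvm-config."""
    return tag in NUMBA_048_WHEEL_TAGS

print(has_prebuilt_wheel("cp38"))  # True  -> binary install, no LLVM needed
print(has_prebuilt_wheel("cp39"))  # False -> source build, llvm-config error
```

You can also confirm this directly with `pip download numba==0.48 --only-binary=:all:`, which fails on an interpreter with no matching wheel.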

Removing the version locks (https://github.com/pytorch/benchmark/pull/787) fixes this problem. Please help review it and let me know what you think.