turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

exllamav2 Installation Error : [Errno 20] Not a directory: 'hipconfig'. #158

Closed: watchstep closed 9 months ago

watchstep commented 9 months ago

I'm trying to install exllamav2 in an environment with CUDA 11.8 and Python 3.8 (cp38). When I import exllamav2, I encounter an error: NotADirectoryError: [Errno 20] Not a directory: 'hipconfig'. I would like to resolve this issue. I have attempted all three installation methods mentioned in the README (git clone, .whl installation, pip package installation), but the same error persists.
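For context, [Errno 20] is ENOTDIR: the OS reports it when a component of an executable's path is a regular file rather than a directory, which can happen when a build script probes for ROCm by launching `hipconfig` and something on the search path is not what it expects. A hypothetical minimal reproduction of the errno (invented file names; this is not exllamav2's code):

```python
import os
import subprocess

# Create a regular file, then try to launch a program "inside" it.
# The exec fails with ENOTDIR (errno 20), which Python surfaces as
# NotADirectoryError -- the same exception class seen in this issue.
open("not_a_dir", "w").close()
try:
    subprocess.run(["./not_a_dir/hipconfig"])
except NotADirectoryError as e:
    print(e.errno)  # 20
finally:
    os.remove("not_a_dir")
```

Python maps OSError to NotADirectoryError automatically whenever errno is 20, so the exception class alone pins down the underlying OS error.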


turboderp commented 9 months ago

Are you on an AMD GPU?

watchstep commented 9 months ago

I'm running on an Nvidia GPU, not an AMD GPU.

turboderp commented 9 months ago

Are you on Torch 1.14? I'm not sure that's going to work. All development has been done on 2.x, with 2.1.0 being the current minimum requirement.
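A quick way to check this up front is to compare the installed Torch version against the minimum before installing. A small sketch (the helper name is invented, not part of exllamav2; local build tags like "+cu118" are stripped before comparison):

```python
# Hypothetical pre-flight check against the 2.1.0 minimum mentioned above.
def parse_version(v: str) -> tuple:
    core = v.split("+")[0]          # drop local tags such as "+cu118"
    return tuple(int(p) for p in core.split(".")[:3])

MIN_TORCH = (2, 1, 0)
print(parse_version("2.0.1+cu118") >= MIN_TORCH)  # False: too old
print(parse_version("2.1.0") >= MIN_TORCH)        # True
```

In a real environment you would pass `torch.__version__` to the helper instead of a literal string.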

watchstep commented 9 months ago


The PyTorch version in the development environment is 2.0.1.

turboderp commented 9 months ago

This is an odd one, I must say. I've never used Nvidia GPU Cloud, so I'm not really sure where to start debugging that. Could you say a little more about the setup?

watchstep commented 9 months ago

After updating to Python 3.10 and PyTorch 2.0.1+cu118 and installing exllamav2 with pip install, importing exllamav2 worked.
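The working setup described above can be sketched as a pip recipe (the cu118 index is PyTorch's official CUDA 11.8 wheel index; treat the exact version pin as illustrative, since this issue predates later releases):

```shell
# Recreate the environment reported to work (Python 3.10 assumed).
pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118
pip install exllamav2
```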

Thank you for making this package.