turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Update setup.py UX responses and module checks with helpful messaging #277

Open bgorlick opened 8 months ago

bgorlick commented 8 months ago

Small update to setup.py. I've reworked the user-experience messaging to make life a bit easier for new users diving in. The module dependency checks are smoother now (they require nothing beyond base Python, in case setuptools or other modules aren't installed) and include some helpful verbiage. For example, if you accidentally forget arguments, it nudges you toward the README and suggests the proper setup install command. If certain modules are missing, it tells you which ones. Previously, if you didn't have torch, the script just exited and you saw a missing torch module error.
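The kind of dependency check described above can be done with only the standard library, so it works even on a bare Python install. This is just a sketch of the idea, not the actual patch; the module list here is hypothetical:

```python
import importlib.util
import sys

# Hypothetical list of modules the build needs; the real setup.py may differ.
REQUIRED_MODULES = ["torch", "setuptools", "wheel"]

def missing_modules(required=REQUIRED_MODULES):
    """Return the subset of `required` that cannot be found.

    importlib.util.find_spec only consults the import machinery, so this
    works without actually importing (or installing) anything extra.
    """
    return [name for name in required if importlib.util.find_spec(name) is None]

def check_environment():
    """Print a friendly message listing missing modules, then exit."""
    missing = missing_modules()
    if missing:
        print("Missing required modules: " + ", ".join(missing))
        print("Try: pip install " + " ".join(missing))
        sys.exit(1)
```

The point is that the check itself never imports torch or setuptools, so the user sees one consolidated, actionable message instead of a raw ModuleNotFoundError traceback.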

turboderp commented 8 months ago

This would need a bit of testing to make sure it works with the build/release workflows. I'll look at it when I bump to 0.0.12

Ph0rk0z commented 8 months ago

So will this fail in conda when I do python setup.py install, because I do not add --user?

turboderp commented 8 months ago

The --user argument has been useful for me, but I think it depends on how your environment is set up. Sadly there are many ways to go about it, and in my current setup --user actually fails, I think because I'm not using Conda?

There are also a lot of deprecation warnings, so invoking the setup.py script directly has to stop at some point anyway. I'm leaning towards pip install . as the new standard approach. It's been working for me recently on both Linux (at least with a Python venv) and Windows.

Ph0rk0z commented 8 months ago

You're right, it is the future. llama.cpp python and others use that now. Oftentimes I have to edit setup files anyway, because many ship hard-pinned dependencies.