turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Build on Ubuntu 20.04 for releases #360

Closed Omegastick closed 6 months ago

Omegastick commented 7 months ago

I haven't actually tested this (I think I would need credentials), but the other workflows are already running on Ubuntu 20.04, so I think it'll work.

See #338 for details. Wheels built on 20.04 link against an older glibc, so they should stay compatible with 22.04 and any newer distributions for a good while.

Docs on GitHub runners: https://docs.github.com/en/actions/using-jobs/choosing-the-runner-for-a-job#choosing-github-hosted-runners
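For reference, the change amounts to pinning the runner image in the release workflow. A minimal sketch of what that looks like (the workflow file name and job name here are hypothetical, not taken from this repo):

```yaml
# .github/workflows/build.yml (hypothetical file and job names)
jobs:
  build_wheels:
    # Pin to ubuntu-20.04 instead of ubuntu-latest so the resulting
    # wheels link against an older glibc and run on 20.04 and newer.
    runs-on: ubuntu-20.04
```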

turboderp commented 6 months ago

I'll try this, but getting the dependencies right on the build servers is sometimes a little tricky, so we'll see how it goes.

Omegastick commented 6 months ago

Seems to have worked. The 0.0.16 wheel is running on my WSL2 Ubuntu 20.04 box.