mlabonne / llm-autoeval

Automatically evaluate your LLMs in Google Colab
MIT License

Feature request - Local GPUs #7

Open maziyarpanahi opened 8 months ago

maziyarpanahi commented 8 months ago

Thanks @mlabonne for sharing this personal repo, dead simple! I just wanted to say it would be great if it could support local GPUs, especially via vLLM running in a container.

Thanks again, great job.
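For context, the kind of local setup I have in mind is vLLM's official Docker image serving an OpenAI-compatible endpoint that the eval harness could point at. A rough sketch (assumes Docker with the NVIDIA Container Toolkit installed; the model name is just an example):

```shell
# Serve a model locally on GPU via vLLM's OpenAI-compatible server.
# Requires: Docker + NVIDIA Container Toolkit, and a GPU with enough VRAM.
docker run --gpus all \
  -p 8000:8000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  vllm/vllm-openai:latest \
  --model mistralai/Mistral-7B-Instruct-v0.2
```

The evaluation could then target `http://localhost:8000/v1` instead of a Colab-hosted backend.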

mlabonne commented 8 months ago

Thanks @maziyarpanahi! No plans at the moment since that's not how I use this tool, but I'll keep it in mind.