turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Added steps to benchmark in README #488

Closed · RodriMora closed 3 months ago

RodriMora commented 3 months ago

Added steps to the README to install the missing requirements and to run the mmlu.py script.

turboderp commented 3 months ago

Oh wow, I clicked the wrong button. :cat2: Sorry.