turboderp / exllamav2
A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License
3.19k stars · 234 forks
Fixed minor typo in convert.md doc
#463
Closed
iamrohitanshu
closed
1 month ago
iamrohitanshu
commented
1 month ago
Changed '64 GB or RAM' to '64 GB of RAM'.