mlabonne / llm-autoeval

Automatically evaluate your LLMs in Google Colab
MIT License
527 stars, 82 forks

Multiple GPUs work in Colab #6

Open tdolan21 opened 8 months ago

tdolan21 commented 8 months ago

I just wanted to let you know that I tested the multiple-GPU feature and it works great. This is by far the easiest way to evaluate 8x7B models. Thanks!
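For context, here is a minimal sketch of what a multi-GPU evaluation of an 8x7B model can look like with lm-evaluation-harness, one of the backends this kind of Colab setup can drive; the model id and task list below are illustrative assumptions, not details taken from this thread or from llm-autoeval's own scripts.

```python
# Illustrative sketch only: shard a large model across all visible GPUs
# and run a couple of benchmarks via the lm-evaluation-harness CLI.
import subprocess
import torch

# Confirm how many GPUs the runtime exposes (e.g. 2x A100 in Colab).
print(f"Visible GPUs: {torch.cuda.device_count()}")

subprocess.run(
    [
        "lm_eval",
        "--model", "hf",
        # parallelize=True spreads the model's layers across all visible GPUs;
        # the pretrained id here is just an example of an 8x7B checkpoint.
        "--model_args",
        "pretrained=mistralai/Mixtral-8x7B-Instruct-v0.1,parallelize=True",
        "--tasks", "arc_challenge,hellaswag",
        "--batch_size", "auto",
    ],
    check=True,
)
```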

mlabonne commented 8 months ago

Thanks @tdolan21 for beta testing. I'll update the README.