mlabonne / llm-autoeval

Automatically evaluate your LLMs in Google Colab
MIT License

Only one GPU is used during the autoeval #15

Open LMSPaul opened 8 months ago

LMSPaul commented 8 months ago

Hi! Great tool!

I attempted to run the autoeval on a dual RTX 3090 setup on RunPod, but it appeared that only the first GPU was utilized throughout the evaluation process.

I'm uncertain whether the second GPU was genuinely inactive or if RunPod simply did not display its activity.
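One way to check this independently of RunPod's dashboard is to poll `nvidia-smi` directly while the evaluation runs. Below is a minimal sketch (not part of llm-autoeval; the `parse_utilization` helper is a hypothetical name introduced here) that queries per-GPU utilization:

```python
import subprocess

def parse_utilization(csv_text: str) -> list[int]:
    """Parse the output of
    'nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits',
    which is one integer percentage per line, one line per GPU."""
    return [int(line.strip()) for line in csv_text.strip().splitlines() if line.strip()]

def gpu_utilization() -> list[int]:
    """Return the current utilization percentage of each visible GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_utilization(out)

if __name__ == "__main__":
    # On a dual-GPU box, a second entry stuck at 0 during the eval
    # would confirm the GPU is genuinely idle, not just hidden by the UI.
    for idx, util in enumerate(gpu_utilization()):
        print(f"GPU {idx}: {util}% utilization")
```

Running this in a loop (or simply watching `nvidia-smi` in a second terminal) during the eval would show whether the second 3090 ever does any work.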