SJTU-IPADS / PowerInfer

High-speed Large Language Model Serving on PCs with Consumer-grade GPUs

Does PowerInfer support multi-GPU? #163

Closed. LHQUer closed this issue 7 months ago.

LHQUer commented 8 months ago

Question Details

Does PowerInfer support multi-GPU? I ran this project on a machine with 8 RTX 3090 GPUs, but found that only one GPU was used during the inference task.
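
Per-GPU activity during inference can be confirmed with a short pynvml script like the one below. This is a minimal verification sketch, not part of PowerInfer; it assumes the `nvidia-ml-py` (pynvml) package is installed and only reports what each device is doing while the model is running.

```python
import pynvml

# Query utilization and memory use for every visible GPU.
pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: util={util.gpu}% mem_used={mem.used / 1024**2:.0f} MiB")
finally:
    pynvml.nvmlShutdown()
```

Running this while PowerInfer is serving shows nonzero utilization and memory use only on GPU 0; the other seven devices stay idle.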


hodlen commented 7 months ago

Please kindly refer to #152