punica-ai / punica

Serving multiple LoRA finetuned LLM as one
https://arxiv.org/abs/2310.18547
Apache License 2.0

Multi GPU and Multi Node solution #19

Open luciferlinx101 opened 10 months ago

luciferlinx101 commented 10 months ago

I wanted to know how to use multi-GPU and multi-node setups with the current Punica code. I also wanted to ask about the runner and scheduler code mentioned in the paper; if it is implemented, can you point me to it?

luciferlinx101 commented 10 months ago

@abcdabcd987 Your paper states that the Punica implementation consists of two parts: a Python library on top of PyTorch that runs large language models on a single GPU, and other system components to support model serving across a GPU cluster. I was looking for the cluster-level solution, i.e. multi-GPU or multi-node serving.
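For context on what the cluster-level part could look like: the paper describes per-GPU runners plus a cluster scheduler that routes requests to GPUs, preferring ones that already have the requested LoRA adapter resident. The sketch below is purely illustrative; the `Runner`/`Scheduler` names, the round-robin-style least-loaded fallback, and the locality policy are my assumptions, not Punica's actual API.

```python
from dataclasses import dataclass, field
from typing import List, Set

# Hypothetical sketch of scheduler/runner roles; NOT Punica's real code.

@dataclass
class Runner:
    """One runner per GPU: base model weights plus resident LoRA adapters."""
    gpu_id: int
    loaded_loras: Set[str] = field(default_factory=set)
    queue: List[str] = field(default_factory=list)

    def submit(self, request_id: str, lora_name: str) -> None:
        # LoRA adapters are small relative to the base model, so one
        # runner can keep many adapters loaded and batch across them.
        self.loaded_loras.add(lora_name)
        self.queue.append(request_id)

class Scheduler:
    """Cluster-level scheduler: routes each request to a runner.

    A real scheduler would weigh queue depth, memory headroom, and
    adapter locality; here we just prefer a runner that already has
    the requested LoRA, falling back to the least-loaded runner.
    """
    def __init__(self, runners: List[Runner]):
        self.runners = runners

    def dispatch(self, request_id: str, lora_name: str) -> int:
        candidates = [r for r in self.runners if lora_name in r.loaded_loras]
        target = min(candidates or self.runners, key=lambda r: len(r.queue))
        target.submit(request_id, lora_name)
        return target.gpu_id

# Usage: two GPUs, requests for two adapters.
sched = Scheduler([Runner(0), Runner(1)])
print(sched.dispatch("req-1", "lora-a"))  # least loaded -> GPU 0
print(sched.dispatch("req-2", "lora-b"))  # least loaded -> GPU 1
print(sched.dispatch("req-3", "lora-a"))  # adapter locality -> GPU 0
```

The point of the sketch is the division of labor: the single-GPU library (the part in this repo) handles batched LoRA inference, while the scheduler/runner layer is a separate routing problem.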

luciferlinx101 commented 10 months ago

Hey @abcdabcd987 any update on this?

shenoyvvarun commented 7 months ago

+1

abcdabcd987 commented 7 months ago

Sorry, I still haven't had time to clean up the code. But here's some old code if you need it right now:

Neither is usable out of the box.