fishaudio / fish-speech

Brand new TTS solution
https://speech.fish.audio

Does this project support multi-gpu inference? #641

Open Kingdroper opened 4 weeks ago

Kingdroper commented 4 weeks ago

Self Checks

1. Is this request related to a challenge you're experiencing? Tell us your story.

Does this project support multi-gpu inference? Can you give me some potential solutions?

2. What is your suggested solution?

NO

3. Additional context or comments

No response

4. Can you help us with this feature?

Stardust-minus commented 3 weeks ago

Currently, we don't support multi-GPU inference. However, you can still deploy multiple instances, one per card, and run a load balancer in front of them to achieve parallel inference.
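A minimal sketch of the load-balancing idea: one API server per GPU (each pinned to its card via `CUDA_VISIBLE_DEVICES` when launched), with a round-robin dispatcher spreading requests across them. The port numbers and the server launch command are assumptions for illustration, not the project's actual CLI.

```python
import itertools

# Hypothetical instance URLs, one fish-speech API server per GPU.
# Assumed launch, one per card (command name is illustrative):
#   CUDA_VISIBLE_DEVICES=0 python server.py --port 8080
#   CUDA_VISIBLE_DEVICES=1 python server.py --port 8081
#   ...
INSTANCES = [f"http://127.0.0.1:{8080 + gpu}" for gpu in range(4)]

_cycle = itertools.cycle(INSTANCES)

def pick_instance() -> str:
    """Round-robin: each request goes to the next instance in turn."""
    return next(_cycle)

# Eight requests spread evenly across the four instances.
targets = [pick_instance() for _ in range(8)]
print(targets)
```

In practice you would put nginx or any HTTP load balancer in front instead of hand-rolling the dispatch, but the principle is the same: parallelism comes from independent single-GPU processes, not from one multi-GPU process.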

Whale-Dolphin commented 3 weeks ago

You can also rewrite the inference script yourself to use `subprocess`, spawning one process per GPU.
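A sketch of that subprocess approach: split the inputs into one chunk per GPU and run each chunk in a child process pinned to its card via `CUDA_VISIBLE_DEVICES`. The real inference command would be the project's own script; here a stub command is substituted so the sketch stays self-contained and runnable.

```python
import os
import subprocess

def run_on_gpus(texts, num_gpus, cmd_template):
    """Split `texts` into one chunk per GPU and run each chunk in a
    separate subprocess pinned to its GPU via CUDA_VISIBLE_DEVICES."""
    chunks = [texts[i::num_gpus] for i in range(num_gpus)]
    procs = []
    for gpu, chunk in enumerate(chunks):
        if not chunk:
            continue
        env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))
        # cmd_template would be the actual inference invocation, e.g.
        # ["python", "infer.py", ...] (hypothetical script name); the
        # chunk's texts are appended as command-line arguments.
        procs.append(subprocess.Popen(cmd_template + chunk, env=env))
    # Wait for all children and collect their exit codes.
    return [p.wait() for p in procs]

# Demo with a stub command in place of the real inference script.
codes = run_on_gpus(
    ["hello", "world", "foo"],
    num_gpus=2,
    cmd_template=["python", "-c", "import sys; print(sys.argv[1:])"],
)
print(codes)  # [0, 0]
```

Each child process sees only its assigned GPU, so the unmodified single-GPU inference code runs unchanged in every worker.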