durant1999 opened this issue 8 months ago
SOS....
Same issue here. Have you solved it?
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
As I mentioned above, when I try to start two vLLM api_servers, both using tensor parallelism (tensor-parallel-size = 2), the SECOND api_server fails to start, while the first one runs fine. Also, running the `top` command shows many `ray::IDLE` processes.
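For what it's worth, the `ray::IDLE` entries in `top` are just Ray worker processes waiting for work, so seeing some of them is normal. One thing that is sometimes suggested when a second tensor-parallel server won't come up is to give each api_server its own GPU set via `CUDA_VISIBLE_DEVICES`, so the two servers don't contend for the same GPUs or the same Ray instance. Below is a minimal sketch only; the model name, ports, and GPU indices are placeholders (assuming a single machine with 4 GPUs), not taken from the original report, and this is not a confirmed fix for this issue.

```bash
# Sketch: run two tensor-parallel (size 2) api_servers on disjoint GPU sets.
# Model name, ports, and GPU indices below are illustrative assumptions.

# First server on GPUs 0 and 1, listening on port 8000
CUDA_VISIBLE_DEVICES=0,1 python -m vllm.entrypoints.api_server \
    --model facebook/opt-13b \
    --tensor-parallel-size 2 \
    --port 8000 &

# Second server on GPUs 2 and 3, listening on port 8001
CUDA_VISIBLE_DEVICES=2,3 python -m vllm.entrypoints.api_server \
    --model facebook/opt-13b \
    --tensor-parallel-size 2 \
    --port 8001 &
```

If the second server still hangs or crashes with this kind of isolation, posting the exact launch commands, vLLM version, and the error/log output from the second server would make it easier to diagnose.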