lm-sys / FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Apache License 2.0

How to use multiple Ascend NPUs? #3060

Open litmonk opened 4 months ago

litmonk commented 4 months ago

Why does the `--device npu` parameter hard-code a single Ascend NPU instead of supporting multiple NPUs?

```python
def generate_stream_gate(self, params):
    if self.device == "npu":
        import torch_npu
        torch_npu.npu.set_device("npu:0")
```
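One way to avoid the hard-coded `npu:0` would be to resolve the device index at runtime. A minimal sketch, assuming a hypothetical `device_id` key in `params` and a hypothetical `FASTCHAT_NPU_ID` environment variable (neither exists in FastChat today):

```python
import os

def resolve_npu_device(params):
    """Pick an NPU index instead of hard-coding npu:0.

    "device_id" and FASTCHAT_NPU_ID are illustrative names, not part of
    FastChat's actual params dict or configuration.
    """
    device_id = params.get("device_id")
    if device_id is None:
        device_id = int(os.environ.get("FASTCHAT_NPU_ID", "0"))
    return f"npu:{int(device_id)}"

# The hard-coded call in generate_stream_gate could then become:
#     import torch_npu
#     torch_npu.npu.set_device(resolve_npu_device(params))
```

This only selects one NPU per worker process; using several NPUs for one model is a separate problem (see below in the thread).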

QuentinWang1 commented 3 months ago

Same issue. Has this been solved?

n1vk commented 1 month ago

I have the same issue, and hardcoding to multiple NPUs won't work either.

zxrneu commented 1 month ago

Same issue here.

zxrneu commented 1 month ago

[screenshots] `ASCEND_RT_VISIBLE_DEVICES=1`

n1vk commented 3 weeks ago

> [screenshots] `ASCEND_RT_VISIBLE_DEVICES=1`

This only changes which single NPU is used; it does not make the worker use multiple NPUs.