Open litmonk opened 4 months ago
Same issue. Has this been solved?
I have the same issue, and hardcoding to multiple NPUs won't work either.
Same problem here.
ASCEND_RT_VISIBLE_DEVICES=1
Setting ASCEND_RT_VISIBLE_DEVICES=1 only changes which single NPU is used; it does not enable using multiple NPUs.
Why is the --device npu parameter hardcoded in the code to support only a single Ascend NPU rather than multiple NPUs?
def generate_stream_gate(self, params):
    if self.device == "npu":
        import torch_npu
        # The device index is hardcoded, so only the first visible NPU is ever used.
        torch_npu.npu.set_device("npu:0")
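If it helps the discussion, here is a minimal sketch of how that hardcoded index could be made configurable. The `bind_npu` helper and its `device_index` parameter are assumptions for illustration only, not the project's actual API; note that `torch_npu.npu.set_device` still binds only one NPU per process, so true multi-NPU inference would additionally require sharding the model or running one worker per card.

```python
import torch_npu  # Ascend adapter for PyTorch; provides torch_npu.npu.*


def bind_npu(device_index: int = 0) -> str:
    """Bind the current process to a single Ascend NPU.

    `device_index` is a hypothetical parameter used here for illustration:
    the default of 0 mirrors the current hardcoded behaviour, and after
    ASCEND_RT_VISIBLE_DEVICES remapping, logical index 0 is the first
    visible physical card.
    """
    device = f"npu:{device_index}"
    torch_npu.npu.set_device(device)
    return device
```

Even with a change like this, each worker process is still bound to one NPU; ASCEND_RT_VISIBLE_DEVICES (or an explicit index) only selects which card that is.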