Ascend / pytorch

Ascend PyTorch adapter (torch_npu). Mirror of https://gitee.com/ascend/pytorch
https://ascend.github.io/docs/

replace -> repalce, a harmless typo #50

Open cillinzhang opened 2 months ago

cillinzhang commented 2 months ago

grep -nR repalce ./

./pytorch/torch_npu/csrc/aten/common/ToKernelNpu.cpp:139: "dtype cast repalce with float.");
./pytorch/torch_npu/csrc/framework/utils/OpPreparation.cpp:274: ASCEND_LOGW("NPU don't support create double dtype tensor with inner format, repalce with base format.");
./pytorch/test/fx/test_subgraph_rewriter.py:732: found_repalcement_node = False
./pytorch/test/fx/test_subgraph_rewriter.py:735: found_repalcement_node = True
./pytorch/test/fx/test_subgraph_rewriter.py:738: self.assertTrue(found_repalcement_node)
./pytorch/test/fx/test_subgraph_rewriter.py:801: repalcement_node_found = 0
./pytorch/test/fx/test_subgraph_rewriter.py:804: repalcement_node_found += 1
./pytorch/test/fx/test_subgraph_rewriter.py:806: self.assertEqual(repalcement_node_found, 2)
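
For reference, a minimal sketch of a bulk fix (not part of the original report). It assumes GNU grep/sed and that every "repalce"/"repalcement" hit listed above is safe to rename; the misspelled Python identifiers in test_subgraph_rewriter.py are local to that test file, so renaming them consistently in one pass keeps the tests self-consistent.

# List files containing the typo (null-separated) and substitute in place.
# "repalcement" is covered too, since "repalce" is a substring.
grep -rlZ repalce ./pytorch | xargs -0 sed -i 's/repalce/replace/g'

# Re-run the original search to confirm nothing is left:
grep -nR repalce ./pytorch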