aiwinapp closed this issue 10 months ago
@konstik37 Unfortunately, PyTorch lacks fp16 support for many CPU-based ops. If you have a CUDA-capable GPU available, I suggest running your code on that device; otherwise, you should be able to run inference on CPU in fp32.
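A minimal sketch of the fallback described above: use fp16 on a CUDA device when one is available, and drop to fp32 on CPU, where many fp16 kernels are missing (the tensor shapes here are just illustrative):

```python
import torch

# Prefer fp16 on GPU; fall back to fp32 on CPU, since many
# CPU ops in PyTorch have no fp16 implementation.
if torch.cuda.is_available():
    device, dtype = torch.device("cuda"), torch.float16
else:
    device, dtype = torch.device("cpu"), torch.float32

x = torch.randn(4, 4, device=device, dtype=dtype)
y = torch.matmul(x, x)  # runs in whichever device/dtype was selected
```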
Thanks. I fixed it by adding a conversion to float32:
self.weight = Parameter(self.weight.to(dtype=torch.float32))
if self.bias is not None:
    self.bias = Parameter(self.bias.to(dtype=torch.float32))
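As an alternative to casting each `Parameter` by hand, `nn.Module.float()` converts every parameter and buffer of a module to float32 in one call. A hedged sketch (the `nn.Linear` here is a stand-in for a model loaded in fp16, not the actual model from this issue):

```python
import torch
from torch import nn

# Stand-in for a model whose weights were loaded in fp16.
model = nn.Linear(8, 8).half()

# nn.Module.float() casts all parameters and buffers to torch.float32,
# equivalent to the per-parameter fix above but applied recursively.
model = model.float()

out = model(torch.randn(2, 8))  # fp32 input now matches fp32 weights
```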
@kauterry @cndn fyi
@konstik37 Where did you put this code?
Hello. I have a problem running
python3 scripts/m4t/predict/predict.py привет t2tt eng --src_lang rus
Any ideas on how to solve this?