ogkalu2 / comic-translate

Desktop app for automatically translating comics - BDs, Manga, Manhwa, Fumetti and more in a variety of formats (Image, Pdf, Epub, cbr, cbz, etc) and in multiple languages.

Cuda error #5

Closed: bropines closed this issue 8 months ago

bropines commented 8 months ago


I just installed the CUDA build of PyTorch and started the translation.

╰─ python comic.py

An Error occurred: Could not run 'torchvision::nms' with arguments from the 'CUDA' backend. This could be because the operator doesn't exist for this backend, or was omitted during the selective/custom build process (if using custom build). If you are a Facebook employee using PyTorch on mobile, please visit https://fburl.com/ptmfixes for possible resolutions. 'torchvision::nms' is only available for these backends: [CPU, QuantizedCPU, BackendSelect, Python, FuncTorchDynamicLayerBackMode, Functionalize, Named, Conjugate, Negative, ZeroTensor, ADInplaceOrView, AutogradOther, AutogradCPU, AutogradCUDA, AutogradXLA, AutogradMPS, AutogradXPU, AutogradHPU, AutogradLazy, AutogradMeta, Tracer, AutocastCPU, AutocastCUDA, FuncTorchBatched, FuncTorchVmapMode, Batched, VmapMode, FuncTorchGradWrapper, PythonTLSSnapshot, FuncTorchDynamicLayerFrontMode, PreDispatch, PythonDispatcher].

CPU: registered at C:\actions-runner\_work\vision\vision\pytorch\vision\torchvision\csrc\ops\cpu\nms_kernel.cpp:112 [kernel]
QuantizedCPU: registered at C:\actions-runner\_work\vision\vision\pytorch\vision\torchvision\csrc\ops\quantized\cpu\qnms_kernel.cpp:124 [kernel]
BackendSelect: fallthrough registered at ..\aten\src\ATen\core\BackendSelectFallbackKernel.cpp:3 [backend fallback]
Python: registered at ..\aten\src\ATen\core\PythonFallbackKernel.cpp:153 [backend fallback]
FuncTorchDynamicLayerBackMode: registered at ..\aten\src\ATen\functorch\DynamicLayer.cpp:498 [backend fallback]
Functionalize: registered at ..\aten\src\ATen\FunctionalizeFallbackKernel.cpp:290 [backend fallback]
Named: registered at ..\aten\src\ATen\core\NamedRegistrations.cpp:7 [backend fallback]
Conjugate: registered at ..\aten\src\ATen\ConjugateFallback.cpp:17 [backend fallback]
Negative: registered at ..\aten\src\ATen\native\NegateFallback.cpp:19 [backend fallback]
ZeroTensor: registered at ..\aten\src\ATen\ZeroTensorFallback.cpp:86 [backend fallback]
ADInplaceOrView: fallthrough registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:86 [backend fallback]
AutogradOther: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:53 [backend fallback]
AutogradCPU: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:57 [backend fallback]
AutogradCUDA: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:65 [backend fallback]
AutogradXLA: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:69 [backend fallback]
AutogradMPS: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:77 [backend fallback]
AutogradXPU: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:61 [backend fallback]
AutogradHPU: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:90 [backend fallback]
AutogradLazy: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:73 [backend fallback]
AutogradMeta: registered at ..\aten\src\ATen\core\VariableFallbackKernel.cpp:81 [backend fallback]
Tracer: registered at ..\torch\csrc\autograd\TraceTypeManual.cpp:296 [backend fallback]
AutocastCPU: fallthrough registered at ..\aten\src\ATen\autocast_mode.cpp:382 [backend fallback]
AutocastCUDA: fallthrough registered at ..\aten\src\ATen\autocast_mode.cpp:249 [backend fallback]
FuncTorchBatched: registered at ..\aten\src\ATen\functorch\LegacyBatchingRegistrations.cpp:710 [backend fallback]
FuncTorchVmapMode: fallthrough registered at ..\aten\src\ATen\functorch\VmapModeRegistrations.cpp:28 [backend fallback]
Batched: registered at ..\aten\src\ATen\LegacyBatchingRegistrations.cpp:1075 [backend fallback]
VmapMode: fallthrough registered at ..\aten\src\ATen\VmapModeRegistrations.cpp:33 [backend fallback]
FuncTorchGradWrapper: registered at ..\aten\src\ATen\functorch\TensorWrapper.cpp:203 [backend fallback]
PythonTLSSnapshot: registered at ..\aten\src\ATen\core\PythonFallbackKernel.cpp:161 [backend fallback]
FuncTorchDynamicLayerFrontMode: registered at ..\aten\src\ATen\functorch\DynamicLayer.cpp:494 [backend fallback]
PreDispatch: registered at ..\aten\src\ATen\core\PythonFallbackKernel.cpp:165 [backend fallback]
PythonDispatcher: registered at ..\aten\src\ATen\core\PythonFallbackKernel.cpp:157 [backend fallback]
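A quick way to narrow an error like this down is to check from Python whether both torch and torchvision are CUDA builds; a minimal sketch of such a check:

import torch
import torchvision

# A matching pair of CUDA wheels normally carries a +cu... suffix in both version strings.
print("torch:", torch.__version__)                 # e.g. 2.1.2+cu121
print("torchvision:", torchvision.__version__)     # a CPU-only build has no +cu... tag
print("built against CUDA:", torch.version.cuda)   # None for a CPU-only torch build
print("CUDA available:", torch.cuda.is_available())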
ogkalu2 commented 8 months ago

@bropines run "pip freeze" in the cmd and show me what it says for torch and torchvision (and torchaudio too, if you have that installed).

Should be something like this: torch==2.1.0+cu121 torchvision==0.16.0+cu121

Do both have +cu... after the version number?

bropines commented 8 months ago


╰─ pip freeze
annotated-types==0.6.0
anyio==4.2.0
astor==0.8.1
attrdict==2.0.1
azure-ai-vision-imageanalysis==1.0.0b1
azure-core==1.29.7
Babel==2.14.0
bce-python-sdk==0.9.2
beautifulsoup4==4.12.3
blinker==1.7.0
Brotli==1.1.0
cachetools==5.3.2
certifi==2023.11.17
charset-normalizer==3.3.2
click==8.1.7
colorama==0.4.6
contourpy==1.2.0
cssselect==1.2.0
cssutils==2.9.0
cycler==0.12.1
Cython==3.0.8
dearpygui==1.10.1
decorator==5.1.1
deep-translator==1.11.4
deepl==1.16.1
Deprecated==1.2.14
distro==1.9.0
easyocr==1.7.1
EbookLib==0.18
emoji==2.10.0
et-xmlfile==1.1.0
exceptiongroup==1.2.0
filelock==3.13.1
fire==0.5.0
Flask==3.0.1
flask-babel==4.0.0
fonttools==4.47.2
fsspec==2023.12.2
fugashi==1.3.0
future==0.18.3
h11==0.14.0
httpcore==1.0.2
httpx==0.26.0
huggingface-hub==0.20.3
idna==3.6
imageio==2.33.1
img2pdf==0.5.1
imgaug==0.4.0
inflate64==1.0.0
isodate==0.6.1
itsdangerous==2.1.2
jaconv==0.3.4
Jinja2==3.1.3
kiwisolver==1.4.5
largestinteriorrectangle==0.2.0
lazy_loader==0.3
llvmlite==0.41.1
lmdb==1.4.1
loguru==0.7.2
lxml==5.1.0
MarkupSafe==2.1.4
matplotlib==3.8.2
mpmath==1.3.0
multivolumefile==0.2.3
networkx==3.2.1
ninja==1.11.1.1
numba==0.58.1
numpy==1.26.3
openai==1.8.0
opencv-contrib-python==4.6.0.66
opencv-python==4.6.0.66
opencv-python-headless==4.9.0.80
openpyxl==3.1.2
opt-einsum==3.3.0
packaging==23.2
paddleocr==2.7.0.3
paddlepaddle==2.5.2
pandas==2.2.0
pdf2docx==0.5.8
pikepdf==8.11.2
pillow==10.2.0
premailer==3.10.0
protobuf==3.20.2
psutil==5.9.8
py-cpuinfo==9.0.0
py7zr==0.20.8
pybcj==1.0.2
pyclipper==1.3.0.post5
pycryptodome==3.20.0
pycryptodomex==3.20.0
pydantic==2.5.3
pydantic_core==2.14.6
PyMuPDF==1.20.2
pyparsing==3.1.1
pyppmd==1.1.0
python-bidi==0.4.2
python-dateutil==2.8.2
python-docx==1.1.0
pytz==2023.3.post1
PyYAML==6.0.1
pyzstd==0.15.9
rapidfuzz==3.6.1
rarfile==4.1
regex==2023.12.25
requests==2.31.0
safetensors==0.4.2
scikit-image==0.22.0
scipy==1.12.0
seaborn==0.13.2
shapely==2.0.2
six==1.16.0
sniffio==1.3.0
soupsieve==2.5
stanza==1.7.0
sympy==1.12
termcolor==2.4.0
texttable==1.7.0
thop==0.1.1.post2209072238
tifffile==2023.12.9
tokenizers==0.15.1
toml==0.10.2
torch==2.1.2+cu121
torchaudio==2.1.2+cu121
torchvision==0.16.2
tqdm==4.66.1
transformers==4.36.2
typing_extensions==4.9.0
tzdata==2023.4
ultralytics==8.1.3
unidic-lite==1.0.8
urllib3==2.1.0
visualdl==2.5.3
Werkzeug==3.0.1
wget==3.2
win32-setctime==1.1.0
wrapt==1.16.0
bropines commented 8 months ago

The problem is that without the CUDA version of PyTorch everything works fine (well, as long as you don't press the GPU button).

ogkalu2 commented 8 months ago

OK. For whatever reason, it looks like the CUDA version of torchvision was not installed (no +cu121).

Run pip uninstall torchvision, then explicitly install the CUDA version with pip install torchvision==0.16.0+cu121 -f https://download.pytorch.org/whl/torch_stable.html

Try this and let me know if it works.
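One way to confirm the reinstall took effect, assuming a CUDA-capable GPU, is to run torchvision's NMS op on CUDA tensors directly; a minimal sketch with arbitrary dummy boxes:

import torch
import torchvision
from torchvision.ops import nms

# Both version strings should now carry the +cu... tag.
print(torch.__version__, torchvision.__version__)

# Two overlapping dummy boxes on the GPU; nms returns the indices of the boxes it keeps.
boxes = torch.tensor([[0., 0., 10., 10.], [1., 1., 11., 11.]], device="cuda")
scores = torch.tensor([0.9, 0.8], device="cuda")
print(nms(boxes, scores, iou_threshold=0.5))  # prints kept indices instead of raising the 'torchvision::nms' backend error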

bropines commented 8 months ago

> OK. For whatever reason, it looks like the CUDA version of torchvision was not installed (no +cu121).
>
> Run pip uninstall torchvision, then explicitly install the CUDA version with pip install torchvision==0.16.0+cu121 -f https://download.pytorch.org/whl/torch_stable.html
>
> Try this and let me know if it works.

It worked. It's strange that they could have changed this in 0.16.2 compared to 0.16.0.
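As a side note, a mismatch like this is usually avoided by installing the three packages together from the same CUDA index, e.g. something like pip install torch==2.1.2+cu121 torchvision==0.16.2+cu121 torchaudio==2.1.2+cu121 -f https://download.pytorch.org/whl/torch_stable.html (the version tags here assume the cu121 wheels; adjust them to match your CUDA setup).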