Closed — Xjunjiang closed this issue 5 years ago
We have included ninja in the project. Can you run `ninja --version` in the root folder of the project?
It doesn't work, maybe because I'm on a Windows 10 system. Thank you very much! We will continue debugging.
Then you should install ninja on your system first.
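Before re-running the script, it can help to confirm that the `ninja` executable is actually reachable from the Python environment. A minimal sketch of such a check (it mirrors what PyTorch's `verify_ninja_availability()` tests for, i.e. that `ninja` is on `PATH` and runs; the helper name `ninja_available` is my own):

```python
import shutil
import subprocess

def ninja_available():
    """Return True if the `ninja` executable is on PATH and runs."""
    # shutil.which resolves the executable the same way the shell would
    if shutil.which("ninja") is None:
        return False
    try:
        subprocess.run(["ninja", "--version"],
                       check=True, capture_output=True)
        return True
    except (OSError, subprocess.CalledProcessError):
        return False

if ninja_available():
    print("ninja found")
else:
    print("ninja missing; try `pip install ninja` or add it to PATH")
```

On Windows, ninja installed via `pip install ninja` lands in the virtualenv's `Scripts` directory, so the check should be run from the same activated environment that runs the training scripts.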
Traceback (most recent call last):
  File "scripts/prepare_ade20k.py", line 6, in <module>
    from encoding.utils import download, mkdir
  File "E:\PycharmProject\venv\lib\site-packages\encoding\__init__.py", line 13, in <module>
    from . import nn, functions, dilated, parallel, utils, models, datasets
  File "E:\PycharmProject\venv\lib\site-packages\encoding\nn\__init__.py", line 12, in <module>
    from .syncbn import *
  File "E:\PycharmProject\venv\lib\site-packages\encoding\nn\syncbn.py", line 23, in <module>
    from ..functions import *
  File "E:\PycharmProject\venv\lib\site-packages\encoding\functions\__init__.py", line 2, in <module>
    from .syncbn import *
  File "E:\PycharmProject\venv\lib\site-packages\encoding\functions\syncbn.py", line 13, in <module>
    from .. import lib
  File "E:\PycharmProject\venv\lib\site-packages\encoding\lib\__init__.py", line 14, in <module>
    ], build_directory=cpu_path, verbose=False)
  File "E:\PycharmProject\venv\lib\site-packages\torch\utils\cpp_extension.py", line 645, in load
    is_python_module)
  File "E:\PycharmProject\venv\lib\site-packages\torch\utils\cpp_extension.py", line 814, in _jit_compile
    with_cuda=with_cuda)
  File "E:\PycharmProject\venv\lib\site-packages\torch\utils\cpp_extension.py", line 837, in _write_ninja_file_and_build
    verify_ninja_availability()
  File "E:\PycharmProject\venv\lib\site-packages\torch\utils\cpp_extension.py", line 875, in verify_ninja_availability
    raise RuntimeError("Ninja is required to load C++ extensions")
RuntimeError: Ninja is required to load C++ extensions