lzccccc / SMOKE

SMOKE: Single-Stage Monocular 3D Object Detection via Keypoint Estimation
MIT License
696 stars · 177 forks

Run SMOKE on cuda 11.x environment! #56

Open yhchang1120 opened 3 years ago

yhchang1120 commented 3 years ago

I struggled to run the SMOKE model in a CUDA 11.x environment for the last 4 months and finally found a simple solution.

step 1. Visit https://github.com/jinfagang/DCNv2_latest (thank you, jinfagang).
step 2. Download all files and folders in its src/ folder.
step 3. Go back to the SMOKE directory and replace all files in csrc/ with the files from step 2.
step 4. Rebuild:

> python setup.py build develop

Now you can run the SMOKE model in a CUDA 11.x environment.

This method was tested in the following environment:
- GPU: Quadro RTX 6000
- driver version: 460.73.01
- CUDA version: 11.0
- PyTorch: 1.8.0

In the future I'm planning to test on a GeForce RTX 3060. If someone has tested on the RTX 30 series, please share your results!
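The four steps above boil down to overwriting SMOKE's csrc/ with DCNv2_latest's src/ and rebuilding. A minimal sketch of the copy step, with temp dirs standing in for the two repos (the paths and the placeholder file are illustrative, not the real repo layout):

```shell
# DCN and SMOKE are placeholders: in practice they are your DCNv2_latest
# download and your SMOKE checkout.
DCN=$(mktemp -d)
SMOKE=$(mktemp -d)
mkdir -p "$DCN/src" "$SMOKE/csrc"
echo "// stand-in for the CUDA 11-compatible sources" > "$DCN/src/dcn_v2.h"

# step 3: replace everything in csrc/ with the contents of src/
cp -r "$DCN/src/." "$SMOKE/csrc/"
ls "$SMOKE/csrc"

# step 4 (run inside the real SMOKE checkout): python setup.py build develop
```

The trailing `/.` on the source path makes `cp -r` copy the folder's contents rather than nesting a `src/` directory inside `csrc/`.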
syKevinPeng commented 2 years ago

Hey, I followed your solution and got the following error on RTX 3090:

subprocess.CalledProcessError: Command '['which', 'g++']' returned non-zero exit status 1.

GPU: RTX 3090 Driver version: 510.06 cuda version: 11.4 pytorch: 1.8.1

Entire error message: image

syKevinPeng commented 2 years ago

> Hey, I followed your solution and got the following error on RTX 3090:
>
> subprocess.CalledProcessError: Command '['which', 'g++']' returned non-zero exit status 1.
>
> GPU: RTX 3090 Driver version: 510.06 cuda version: 11.4 pytorch: 1.8.1

Oh nvm. I'm using Docker and I forgot to install gcc.
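For anyone else hitting this in a slim Docker image: the build shells out to `which g++`, so the CalledProcessError only means no C++ compiler is installed. A quick sketch of the same check (the apt-get fix is shown as a comment and assumes a Debian/Ubuntu base image):

```shell
# 'which g++' is what the build invokes; a non-zero exit means no compiler.
# Fix on Debian/Ubuntu images: apt-get update && apt-get install -y build-essential
if which g++ >/dev/null 2>&1; then
  echo "g++ found: $(which g++)"
else
  echo "g++ missing: install build-essential (or gcc/g++) before rebuilding"
fi
```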

bijonguha commented 2 years ago

Worked for me in Google Colab.

Amazingmum commented 2 years ago

```
running build
running build_ext
/home/awguitar/.local/lib/python3.8/site-packages/torch/utils/cpp_extension.py:387: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
  warnings.warn(msg.format('we could not find ninja.'))
Traceback (most recent call last):
  File "setup.py", line 61, in <module>
    setup(
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 144, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python3.8/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib/python3.8/distutils/dist.py", line 966, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python3.8/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3.8/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/usr/lib/python3.8/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.8/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 87, in run
    _build_ext.run(self)
  File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 186, in run
    _build_ext.build_ext.run(self)
  File "/usr/lib/python3.8/distutils/command/build_ext.py", line 340, in run
    self.build_extensions()
  File "/home/awguitar/.local/lib/python3.8/site-packages/torch/utils/cpp_extension.py", line 410, in build_extensions
    self._check_cuda_version()
  File "/home/awguitar/.local/lib/python3.8/site-packages/torch/utils/cpp_extension.py", line 787, in _check_cuda_version
    raise RuntimeError(CUDA_MISMATCH_MESSAGE.format(cuda_str_version, torch.version.cuda))
RuntimeError: The detected CUDA version (11.4) mismatches the version that was used to compile PyTorch (10.2). Please make sure to use the same CUDA versions.
```

Environment: RTX 3070, CUDA 11.4, torch 1.11, Python 3.8, Ubuntu 20.04. I got this problem.

yhchang1120 commented 2 years ago

Hi @liukr001, your PyTorch build was compiled against CUDA 10.2, which mismatches your local CUDA 11.4 toolkit. I found a similar issue; please refer to this link: https://github.com/rusty1s/pytorch_sparse/issues/189#issuecomment-1000240508
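The check that raised the RuntimeError just compares two version strings: the CUDA version PyTorch was compiled with against the local toolkit's. A simplified sketch of that comparison (versions are hard-coded here; in a real environment take them from `python -c "import torch; print(torch.version.cuda)"` and the "release" line of `nvcc --version`, and note torch's actual check is stricter than a major-version match):

```shell
torch_cuda="10.2"   # what PyTorch was compiled with (torch.version.cuda)
local_cuda="11.4"   # what nvcc reports on this machine

# ${var%%.*} strips everything after the first dot, i.e. the major version.
if [ "${torch_cuda%%.*}" != "${local_cuda%%.*}" ]; then
  echo "mismatch: reinstall a PyTorch wheel built for CUDA ${local_cuda}"
else
  echo "ok"
fi
```

With the values above this prints the mismatch line, matching the error in the traceback; the fix is to install a PyTorch wheel built for your local CUDA major version rather than downgrading the toolkit.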

kaggar11 commented 2 years ago

@yhchang1120 What results did you get in your environment?

baoga1124 commented 1 year ago

> Worked for me in google colab

Can you share your Colab notebook with me, please?