PRBonn / make_it_dense

Make it Dense: Self-Supervised Geometric Scan Completion of Sparse 3D LiDAR Scans in Large Outdoor Environments
https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/vizzo2022ral-iros.pdf

No to_python (by-value) converter found for C++ type in vdbfusion #2

Closed Gaozhongpai closed 2 years ago

Gaozhongpai commented 2 years ago

Thanks for this project. I tried to train on KITTI and encountered the issue below:

Screenshot from 2022-08-26 10-34-34

I have installed pyopenvdb as follows Screenshot from 2022-08-26 10-59-56
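A quick way to check whether the interpreter that runs the training scripts can actually see pyopenvdb is a small stdlib probe (a hedged sketch; it only looks the module up by name and does not import it):

```python
import importlib.util

def locate(module_name):
    """Return the file a module would be imported from, or None if not found."""
    spec = importlib.util.find_spec(module_name)
    return spec.origin if spec is not None else None

# If this prints None, pyopenvdb is not on sys.path for this interpreter,
# which is the situation vdbfusion's pyopenvdb support check trips over.
print("pyopenvdb:", locate("pyopenvdb"))
```

Running this with each interpreter involved (the conda env's python, the system python3, the one pip uses at build time) quickly exposes mismatches between them.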

Gaozhongpai commented 2 years ago

I modified this line in vdbfusion's CMake: find_package(Boost COMPONENTS python Required) to find_package(Boost COMPONENTS python REQUIRED), then ran python setup.py install.

The above error is gone, but new errors are shown below:

Original TypeError: No to_python (by-value) converter found for C++ type: std::shared_ptr<openvdb::v10_0::Grid<openvdb::v10_0::tree::Tree<openvdb::v10_0::tree::RootNode<openvdb::v10_0::tree::InternalNode<openvdb::v10_0::tree::InternalNode<openvdb::v10_0::tree::LeafNode<float, 3u>, 4u>, 5u> > > > >

(vdb) pai@precision:~/code/make_it_dense$ ./apps/precache.py -s 07
Caching data:   0%|                                                                                                                           | 0/1101 [00:00<?, ? models/s]
Traceback (most recent call last):
  File "/home/pai/code/make_it_dense/./apps/precache.py", line 33, in <module>
    typer.run(precache)
  File "/home/pai/code/make_it_dense/./apps/precache.py", line 28, in precache
    for _ in tqdm(dataloader, desc="Caching data", unit=" models"):
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/tqdm/std.py", line 1195, in __iter__
    for obj in iterable:
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 681, in __next__
    data = self._next_data()
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1376, in _next_data
    return self._process_data(data)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1402, in _process_data
    data.reraise()
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/_utils.py", line 461, in reraise
    raise exception
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original TypeError: No to_python (by-value) converter found for C++ type: std::shared_ptr<openvdb::v10_0::Grid<openvdb::v10_0::tree::Tree<openvdb::v10_0::tree::RootNode<openvdb::v10_0::tree::InternalNode<openvdb::v10_0::tree::InternalNode<openvdb::v10_0::tree::LeafNode<float, 3u>, 4u>, 5u> > > > >

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/make_it_dense/dataset/kitti_vdb_multiseq.py", line 40, in __getitem__
    return self.sequences[sequence_id][scan_idx]
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/make_it_dense/utils/cache.py", line 31, in wrapper
    result = func(*args, **kwargs)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/make_it_dense/dataset/kitti_vdb_sequence.py", line 33, in __getitem__
    return self._get_leaf_node_pairs(self._get_vdb_grids(idx))
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/make_it_dense/dataset/kitti_vdb_sequence.py", line 58, in _get_vdb_grids
    tsdf_volume = VDBVolume(self.voxel_size, self.sdf_trunc)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/vdbfusion-0.1.6-py3.10-linux-x86_64.egg/vdbfusion/pybind/vdb_volume.py", line 26, in __init__
    self.tsdf = self._volume._tsdf
RuntimeError: Caught an unknown exception!

Exception ignored in atexit callback: <function _exit_function at 0x7fec02e327a0>
Traceback (most recent call last):
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/multiprocessing/util.py", line 357, in _exit_function
    p.join()
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/multiprocessing/process.py", line 149, in join
    res = self._popen.wait(timeout)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/multiprocessing/popen_fork.py", line 43, in wait
    return self.poll(os.WNOHANG if timeout == 0.0 else 0)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/multiprocessing/popen_fork.py", line 27, in poll
    pid, sts = os.waitpid(self.pid, flag)
  File "/home/pai/miniconda3/envs/vdb/lib/python3.10/site-packages/torch/utils/data/_utils/signal_handling.py", line 66, in handler
    _error_if_any_worker_fails()
RuntimeError: DataLoader worker (pid 35684) is killed by signal: Terminated. 

The error is from vdbfusion, not from pyopenvdb.

nachovizzo commented 2 years ago

Hello, sorry for the late reply. For some reason, I never got the notification that this issue was open!

It looks like you have a mismatch between the OpenVDB and VDBFusion libraries. All the details are in the README of this repo. Be aware that the entire build process might take some time ;) I guess that starting from scratch is the best option in your case.

Next time, don't hesitate to tag me (@nachovizzo) so I can receive an email ;)

Gaozhongpai commented 2 years ago

Thanks, I solved the problem by running in Docker. The problem was my Python version (3.10); it works with Python 3.8.
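Since the failure turned out to be version dependent (3.10 failing, 3.8 working, as reported above), a fail-fast guard at the top of the training scripts can save a long debugging session. The "known good" version below is only inferred from this thread, not from any official compatibility matrix:

```python
import sys

# Versions reported in this thread: 3.8 works, 3.10 crashes the bindings.
KNOWN_GOOD = (3, 8)

if sys.version_info[:2] != KNOWN_GOOD:
    print(
        f"warning: running on Python {sys.version_info.major}.{sys.version_info.minor}; "
        f"only {KNOWN_GOOD[0]}.{KNOWN_GOOD[1]} is reported to work with these bindings"
    )
```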

jasonCastano commented 1 year ago

Hey @nachovizzo and @Gaozhongpai. First of all, thanks @nachovizzo for this amazing work! I had the same problem as @Gaozhongpai. I even tried setting up make_it_dense in a Docker container, but I am still having the same problem. My Dockerfile looks like this:

FROM nvidia/cuda:11.0.3-base-ubuntu20.04

ENV TERM xterm
ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get update && apt-get install --no-install-recommends -y \
    build-essential \
    ccache \
    clang-format \
    cmake \
    git \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    python3 \
    python3-numpy \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

RUN pip3 install --upgrade pip
RUN pip3 install --upgrade \
    black \
    numpy \
    pytest \
    setuptools \
    twine \
    wheel

RUN apt-get update && apt-get install --no-install-recommends -y \
    libblosc-dev \
    libboost-iostreams-dev \
    libboost-numpy-dev \
    libboost-python-dev \
    libboost-system-dev \
    libeigen3-dev \
    libtbb-dev \
    python3-dev \
    python3-numpy \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    build-essential \
    cmake \
    git \
    python3 \
    python3-numpy \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    libblosc-dev \
    libboost-all-dev \
    libilmbase-dev \
    libsnappy1v5 \
    libtbb-dev \
    zlib1g-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR dependencies

RUN git clone --depth 1 https://github.com/nachovizzo/openvdb.git -b nacho/vdbfusion \
    && cd openvdb \
    && mkdir build && cd build \
    && cmake \
    -DOPENVDB_BUILD_PYTHON_MODULE=ON \
    -DUSE_NUMPY=ON \
    -DPYOPENVDB_INSTALL_DIRECTORY="/usr/local/lib/python3.8/dist-packages" \
    -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
    -DUSE_ZLIB=OFF \
    ..\
    && make -j$(nproc) all install \
    && rm -rf /openvdb \
    && cd .. \
    && cd ..

RUN apt-get update && apt-get install --no-install-recommends -y \
    libgl1 \
    libgomp1 \
    libusb-1.0-0 \
    && rm -rf /var/lib/apt/lists/*

RUN git clone --recurse-submodules https://github.com/PRBonn/manifold_python.git \
  && cd manifold_python \
  && make install\
  && cd ..\
  && rm -rf /manifold

RUN pip3 install --upgrade pip
RUN pip3 install --upgrade \
    black \
    numpy \
    open3d

RUN git clone https://github.com/PRBonn/vdb_to_numpy \
    && cd vdb_to_numpy \
    && git submodule update --init \
    && pip install . \
    && cd ..

RUN git clone https://github.com/PRBonn/vdbfusion.git \
    && cd vdbfusion \
    && git submodule update --init \
    && pip install . \
    && cd ..

RUN cd ..

RUN git clone https://github.com/PRBonn/make_it_dense.git \
    && cd make_it_dense \
    && git submodule update --init \
    && pip install . \
    && cd ..

RUN pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113

@Gaozhongpai, could you give me some insight into whether the Dockerfile is correct? I would really appreciate it. Thanks!

nachovizzo commented 1 year ago

@jasonCastano could you post the error message here?

NOTE: You can use Markdown syntax to make it more readable ;)

jasonCastano commented 1 year ago

Thanks for your reply @nachovizzo, and for the suggestion about Markdown syntax. I have made some changes to the Dockerfile and now it looks like this:

FROM nvidia/cuda:11.0.3-base-ubuntu20.04

# Install Python apt dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \
    python3 \
    python3-numpy \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

# force python 3 to be default
RUN update-alternatives --install /usr/bin/python python /usr/bin/python3.8 1\
    && update-alternatives --install /usr/bin/pip pip /usr/bin/pip3 1
RUN pip3 install --upgrade pip requests setuptools pipenv

# setup environment
ENV TERM xterm
ENV DEBIAN_FRONTEND=noninteractive

# Install essentials
RUN apt-get update && apt-get install --no-install-recommends -y \
    build-essential \
    ccache \
    clang-format \
    cmake \
    git \
    && rm -rf /var/lib/apt/lists/*

# Install Python pip dependencies
RUN pip3 install --upgrade pip
RUN pip3 install --upgrade \
    black \
    numpy \
    pytest \
    setuptools \
    twine \
    wheel

# Install C++ Dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \
    libblosc-dev \
    libboost-iostreams-dev \
    libboost-numpy-dev \
    libboost-python-dev \
    libboost-system-dev \
    libeigen3-dev \
    libtbb-dev \
    python3-dev \
    python3-numpy \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    build-essential \
    cmake \
    git \
    python3 \
    python3-numpy \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    libblosc-dev \
    libboost-all-dev \
    libilmbase-dev \
    libsnappy1v5 \
    libtbb-dev \
    zlib1g-dev \
    && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install --no-install-recommends -y \
    libgl1 \
    libgomp1 \
    libusb-1.0-0 \
    && rm -rf /var/lib/apt/lists/*

RUN pip3 install --upgrade pip
RUN pip3 install --upgrade \
    black \
    numpy \
    open3d

WORKDIR make_it_dense_ws

RUN git clone https://github.com/nachovizzo/openvdb.git -b nacho/fix_background_inactive \
  && cd openvdb \
  && mkdir build && cd build \
  && cmake -DOPENVDB_BUILD_PYTHON_MODULE=ON -DUSE_NUMPY=ON .. \
  && make -j$(nproc) all install 

RUN git clone --recurse-submodules https://github.com/PRBonn/manifold_python.git \
  && cd manifold_python \
  && git submodule update --init \
  && make install \
  && cd ..\
  && rm -rf /manifold

RUN git clone https://github.com/PRBonn/vdb_to_numpy \
    && cd vdb_to_numpy \
    && git submodule update --init \
    && python setup.py install \
    && pip install . \
    && cd ..

COPY vdbfusion vdbfusion

COPY openvdb vdbfusion/openvdb

RUN cd vdbfusion/openvdb \
    && git submodule update --init \
    && mkdir build && cd build \
    && cmake \
    -DOPENVDB_BUILD_PYTHON_MODULE=ON \
    -DUSE_NUMPY=ON \
    -DPYOPENVDB_INSTALL_DIRECTORY="/usr/local/lib/python3.8/dist-packages" \
    -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
    -DUSE_ZLIB=OFF \
    ..\
    && make -j$(nproc) all install \
    && cd ../.. \
    && rm -rf /openvdb \
    && git submodule update --init \
    && pip install . \                                                         
    && mkdir -p build && cd build && cmake ..\
    && make install \
    && cd .. \
    && python setup.py install

#RUN git clone https://github.com/PRBonn/vdbfusion.git \
#    && cd vdbfusion \
#    && git submodule update --init \
#    && pip install . \
#    && cd ..

RUN cd

RUN git clone https://github.com/PRBonn/make_it_dense.git \
    && cd make_it_dense \
    && git submodule update --init \
    && pip install . \
    && cd ..

RUN pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113

When I try to run the precache line

./apps/precache.py -s 07

I get the following error:

AttributeError: Caught AttributeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 58, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 58, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.8/dist-packages/make_it_dense/dataset/kitti_vdb_multiseq.py", line 40, in __getitem__
    return self.sequences[sequence_id][scan_idx]
  File "/usr/local/lib/python3.8/dist-packages/make_it_dense/utils/cache.py", line 31, in wrapper
    result = func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/make_it_dense/dataset/kitti_vdb_sequence.py", line 33, in __getitem__
    return self._get_leaf_node_pairs(self._get_vdb_grids(idx))
  File "/usr/local/lib/python3.8/dist-packages/make_it_dense/dataset/kitti_vdb_sequence.py", line 61, in _get_vdb_grids
    vdb_grids[f"in_tsdf_{voxel_size_cm}"] = normalize_grid(tsdf_volume.tsdf)
AttributeError: 'VDBVolume' object has no attribute 'tsdf'

When I try to run a single scan test I get this output:

Screenshot from 2022-11-02 12-12-38

I get this output when I check for the pyopenvdb support in vdbfusion

Screenshot from 2022-11-02 12-08-08

nachovizzo commented 1 year ago

Ok, it's amazing that you managed to nail it down this far. The problem is that you don't have the pyopenvdb Python library installed. Please check #4 and let me know how it goes.

jasonCastano commented 1 year ago

@nachovizzo I have checked #4 and noticed that it is necessary to enable the OPENVDB_BUILD_PYTHON_MODULE flag when compiling openvdb. Also, to compile vdbfusion I activated the BUILD_PYTHON_BINDINGS flag, which is on line 33 of this CMakeLists. With the above I obtain the following information when compiling:

Screenshot from 2022-11-03 15-26-51

The PYOPENVDB_SUPPORT flag is enabled; according to this CMakeLists it is only set if the pyopenvdb module can be imported. However, I am still having the same problems:

Screenshot from 2022-11-09 09-46-42

When I try to run make_it_dense I still get the same error: AttributeError: 'VDBVolume' object has no attribute 'tsdf'
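The state of the bindings can be confirmed programmatically: judging by the traceback, VDBVolume only gets a tsdf attribute when the compiled bindings expose it. A hedged probe using names that appear elsewhere in this thread (everything is guarded, so the snippet simply reports False in an environment without vdbfusion):

```python
def pyopenvdb_support_enabled() -> bool:
    """Best-effort check whether vdbfusion's bindings were built with pyopenvdb support."""
    try:
        # Module and attribute names taken from this thread's diagnostics.
        from vdbfusion.pybind import vdbfusion_pybind
    except ImportError:
        return False  # vdbfusion itself is not installed in this environment
    return bool(getattr(vdbfusion_pybind._VDBVolume, "PYOPENVDB_SUPPORT_ENABLED", False))

print("pyopenvdb support:", pyopenvdb_support_enabled())
```

If this prints False in the same environment where the precache script runs, the AttributeError is expected.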

nachovizzo commented 1 year ago

@jasonCastano sorry for the delay, wow, this is strange.

Although your output made me find a "typo" in the CMakeLists.txt, now fixed in 0704860c7c.

I'm testing your docker now to see if I can reproduce the problem ;)

nachovizzo commented 1 year ago

@jasonCastano, @Gaozhongpai, @amwfarid. I found the problem, at least the one reported by @jasonCastano.

For whatever reason, /usr/local (where pyopenvdb is located) is being excluded when running inside the cmake environment. It might be because the cmake version inside the docker is a bit old (3.16), but there is an easy fix.

If you add this in any part of the cmake script:

find_package(Python COMPONENTS Interpreter)
execute_process(COMMAND ${PYTHON_EXECUTABLE} "-c" "import sys; print(sys.path)")

Then the output would be:

['', '/tmp/pip-build-env-swzj7s57/site', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/tmp/pip-build-env-swzj7s57/overlay/lib/python3.8/site-packages', '/tmp/pip-build-env-swzj7s57/normal/lib/python3.8/site-packages']

Which clearly does not include /usr/local/lib. On the other hand, if you do:

 python3 -c 'import sys;print(sys.path)'
['', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']

Solution

So something is quite strange, BUT, if you add the -I flag (isolated mode; see the Python command-line docs for details)

execute_process(COMMAND ${PYTHON_EXECUTABLE} "-I" "-c" "import sys; print(sys.path)")

Then the cmake can find pyopenvdb:

['/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']

And everyone is happy:

$ python3 -c 'from vdbfusion.pybind import vdbfusion_pybind; print(vdbfusion_pybind._VDBVolume.PYOPENVDB_SUPPORT_ENABLED)'
True

Sadly this would break the local build, so you would need to add it in the Dockerfiles, conda environments, etc. It looks like it's something related to running the whole installation as root, so using a normal user inside the docker might also help.
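The sys.path difference that -I makes can also be reproduced outside cmake. A minimal sketch that spawns the current interpreter with and without isolated mode and diffs the two paths:

```python
import json
import subprocess
import sys

def interpreter_sys_path(isolated: bool):
    """sys.path of a freshly spawned interpreter, optionally run with -I."""
    cmd = [sys.executable] + (["-I"] if isolated else [])
    cmd += ["-c", "import json, sys; print(json.dumps(sys.path))"]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

normal = interpreter_sys_path(isolated=False)
isolated = interpreter_sys_path(isolated=True)

# -I ignores PYTHONPATH and drops the current-directory entry, which is why a
# pip build environment and a plain shell can end up seeing different packages.
print("only without -I:", [p for p in normal if p not in isolated])
```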

diff --git a/src/vdbfusion/pybind/CMakeLists.txt b/src/vdbfusion/pybind/CMakeLists.txt
index 77514f8..a176a63 100644
--- a/src/vdbfusion/pybind/CMakeLists.txt
+++ b/src/vdbfusion/pybind/CMakeLists.txt
@@ -27,7 +27,7 @@ target_link_libraries(vdbfusion_pybind PRIVATE VDBFusion::vdbfusion)
 # PYOPENVDB_SUPPORT is defined only by the existence of the pyopenvdb library.
 find_package(Python COMPONENTS Interpreter)
 execute_process(COMMAND
-                ${PYTHON_EXECUTABLE} "-c" "import pyopenvdb; print(True)"
+                ${PYTHON_EXECUTABLE} "-I" "-c" "import pyopenvdb; print(True)"
                 OUTPUT_VARIABLE PYOPENVDB_SUPPORT
                 ERROR_QUIET)
 if(PYOPENVDB_SUPPORT)