peteanderson80 / Matterport3DSimulator

AI Research Platform for Reinforcement Learning from Real Panoramic Images.

Unable to downsize the images using downsize_skybox.py #85

Closed wdqin closed 3 years ago

wdqin commented 3 years ago

Hi all,

I am using the Docker container method to preprocess the images and run the unit tests with the simulator. Following the advice from #83, I unzipped all the .zip files in v1/scans. Then, inside the Docker container, I ran the following:

cd /root/mount/Matterport3DSimulator
mkdir build && cd build
cmake -DEGL_RENDERING=ON ..
make
cd ../

Everything builds without errors, and I then run the preprocessing step first:

./scripts/downsize_skybox.py

This fails with the following error:


root@4b0e885510fc:~/mount/Matterport3DSimulator# ./scripts/downsize_skybox.py
Processing scan 8194nk5LbLH with 20 panoramas
Processing scan 2t7WUuJeko7 with 37 panoramas
Processing scan gYvKGZ5eRqb with 38 panoramas
Processing scan 8WUmhLawc2A with 95 panoramas
Processing scan HxpKQynjfin with 44 panoramas
Processing scan jh4fc5c5qoQ with 48 panoramas
Processing scan 759xd9YjKW5 with 61 panoramas
Processing scan 17DRP5sb8fy with 48 panoramas
Processing scan cV4RVeZvu5T with 91 panoramas
Processing scan EDJbREhghzL with 71 panoramas
Processing scan 1pXnuDYAj8r with 95 panoramas
Processing scan 5q7pvUzZiYa with 89 panoramas
Processing scan JeFG25nYj2p with 92 panoramas
Processing scan ac26ZMwG7aT with 171 panoramas
Processing scan fzynW3qQPVF with 167 panoramas
Processing scan jtcxE69GiFV with 148 panoramas
Processing scan 2azQ1b91cZZ with 215 panoramas
Processing scan gTV8FGcVJC9 with 231 panoramas
Processing scan E9uDoFAP3SH with 217 panoramas
Processing scan D7N2EKCX4Sj with 267 panoramas
Processing scan mJXqzFtmKg4 with 190 panoramas
Processing scan B6ByNegPMKs with 349 panoramas
Processing scan pLe4wQe7qrG with 31 panoramas
Processing scan p5wJjkQkbXX with 155 panoramas
Processing scan pRbA3pwrgk9 with 125 panoramas
Processing scan qoiz87JEwZ2 with 133 panoramas
Processing scan rPc6DW4iMge with 119 panoramas
Processing scan rqfALeAoiTq with 102 panoramas
Processing scan r1Q1Z4BcV1o with 116 panoramas
Processing scan TbHJrupSAjP with 116 panoramas
Processing scan S9hNv5qa7GM with 110 panoramas
Processing scan PX4nDJXEHrG with 300 panoramas
Processing scan UwV83HsGsw3 with 125 panoramas
Processing scan SN83YJsR3w2 with 266 panoramas
Processing scan VVfe2KiqLaN with 61 panoramas
Processing scan VLzqgDo317F with 126 panoramas
Processing scan X7HyMhZNoso with 84 panoramas
Processing scan uNb9QFRL6hY with 221 panoramas
Processing scan YmJkqBEsHnH with 11 panoramas
Processing scan V2XKFyX4ASd with 157 panoramas
Processing scan YVUC4YcDtcY with 46 panoramas
Processing scan XcA2TqTSSAj with 109 panoramas
Processing scan wc2JMjhGNzB with 176 panoramas
Processing scan vyrNrziPKCB with 342 panoramas
Processing scan ZMojNkEp431 with 77 panoramas
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "./scripts/downsize_skybox.py", line 44, in downsizeWithMerge
    assert cv2.imwrite(skybox_merge_template % (base_dir,scan,pano), newimg)
AssertionError
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "./scripts/downsize_skybox.py", line 71, in <module>
    p.map(downsizeWithMerge, scans)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 266, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 644, in get
    raise self._value
AssertionError
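For context on the failing line: `cv2.imwrite` signals failure by returning `False` rather than raising an exception, which is why downsize_skybox.py wraps it in an `assert`. A minimal stand-in (not the real OpenCV code) that mimics this behavior for an unwritable or missing target path:

```python
import os
import tempfile

def imwrite(path, data):
    """Hypothetical stand-in for cv2.imwrite: returns False on failure
    instead of raising, e.g. when the target directory cannot be written."""
    try:
        with open(path, "wb") as f:
            f.write(data)
        return True
    except OSError:
        return False

# A path under a directory that does not exist (or is read-only) fails:
print(imwrite("/nonexistent_dir_xyz/out.jpg", b"\xff\xd8"))  # False
# A writable location succeeds:
print(imwrite(os.path.join(tempfile.mkdtemp(), "out.jpg"), b"\xff\xd8"))  # True
```

So an `AssertionError` on `assert cv2.imwrite(...)` means the write itself failed, not that the image data was malformed.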


Has anyone run into similar issues when preprocessing the images? Or could anyone point out where I should look?

Thank you!

wdqin commented 3 years ago

I think I figured out the problem: the order of the instructions is not precise enough, so if you simply follow the steps as written, you will likely not pass all the test cases in "./build/tests ~Timing" using the Docker method described.

TL;DR: to solve the issue described above,

  1. Create the container with "nvidia-docker run -it --mount type=bind,source=$MATTERPORT_DATA_DIR,target=/root/mount/Matterport3DSimulator/data/v1/scans --volume $(pwd):/root/mount/Matterport3DSimulator mattersim:9.2-devel-ubuntu18.04" instead, i.e. remove the ",readonly" from the original command.
  2. If you are stuck on "./build/tests ~Timing" failing, first unzip the data inside the folders under .../v1/scans/ (a short shell script will do), as suggested in #83, then run the downsize script to preprocess the images before running "./build/tests ~Timing".
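Step 2's unzip pass can be sketched with the standard library. The directory layout here is an assumption: check one archive's internal paths first (e.g. with `unzip -l`), since some Matterport zips already contain the scan id inside the archive, in which case the extraction target needs adjusting.

```python
# Sketch, assuming each scan folder under scans_dir holds its own .zip archives.
import zipfile
from pathlib import Path

def unzip_scans(scans_dir):
    """Extract every *.zip sitting inside each scan folder, in place."""
    for archive in Path(scans_dir).glob("*/*.zip"):
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(archive.parent)

# Example usage (path is an example, not the canonical location):
# unzip_scans("data/v1/scans")
```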

To be more precise about what is going on, there are several problems with the current instructions. The first is that "./build/tests ~Timing" requires the down-scaled images rather than the originals, so simply running "./build/tests ~Timing" after building the simulator in Docker leads to failed test cases like #74.

The second problem is that you cannot simply run "./scripts/downsize_skybox.py" either inside or outside the Docker container. If you run it outside the container, MatterSim has not been built yet (it is built inside the container), so the downsize script fails. If you run it inside the container, there are two further issues. First, the folders in v1/scans only contain the zipped data, so you need to unzip them before there are any images to down-scale. Second, the container command given in the instructions is: "nvidia-docker run -it --mount type=bind,source=$MATTERPORT_DATA_DIR,target=/root/mount/Matterport3DSimulator/data/v1/scans,readonly --volume $(pwd):/root/mount/Matterport3DSimulator mattersim:9.2-devel-ubuntu18.04"

As you can see, ".../v1/scans" is mounted read-only, presumably because this command was originally intended only for testing the simulator, not for preprocessing the images (which requires write access, not just read access). If you use this command to create the container, you will hit the error described above when running the downscale script: it is not permitted to write the preprocessed images into the scans/ folder. The right way to create the container for preprocessing is therefore:

"nvidia-docker run -it --mount type=bind,source=$MATTERPORT_DATA_DIR,target=/root/mount/Matterport3DSimulator/data/v1/scans --volume $(pwd):/root/mount/Matterport3DSimulator mattersim:9.2-devel-ubuntu18.04"

and then run the preprocessing script. With all the images preprocessed, run "./build/tests ~Timing" and all the test cases should pass. I understand this may seem like a trivial issue, and anyone already familiar with Docker should be able to work it out in no time, but I think it is still valuable for people who are less familiar with the data and with Docker containers, to save them time fixing these issues. I am closing this issue as it is solved.