CAOR-MINES-ParisTech / colibri-vr-unity-package

This is the Unity package for COLIBRI VR, the Core Open Lab on Image-Based Rendering Innovation for Virtual Reality.
https://caor-mines-paristech.github.io/colibri-vr

Very interesting system! Some small findings and a question #1

Open lexvandersluijs opened 4 years ago

lexvandersluijs commented 4 years ago

Hi,

Thanks for the great work and making this available! I've been looking into lightfield technology and how it can be used to capture and play back 3D scenes (in VR, AR, regular 3D) using readily available technology, and this looks like a great toolbox for that! And it's even integrated into Unity already, wow.

Here are a few small things I had to do to get it up and running on my computer, and at the end a question on the usage:

  1. In PerViewMeshesQSTR.compute I replaced the single dot at the start of the cginc include paths with a double dot, otherwise the files couldn't be found:

    #include "../../CGIncludes/CoreCG.cginc"
    #include "../../CGIncludes/CameraCG.cginc"
    #include "../../CGIncludes/ColorCG.cginc"
  2. When I ran a sparse reconstruction the first time, it completed very quickly and then said it was successful, but actually there were no results in the sparse folder. I ran the command line from the Console in a command prompt and found that Colmap was missing glew32.dll. So I ended up copying the files in the Colmap 'lib' folder into the 'bin' folder, and that solved it :-) (a sketch of this is included after this list).

  3. After computing a reconstruction of the 'door' dataset, I was a bit surprised to see this message: "COLMAP camera type SIMPLE_RADIAL is not currently supported by COLIBRI VR." A suggestion: indicate the currently unsupported camera models (e.g. between brackets) in the Colmap Editor panel, or something like that.

  4. I ran the reconstruction again, with the undistorted images and PINHOLE camera model, and got a good result I think. Just a question: when I viewed it using the Rendering script, I could not see a result larger than in the attached image (https://ibb.co/6Ftcj24), by varying the Focal length and Max blend angle parameters. The list of Blending methods is also smaller than in the tutorial video, maybe because the downloadable unitypackage is a bit older than the development version?
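
For reference, a minimal sketch of the workaround from point 2, assuming a standard COLMAP Windows layout with sibling 'bin' and 'lib' folders; the class and method names are only illustrative, not part of COLIBRI VR:

    // Illustrative one-off helper: copy COLMAP's lib DLLs next to colmap.exe
    // so that dependencies such as glew32.dll are found when launching it directly.
    using System.IO;

    public static class ColmapDllCopy
    {
        public static void CopyLibToBin(string colmapRootDir)
        {
            string libDir = Path.Combine(colmapRootDir, "lib");
            string binDir = Path.Combine(colmapRootDir, "bin");
            foreach (string dllPath in Directory.GetFiles(libDir, "*.dll"))
                File.Copy(dllPath, Path.Combine(binDir, Path.GetFileName(dllPath)), true);
        }
    }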

Looking forward to doing additional experiments with it (e.g. to export to a VR experience)!

Lex

DinechinGreg commented 4 years ago

Hi,

Thank you very much for your kind message, and for having tried out the toolkit! It's still in an early stage, but we hope to make it more robust in the coming months.

  1. You are absolutely right, sorry about that! I just uploaded your fix to the GitHub. I had put the "PerViewMeshesQSTR.compute" file in a "Resources" subfolder two days ago and had forgotten to update the relative paths. Thanks for notifying me! :)

  2. Ah, interesting. I haven't had this issue on my machines yet, but it's good to know, and thank you for the fix. Just to make sure I understood the issue correctly: you ran the reconstruction from within COLIBRI VR in Unity, it failed because it was missing a dll, but in the Unity console it indicated that it was successful and didn't raise an error? I have to try to reproduce this, to at least be able to raise a helpful error in the Unity console when it occurs.

  3. The file structure when using the helper script for COLMAP should be:

    • Before reconstruction: Data / Dataset / images. The "Source data" folder is Data / Dataset.
    • After sparse reconstruction: Data / Dataset / sparse (this contains any COLMAP camera, e.g. SIMPLE_RADIAL), but also Data / Dataset / dense / 0 / sparse (this contains the undistorted camera, which should be PINHOLE). The Data / Dataset / dense / 0 folder thus becomes the main folder for the dense reconstruction step. So after sparse reconstruction, the "Source data" folder of the Processing component should automatically be changed to Data / Dataset / dense / 0. If you get the "COLMAP camera not currently supported" error message, it probably means that this was not the case, i.e. that it was still looking at the Data / Dataset folder (in which there was indeed a SIMPLE_RADIAL camera, instead of the expected PINHOLE). So I have to fix this (a rough sketch of this kind of check is included after this list). Thanks again!
  4. To be sure I understand correctly: you ran the sparse reconstruction using COLMAP, to get the 3D camera setup; did you also run the dense reconstruction? If only sparse reconstruction: only the "focal surface" rendering methods should be available (no depth data or 3D mesh, so indeed fewer than in the video tutorial). If dense reconstruction: the generated .ply mesh has to be converted to the .obj format to be readable by Unity, which you can do e.g. with Blender (for instance using the dedicated helper class), and then more processing/rendering methods will become available (not the depth map ones yet though, I still have to work on importing the .bin depth maps generated by COLMAP during dense reconstruction). Concerning the focal length: did modifying the slider affect the output view at all? If not, you may want to update the package from GitHub, as I added an important fix on this point yesterday (it was indeed broken before). If it did affect the output view but the images were still too small even at the largest focal length, you can increase the maximum focal length by changing the value of the "Focal bounds" parameter.
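
For points 2 and 3, a rough sketch of what such a post-reconstruction check could look like, assuming the folder layout described above (Data / Dataset / sparse and Data / Dataset / dense / 0 / sparse / cameras.txt); the class and method names are only illustrative, not COLIBRI VR's actual API:

    // Illustrative sanity check after running COLMAP from Unity: verify that the
    // sparse reconstruction produced output, and that the undistorted camera in
    // dense/0/sparse is of the expected PINHOLE type, logging errors otherwise.
    using System.IO;
    using System.Linq;
    using UnityEngine;

    public static class ColmapOutputCheck
    {
        public static bool Validate(string datasetDir)
        {
            string sparseDir = Path.Combine(datasetDir, "sparse");
            if (!Directory.Exists(sparseDir) || !Directory.EnumerateFileSystemEntries(sparseDir).Any())
            {
                Debug.LogError("COLMAP sparse reconstruction produced no output in: " + sparseDir);
                return false;
            }
            string camerasTxt = Path.Combine(datasetDir, "dense", "0", "sparse", "cameras.txt");
            if (File.Exists(camerasTxt) && !File.ReadLines(camerasTxt).Any(line => line.Contains("PINHOLE")))
            {
                Debug.LogError("Expected an undistorted PINHOLE camera in: " + camerasTxt);
                return false;
            }
            return true;
        }
    }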

Thank you so much for all of this feedback! If you're interested in experimenting further, keep an eye out for the YouTube tutorials, which should be coming in the next few days.

Best regards,

Greg


lexvandersluijs commented 4 years ago

Hi Greg, thanks for your fast reply!

Re 2: yes, that is correct. Perhaps you have the 'lib' folder of Colmap in your path? I just had a look to see how it's possible that Colmap itself works. Turns out that in COLMAP.BAT the 'lib' folder is added to the path on the fly:

set SCRIPT_PATH=%~dp0
set PATH=%SCRIPT_PATH%\lib;%PATH%
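
If it helps, the same trick could presumably also be applied on the Unity side when launching colmap.exe, by prepending the 'lib' folder to the PATH of the child process; a rough sketch (colmapExePath is assumed to point at bin/colmap.exe, and this is not necessarily how COLIBRI VR launches the executable):

    // Illustrative launcher: run colmap.exe with COLMAP's 'lib' folder prepended
    // to PATH, mirroring what COLMAP.BAT does, so glew32.dll is found without
    // having to copy DLLs into 'bin'.
    using System.Diagnostics;
    using System.IO;

    public static class ColmapLauncher
    {
        public static void Run(string colmapExePath, string arguments)
        {
            string binDir = Path.GetDirectoryName(colmapExePath);
            string libDir = Path.Combine(Path.GetDirectoryName(binDir), "lib");
            var startInfo = new ProcessStartInfo(colmapExePath, arguments)
            {
                UseShellExecute = false, // required to modify the child's environment
                CreateNoWindow = true
            };
            startInfo.EnvironmentVariables["PATH"] = libDir + ";" + startInfo.EnvironmentVariables["PATH"];
            using (var process = Process.Start(startInfo))
                process.WaitForExit();
        }
    }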

Re 3: Aha, it's good to know that the system automatically changes that folder. I may have changed this to another folder myself while trying to understand how to get everything running. The remark about seeing small squares is based on the situation where the folder is set correctly (i.e. \dense\0), so I must have changed it back. Thanks for the tip.

The folder structure you describe is indeed created, and in the sparse folder there is a cameras.txt which indeed has a PINHOLE camera. The dense reconstruction was also generated, and I am able to view both the PLY and OBJ files in MeshLab. After creating the dense reconstruction using Colmap, I executed all the follow-up commands.

Then, in Data processing, I check the top two (available) checkmarks.

In the Rendering script I then have three options.

Re 4: Yes, when I use the focal length slider or the Max. blend angle slider, the size of the circle inside the black square changes. I can also modify the focal bounds, but this does not have a clearly visible effect.

A small hypothesis I have is that your pipeline works fully when using synthetic images, but that maybe there is a bug when using photographs? As you say, you will most likely see it when recording the last tutorial(s), so I will be on the lookout for them! :-)

When I have some results of my own I will be happy to share them.

All the best, and thanks again, Lex

DinechinGreg commented 4 years ago

Hi again Lex,

The tutorial videos are finally online! Sorry this took so long, my workload has kept me quite busy of late. I hope that they can be of use!

Making the videos has helped me notice and repair quite a few bugs in the processing pipeline, so it should now run more smoothly. There are also a few improvements (e.g. the recovered 3D mesh now appears as a preview in the Scene view during processing).

Thanks again for your feedback! All the best, Greg