facebookarchive / Surround360

Surround360 is Facebook's open source hardware and software for capturing stereoscopic 3D 360 video for VR. The repo contains hardware designs, as well as software for camera control and rendering.

Geometrical Artifacts in Video #241

Open smdr2670 opened 6 years ago

smdr2670 commented 6 years ago

Hi, in order to get an artifact-free video from my recorded footage, I have a few questions:

We shot a scene in which a person walks towards and around the camera at a distance of 1 to 1.5 meters. In some frames, parts of the person disappear (it looks as if some image columns of the person vanish), or heavy distortions appear on the person's head (the point closest to the cameras).

How does the minimum distance of an object to the ODS system affect the stitching? Is it the optical flow that fails because the objects are too close, resulting in large horizontal disparity? Have you observed something similar?

Also, in the rendering pipeline, when I stack the optical flow fields horizontally and visualize them (e.g. with the Middlebury color coding or with normalized vertical disparity), it is clearly visible that the image consists of 14 vertical stripes.
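For context, this is roughly how I am producing the stacked visualization (a simplified Python/OpenCV sketch, not the pipeline code; the HSV mapping only approximates the Middlebury color wheel, and the variable names are mine):

```python
# Simplified sketch: stack the per-camera flow fields side by side and
# color-code them, roughly in the Middlebury style (hue = direction,
# value = magnitude). Each flow field is an H x W x 2 float array (dx, dy).
import cv2
import numpy as np

def colorize_flow(flow):
    """Map a dense flow field to a BGR image: hue = direction, value = magnitude."""
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hsv = np.zeros((*flow.shape[:2], 3), dtype=np.uint8)
    hsv[..., 0] = ang * 180 / np.pi / 2  # hue in [0, 180)
    hsv[..., 1] = 255
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

# flows = [flow_cam0, flow_cam1, ..., flow_cam13]  # one field per side-camera pair
# stacked = np.hstack([colorize_flow(f) for f in flows])
# cv2.imwrite("stacked_flow.png", stacked)
```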

Should the visualized optical flow fields look consistent when stacked together?

aparrapo commented 6 years ago

Hi @smdr2670. Optical flow becomes increasingly difficult the closer you get to the camera rig, exactly for the reason you mention. We have had success with objects at more than 8 ft, and we start having issues closer than that (it depends on the type of object and its movement).
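As a rough illustration of why distance matters (the baseline and focal length below are placeholder values, not actual rig specs): the horizontal disparity between adjacent side cameras grows roughly as baseline / distance, so at 1 to 1.5 m it is several times larger than at the ~8 ft (~2.4 m) that works well, and large disparities are what make the flow estimation fail.

```python
# Back-of-the-envelope sketch (placeholder parameters, not Surround360 specs):
# approximate inter-camera disparity in pixels as a function of object distance.
import math

def approx_disparity_px(distance_m, baseline_m, focal_px):
    """Small-angle approximation: disparity ~ focal * baseline / distance."""
    return focal_px * baseline_m / distance_m

baseline_m = 0.07   # assumed spacing between adjacent side cameras
focal_px = 1200.0   # assumed focal length in pixels after undistortion

for d in (0.5, 1.0, 1.5, 2.5, 5.0):
    print(f"{d:4.1f} m -> ~{approx_disparity_px(d, baseline_m, focal_px):6.1f} px")
```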

Regarding the optical flow visualization, if you can attach a screenshot I can try to make sense of it :)

smdr2670 commented 6 years ago

Hi @aparrapo, sure.

The first image visualizes the optical flow using the Middlebury color coding:

[Image: Middlebury color wheel (cols1)]

[Image: stacked optical flow, Middlebury color coding (monosideimagesflow50)]

The second image shows the horizontal disparity:

[Image: stacked horizontal disparity (monosideimageshordisp50)]

Theoretically, the visualization should show spatial continuity, right?
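One ad-hoc way I am thinking of quantifying this (my own sketch, not part of the pipeline): compare the flow vectors on either side of each seam between adjacent stripes in the stacked image.

```python
# Sketch of a seam-continuity check: mean flow difference across each
# boundary between adjacent stripes. Large values correspond to the
# visible vertical stripes in the stacked visualization.
import numpy as np

def seam_discontinuity(flows):
    """flows: list of H x W x 2 arrays, ordered as stacked left to right."""
    gaps = []
    for left, right in zip(flows[:-1], flows[1:]):
        gap = np.linalg.norm(left[:, -1, :] - right[:, 0, :], axis=-1).mean()
        gaps.append(gap)
    return gaps

# print(seam_discontinuity(flows))
```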