starwaver closed this issue 7 years ago
That distortion at the poles is an artifact of the consumer cameras used to capture 360 panoramas. They usually use two fisheye lenses on opposite sides, each capturing 180 degrees. The only way to fix that is a multi-camera rig that minimizes distortion across the whole sphere (e.g. https://shop.gopro.com/virtualreality/omni---all-inclusive/MHDHX-006.html). There's nothing we can do on the A-Frame side.
Hi @dmarcos ,
The distortion isn't an artifact of the camera. The experience we are building uses rendered 360 images, and the same images have been reproduced in another viewer (built with Unity) with no artifact.
Here is the comparison. Viewer built with A-Frame, showing the artifact:
Viewer built with Unity, using the same image, without the artifact:
which versions/commits do we see this problem with? 0.5.0? latest master?
Can you share a link to a working demo that reproduces the problem? Distortion at the poles is a common effect of equirectangular-projected 360 images: there's less information per pixel near the poles. You can see that any 360 photo you display from Flickr presents the same issue:
@machenmusik We are using 0.5.0 for this project.
@dmarcos Here's the link to our test demo we are building https://beta.babylonvr.ca/viewer
For comparison, here's the exact same image rendered with React VR; they've somehow avoided the problem: http://test1.babylon360.com/vr/
I think Pannellum gets around this kind of distortion by "using an exact mathematical mapping from an equirectangular image to a rectilinear view". Implementing this looks pretty difficult...cube maps are an easier solution.
I see what you mean about the distortion, but I don't recall this happening in prior versions. Wondering if it's a dimension mismatch (effectively clipping the picture, which is bad for equirects) or insufficient sphere tessellation.
It is certainly possible to deal with this entirely in a shader (compute texture coordinates from the observed longitude/latitude regardless of geometry), which is probably along the lines of the Pannellum suggestion above.
will try to find some time to take a closer look
So far, it looks like the default geometry settings for the sphere used by a-sky have insufficient detail and exhibit visible distortion; increasing the detail mitigates it.
@starwaver can you confirm that increasing detail of playback surface fixes for you? (see codepen)
@dmarcos should we consider increasing default detail?
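For anyone following along, overriding the sphere detail on the primitive looks roughly like this. A sketch only: the 64x64 segment counts and the pano.jpg path are illustrative placeholders, not values taken from the codepen.

```html
<!-- Sketch: raise a-sky's sphere tessellation above the defaults.
     Segment counts (64x64) and the image path are placeholder values. -->
<a-scene>
  <a-sky src="pano.jpg"
         geometry="segmentsWidth: 64; segmentsHeight: 64"></a-sky>
</a-scene>
```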
@machenmusik The codepen uses a video as a source, which seems to work, but I can't get it working with a photo, even at higher detail.
@starwaver do you have a CORS-friendly URL of an example image to do in a codepen?
@machenmusik I tried it with a flickr 360 photo http://codepen.io/starwaver/pen/MmVQKp
Here's the same photo uploaded on facebook with no distortion https://www.facebook.com/starwaver/posts/10211510609455389
@starwaver so it turns out that for your codepen, the problem is this: whereas videos automatically set the non-power-of-two flag, apparently images don't, so you get distortion from the resampling the browser performs. The magic missing piece is material="npot:true"
http://codepen.io/machenmusik/pen/wdmjQb
So there are two pieces to this: (1) the default geometry detail for a-sky and a-videosphere is not high enough to avoid easily visible distortion at the poles -- the fix is to increase segmentsWidth and/or segmentsHeight; (2) support for non-power-of-two textures is only enabled by default for video, so other asset types may experience distortion due to resampling.
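Putting both workarounds together in markup, a minimal sketch (the segment counts and asset path are illustrative placeholders; npot:true is the flag discussed above):

```html
<a-scene>
  <a-assets>
    <img id="pano" src="pano.jpg">  <!-- placeholder equirectangular image -->
  </a-assets>
  <!-- (1) more sphere detail; (2) non-power-of-two texture support -->
  <a-sky src="#pano"
         geometry="segmentsWidth: 64; segmentsHeight: 64"
         material="npot: true"></a-sky>
</a-scene>
```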
@dmarcos @ngokevin @cvan can one of you please reopen this issue? and would you agree that the stock values in core should probably be adjusted to compensate?
@machenmusik I'm ok with increasing the sphere segments. People should be submitting power of two textures. Not sure if that should be enabled.
@dmarcos Many consumer cameras take non-power-of-2 photos, which I think is a problem for apps that are photo galleries. I think it would be helpful to enable it.
We tried it and apparently uploading a power of 2 photo doesn't help, but adding npot:true does seem to fix it.
Also, the sphere segment count doesn't seem to affect it much, but higher segment counts do drop performance quite a bit.
Yes, most distortion for images is npot / resampling / mipmap, I think -- there are further fidelity gains from a slightly more detailed sphere, but rapidly diminishing returns beyond 64 segments, IMO.
thanks @cvan for reopening! @dmarcos unless one of you gets to it first, I will do a PR that enables npot by default for images, and maybe bumps up segments for default spheres (will need to do a little testing to see what works best)
> People should be submitting power of two textures.
The reality is that requiring power of two almost never works in practice, as very few devices have native power-of-two capture resolutions, and the quality loss from resampling is demonstrably worse than the mipmap / interpolation compromises... thus this issue exists :-)
Nice catch @machenmusik, npot:true is a significant improvement!
@bryik yup, now I remember why I did a PR for that parameter! ;-)
@machenmusik but why does npot:true also significantly improve this issue on a 4096x2048 image? Isn't that already a power-of-2 image?
I think what is happening is that npot:true also disables mipmapping (another potential source of resampling, which ends up looking like distortion at the poles).
looks MUCH better now, I think we can close this? http://codepen.io/machenmusik/pen/ZKoWLe http://codepen.io/machenmusik/pen/pPVyNE (@starwaver now we can see the underlying stitch errors from your camera rig that @dmarcos meant originally)
There's distortion at the top and bottom of the 360 image.
Here's the screenshots from the A-frame demo: http://imgur.com/a/0Fd0K
Any idea how to fix it?