neuhaus opened this issue 10 years ago
Interesting, yeah. Could you send me a sample image it produces? I might be able to pull it right out of the EXIF info in JavaScript without any additional configuration.
Sure: This ( r0010050.jpg ) is the original image.
And this ( r0010050_xmp.jpg ) is an auto-levelled image with added XMP data. The difference in this particular image is not that great; it is, however, easy to spot if you look at the front wheel of the bicycle.
According to the CGI at http://like.silk.to/theta360/zenith.cgi and the Python script at http://xanxys.hatenablog.jp/entry/20131110/1384094832#20131110f1, the first image has zenith_x: 1.0, zenith_y: -33.0, compass: 0.0.
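For reference, here is a rough sketch of pulling pose values out of the XMP block in the browser. It assumes the auto-levelled copy embeds Photo Sphere (GPano) pose tags, which may or may not correspond to the Theta's proprietary zenith fields, so treat it as a starting point only:

```js
// Hypothetical sketch: scan a JPEG's XMP text for GPano pose tags.
// Assumes the file actually carries GPano:PosePitchDegrees etc., which is unverified.
function readPoseFromXmp(url) {
  return fetch(url)
    .then(function (res) { return res.arrayBuffer(); })
    .then(function (buf) {
      // XMP is plain UTF-8 text inside an APP1 segment, so a regex scan is enough for a quick test.
      var text = new TextDecoder('utf-8').decode(new Uint8Array(buf));
      function grab(tag) {
        var m = text.match(new RegExp(tag + '[^-\\d.]*(-?[\\d.]+)'));
        return m ? parseFloat(m[1]) : null;
      }
      return {
        pitch:   grab('GPano:PosePitchDegrees'),
        roll:    grab('GPano:PoseRollDegrees'),
        heading: grab('GPano:PoseHeadingDegrees')
      };
    });
}

// readPoseFromXmp('r0010050_xmp.jpg').then(console.log);
```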
Hey, thanks a lot for this. I'll have to give this a think. The way the image is currently projected as 360 is that I simply map it as a texture to the inside of a sphere, throw the camera in the middle, and let the user look around. The fix for that image actually seems to morph the pixels around a bit, so it's a little more involved than I thought. I'll keep you posted.
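To illustrate, the projection described above boils down to something like this simplified sketch in the three.js API of that era (not the plugin's actual code; variable names are placeholders):

```js
// Simplified sketch of an equirectangular photo mapped to the inside of a sphere.
// Assumes three.js is already loaded on the page.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, 16 / 9, 1, 1100);
camera.position.set(0, 0, 0); // the camera sits at the centre of the sphere and only rotates

var geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.applyMatrix(new THREE.Matrix4().makeScale(-1, 1, 1)); // turn the sphere inside out

var texture = THREE.ImageUtils.loadTexture('r0010050.jpg');    // the equirectangular photo
var sphere = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
scene.add(sphere);
```

A zenith fix would then amount to rotating the sphere (or its geometry) before the user starts looking around.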
Hey Charlie, awesome. I've gotten around it for now by setting up a proxy through nginx to CloudFront so all the files look like they are on the same domain. We lose some geographic load balancing, but that's acceptable for now while the powers that be figure it out. We'll continue to bang on the player and see what we can help with. Awesome work!
Best,
Tony
@flimshaw What happens if you rotate the sphere after applying the texture?
@neuhaus good question! I don't have the cycles right now to look into it, but I can point you in the right direction to do some experimenting.
https://raw.githubusercontent.com/flimshaw/Valiant360/master_debug/src/valiant.jquery.js
^ I threw a global variable called ValiantMeshes into this version of the library. This should allow you to access the sphere the video's being projected on from the Chrome dev console and rotate it after the player has started. If you just have one valiant360 instance, you can try something like the following in the console:
ValiantMeshes[0].rotation.x = 33 * Math.PI / 180;
This would rotate the mesh around the x axis by 33 degrees. Hope it helps; let me know what you find out or if you need more assistance.
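To connect that back to the values from the sample image, a rough correction might look like the snippet below. Which zenith component maps to which three.js axis, and with which sign, is an assumption that would need checking against the actual photo:

```js
// Hypothetical axis mapping for the sample image's metadata (axes and signs unverified).
var deg = Math.PI / 180;
var mesh = ValiantMeshes[0];

mesh.rotation.z = 1.0 * deg;   // zenith_x
mesh.rotation.x = -33.0 * deg; // zenith_y
mesh.rotation.y = 0.0 * deg;   // compass heading
```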
For the record, the Samsung Gear 360 also has a gyro that records the camera orientation.
I have another feature request. When taking photos with a Ricoh Theta camera, they contain ZenithX, ZenithY and Compass data in the EXIF so the viewer can auto-level the horizon if the camera wasn't held perfectly vertical during the shot. Could you add support for the zenith data as extra parameters?
You can find some details on these (Japanese) pages: http://like.silk.to/theta360/zenith.cgi http://xanxys.hatenablog.jp/entry/20131110/1384094832#20131110f1
The Ricoh Theta Windows app can also auto-level the orientation and write a new JPEG file, but since that is a lossy operation, I'd prefer it if the rotation happens in the viewer. Perhaps it's just a matter of applying some transformation once when you create the Three.js sphere mesh?
PS: I believe this is different from the lat/lon parameters, which just specify the initial viewing direction but do not tilt the axis permanently, right?
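Regarding applying the transform once at creation time, something along these lines might do it. The axis mapping is an assumption, and zenithX / zenithY / compass are placeholders for values read from the photo's metadata. Baking the correction into the geometry would keep mesh.rotation free for the look-around controls, and unlike the initial-view parameters it tilts the sphere itself:

```js
// Sketch: bake the zenith correction into the sphere geometry once, when it is created.
// zenithX / zenithY / compass are hypothetical variables holding the metadata values,
// and which value maps to which axis is unverified.
var deg = Math.PI / 180;
var correction = new THREE.Matrix4().makeRotationFromEuler(
  new THREE.Euler(zenithY * deg, compass * deg, zenithX * deg)
);
geometry.applyMatrix(correction); // geometry is the inside-out sphere the panorama is mapped onto
```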