Accessing the SurfaceTexture in Java from MediaPlayer is not possible without using the new low-level MediaCodec API (API 16), which is still in flux and restricts deployment options. I also do not want to "go native".
A shader could handle rendering the left/right eye halves of the video to the left/right eye areas of the VR renderer.
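To make that concrete, here is a minimal sketch of what such a fragment shader could look like, written as a raw GLSL string held in Java. The uniform and varying names (uVideoTexture, uTexMatrix, uEyeOffset, vTextureCoord) are my own placeholders, not anything defined by Rajawali:

// Hypothetical fragment shader for sampling one half of an SBS video frame.
// uEyeOffset would be set to 0.0 for the left eye pass and 0.5 for the right eye pass.
private static final String SBS_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES uVideoTexture;\n" +  // SurfaceTexture-backed video frame
        "uniform mat4 uTexMatrix;\n" +                   // from SurfaceTexture.getTransformMatrix()
        "uniform float uEyeOffset;\n" +                  // 0.0 = left half, 0.5 = right half
        "varying vec2 vTextureCoord;\n" +
        "void main() {\n" +
        "    // Squeeze x into half the frame and shift to this eye's half of the SBS video\n" +
        "    vec2 sbs = vec2(vTextureCoord.x * 0.5 + uEyeOffset, vTextureCoord.y);\n" +
        "    vec2 uv = (uTexMatrix * vec4(sbs, 0.0, 1.0)).xy;\n" +
        "    gl_FragColor = texture2D(uVideoTexture, uv);\n" +
        "}\n";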
From my research, it seems that the video SurfaceTexture can be passed to a fragment shader attached to the VideoTexture. SurfaceTexture even provides the transform matrix:
public void getTransformMatrix(float[] mtx)
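For reference, the plain GLES/MediaPlayer wiring on the Java side looks roughly like this. This sketches the stock Android route, not Rajawali's VideoTexture internals, and the class/field names are my own assumptions:

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES20;
import android.view.Surface;

public class VideoFrameFeeder {
    // Assumes an external OES texture id was already generated and bound
    // as GL_TEXTURE_EXTERNAL_OES on the GL thread.
    private SurfaceTexture mVideoSurfaceTexture;
    private final float[] mTexMatrix = new float[16];

    public void attach(MediaPlayer mediaPlayer, int oesTextureId) {
        mVideoSurfaceTexture = new SurfaceTexture(oesTextureId);
        Surface surface = new Surface(mVideoSurfaceTexture);
        mediaPlayer.setSurface(surface); // decode straight into the GL texture
        surface.release();               // SurfaceTexture keeps its own reference
    }

    // Call on the GL thread once per rendered frame (or after onFrameAvailable fires).
    public void updateFrame(int texMatrixUniformHandle) {
        mVideoSurfaceTexture.updateTexImage();               // latch the newest video frame
        mVideoSurfaceTexture.getTransformMatrix(mTexMatrix); // per-frame texture transform
        GLES20.glUniformMatrix4fv(texMatrixUniformHandle, 1, false, mTexMatrix, 0);
    }
}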
I think I could do this with a raw (GLSL) shader, but the new Rajawali shader framework should allow me to write the shader in Java, yes? The framework generates and compiles the GLSL?
Would this be the recommended way to render a 3D SBS video to a VideoTexture for VR?