googlevr / gvr-android-sdk

Google VR SDK for Android
http://developers.google.com/vr/android/

Using GVR with EGL context created in C++ #300

Open gouletr opened 7 years ago

gouletr commented 7 years ago

Hi, we got GVR working on iOS and it works great so far; thanks for the nice SDK.

However, on Android we are having difficulty understanding how to use the native version in a setup where we create our EGL context in C++ rather than on the NativeActivity side. There are no samples showing how this could work. Is this a supported configuration? That raises a few questions...

Our NativeActivity onCreate is extremely minimalistic:

import android.os.Bundle;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;

public class StingrayActivity extends android.app.NativeActivity {
    static {
        System.loadLibrary("gvr");
        System.loadLibrary("gvr_audio");
    }

    @Override
    public void onCreate(Bundle savedInstanceState) {
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().getDecorView().setSystemUiVisibility(
            View.SYSTEM_UI_FLAG_LAYOUT_STABLE
            | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
            | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
            | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
            | View.SYSTEM_UI_FLAG_FULLSCREEN
            | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN | WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        super.onCreate(savedInstanceState);
    }
    ...
}

I see in the GVR NDK sample that it performs a few more things that seem important. So how does that translate to C++ if we can't create the EGL context in the NativeActivity?

Thanks!

jdduke commented 7 years ago

Is this GvrLayout object still relevant in that setup?

Indeed, GvrLayout is required regardless of the Activity type. It's from GvrLayout that you obtain your GvrApi instance, and from there the native gvr_context handle.

Do we have to tell GVR about our EGL context somehow?

Not explicitly, as long as you call gvr_initialize_gl appropriately. That said, the View that you provide to GvrLayout as your "presentation" view should in general be capable of displaying GL content. If you're using async reprojection, this isn't technically required, as we create our own SurfaceView for frontbuffer rendering when async reprojection is enabled.
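
Concretely, that handoff could look something like the sketch below. This is only an illustration, not code from our samples: nativeOnCreate is a placeholder JNI method, and the C++ side is assumed to call gvr_initialize_gl once its own EGL context is current on the GL thread.

// Java side: create the GvrLayout, then hand the native gvr_context to C++.
GvrLayout gvrLayout = new GvrLayout(this);
gvrLayout.setAsyncReprojectionEnabled(true);
gvrLayout.setPresentationView(new SurfaceView(this));

// GvrApi wraps the native gvr_context; getNativeGvrContext() exposes its pointer.
long nativeGvrContext = gvrLayout.getGvrApi().getNativeGvrContext();

// Placeholder JNI entry point: C++ keeps this pointer and later calls
// gvr_initialize_gl after making its own EGL context current.
nativeOnCreate(nativeGvrContext);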

gouletr commented 7 years ago

So that means if I do:

gvrLayout.setPresentationView(new View(this));

it won't work on cardboard-only devices? i.e. without async reprojection?

jdduke commented 7 years ago

it won't work on cardboard-only devices? i.e. without async reprojection?

Probably not, though we haven't really tested this path so it's hard to say for sure. What you'd probably want to do is use a SurfaceView instead of a View, plumbing down the Surface from Java into native, as shown in this example: https://github.com/wwlinx/android-native-egl-example-buck. You should still be able to use the EGL context you've created from native.
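
Roughly, the Java side of that plumbing could look like the sketch below (illustration only, not tested: nativeSetSurface and nativeClearSurface are placeholder JNI methods; the C++ side would wrap the passed Surface in an ANativeWindow via ANativeWindow_fromSurface and create its EGL window surface against it).

SurfaceView surfaceView = new SurfaceView(this);
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Pass the Surface down; native code wraps it in an ANativeWindow
        // and creates its EGL surface on it.
        nativeSetSurface(holder.getSurface());
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        nativeSetSurface(holder.getSurface());
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        nativeClearSurface();
    }
});
gvrLayout.setPresentationView(surfaceView);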

windbagjacket commented 7 years ago

Hi. We're actually having the exact same problem as gouletr with Android - we use android_native_app_glue, create the EGL/GL context in native code and have an almost empty NativeActivity.

I just wanted to clarify... are you saying it should work fine if in my java onCreate() I do:

gvrLayout = new GvrLayout(this);
gvrLayout.setAsyncReprojectionEnabled(true);
gvrLayout.setPresentationView(new SurfaceView(this));

As long as I pass the gvrLayout to my native code and call gvr_initialize_gl?

Great SDK, just trying to get my head around these implementation details! A sample showing how to do this correctly would be fantastic.

Thanks.

jdduke commented 7 years ago

Yes, @windbagjacket, in theory that should accommodate your needs. However, we haven't done extensive testing with this approach yet, so you might hit some snags along the way. In the meantime, we'll try to vet this configuration internally, providing some code snippets/documentation accordingly.

gouletr commented 7 years ago

@windbagjacket I made it work on Google Pixel phones by doing:

gvrLayout = new GvrLayout(this);
gvrLayout.setAsyncReprojectionEnabled(true);
gvrLayout.setPresentationView(new View(this));

...as opposed to using a SurfaceView.

However, this won't work on devices that do not support async reprojection since, as I understand it, the GVR SDK won't create a custom GL surface and you end up rendering in the wrong place.

gouletr commented 7 years ago

@jdduke To continue on the topic, I tried to make it work by creating a SurfaceView instead of a View, and so far it isn't working for me. I looked into the sample you linked but our engine uses OS callbacks such as:

activity->callbacks->onWindowFocusChanged = &aac::focus_changed;
activity->callbacks->onNativeWindowCreated = &aac::window_created;
activity->callbacks->onNativeWindowResized = &aac::window_resized;
activity->callbacks->onNativeWindowRedrawNeeded = &aac::window_redraw_needed;
activity->callbacks->onNativeWindowDestroyed = &aac::window_destroyed;

...to get the native window pointer, while the example you linked gets the native window pointer from the SurfaceView.

So how does this mix with creating a SurfaceView if the window we get in these callbacks is not the window from the SurfaceView we created on the Java side? Is there a way to tell Android not to create a new window and instead use our SurfaceView's window? Sorry if that sounds inaccurate, learning the ropes as I go... :)

windbagjacket commented 7 years ago

Thanks @jdduke & @gouletr, I have it working doing the same thing now.

As an aside, I notice that doing this through NativeActivity also seems to bypass the usual "Place your phone inside the headset" screen and controller calibration that I was seeing on every launch when not using NativeActivity. Not much of an issue though - it's actually quite useful when iterating as having to calibrate at every re-run was getting cumbersome. The settings and quit buttons are also missing now.

jdduke commented 7 years ago

"Place your phone inside the headset" screen and controller calibration that I was seeing on every launch when not using NativeActivity

Are you forwarding the Activity lifecycle events to GvrLayout, via GvrLayout.onPause()/onResume()?
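
That forwarding is just a few overrides in the Activity that owns the GvrLayout, along these lines (sketch, assuming a gvrLayout field):

@Override
protected void onPause() {
    gvrLayout.onPause();
    super.onPause();
}

@Override
protected void onResume() {
    super.onResume();
    gvrLayout.onResume();
}

@Override
protected void onDestroy() {
    gvrLayout.shutdown();
    super.onDestroy();
}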

The settings and quit buttons are also missing now.

Yeah, this is a general limitation with NativeActivity, as the settings/exit buttons are View-based. There are crude workarounds, which involve doing something like:

getWindow().takeSurface(null);
getWindow().setContentView(gvrLayout);

in your NativeActivity's onCreate() method. However, the buttons will remain unresponsive to MotionEvents (as those are delivered directly to the native layer).

ian-wevr commented 7 years ago

I'm porting an existing mostly-native Android app to Daydream that has landed on the SurfaceView approach mentioned by jdduke, with scanline racing. In general it works well (using 1.0.1), but there are often obvious short vertical white tears that can happen anywhere -- it seems to be subtly timing related, perhaps, as it happens during more intensive portions of our application.

This shows up in the log output: I/GVR: [vr/gvr/render/scanline_racer.cc:420] Scanline racing enabled without context sharing

I've had a hard time finding documentation that suggests what can cause this message and/or whether it is related to the tearing issues. When I create the GL context, does it need to be shared with any particular existing GL context, or is it enough to create any GL context, make it current, and call gvr_initialize_gl()?

ssaroha commented 7 years ago

@gouletr @jdduke I am trying to get the sample video application working on cardboard-only devices which do not support async reprojection.

To get this working I tried to modify the following piece of code in WatchVideoActivity.java:

if (!isSurfaceEnabled || !isAsyncReprojectionEnabled) {
      // The device does not support this API, video will not play.
      Log.e(
          TAG,
          "UnsupportedException: "
              + (!isAsyncReprojectionEnabled ? "Async Reprojection not supported. " : "")
              + (!isSurfaceEnabled ? "Async Reprojection Video Surface not enabled." : ""));

}

to this:

if (!isSurfaceEnabled || !isAsyncReprojectionEnabled) {
      // The device does not support this API, still go ahead with default surface.

initVideoPlayer();

      // The ExternalSurface buffer the GvrApi should reference when drawing the video buffer. This
      // must be called after enabling the Async Reprojection video surface.
//      renderer.setVideoSurfaceId(gvrLayout.getAsyncReprojectionVideoSurfaceId());

          // Simulate cardboard trigger to play/pause video playback.
      gvrLayout.enableCardboardTriggerEmulation(triggerRunnable);

      // The default value puts the viewport behind the eye, so it's invisible. Set the transform
      // now to ensure the video is visible when rendering starts.
      renderer.setVideoTransform(videoTransform);

      surfaceView.setOnTouchListener(
              new View.OnTouchListener() {
                @Override
                public boolean onTouch(View view, MotionEvent event) {
                  if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                    triggerRunnable.run();
                    return true;
                  }
                  return false;
                }
              });
}

Effectively, I just commented out the setVideoSurfaceId call to avoid using the async reprojection video Surface.

Compiling and running this modified sample on a cardboard-only device just shows the stereoscopic two-eye view but no video frame in it.

What should I be setting for the surfaceId when there is no async reprojection video surface? Is there anything else I should be modifying to get the video frame to show on cardboard-only devices?

thanks satender

jdduke commented 7 years ago

Hi @ssaroha. At the moment, the video Surface API is only supported when async reprojection is enabled. We're actively working on Cardboard (without async reprojection) support for the video Surface API.

However, note that without async reprojection the video Surface API doesn't buy you a whole lot in terms of functionality (e.g., playing protected content without needing a protected GL context). For the Cardboard case, you can simply draw the Surface quad directly into your scene.
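
Roughly, that means decoding the video into a SurfaceTexture backed by a GL_TEXTURE_EXTERNAL_OES texture and sampling it from a quad drawn into your eye buffers. A sketch (not code from this repo; uses android.opengl.GLES20/GLES11Ext, android.graphics.SurfaceTexture and android.view.Surface; player setup and the quad/shader drawing are omitted):

// Setup, on the GL thread: create an external texture plus a
// SurfaceTexture/Surface pair the video player can render into.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

SurfaceTexture videoTexture = new SurfaceTexture(tex[0]);
Surface videoSurface = new Surface(videoTexture);  // hand this to the video player

// Per frame, on the GL thread, before drawing the eye buffers:
videoTexture.updateTexImage();
// ...then draw a quad per eye whose fragment shader samples the texture
// through a samplerExternalOES uniform (GL_OES_EGL_image_external).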

ssaroha commented 7 years ago

Thanks @jdduke. I took a stab at removing references to video surface api for async reprojection in the sample example file WatchVideoActivity.java Here is the diff: https://github.com/ssaroha/gvr-android-sdk/commit/0712f03b04b44ec55ff4d9c4fe08214327b964af

I am still a newbie in OpenGL and need a little help figuring out how to draw the surface quad in the scene. I am extending the VideoSceneRenderer so that it implements SurfaceTexture.OnFrameAvailableListener. In the onSurfaceCreated method of the VideoSceneRenderer class I created the SurfaceTexture object and the corresponding Surface object.

mSurface = new SurfaceTexture(); // I still need to figure out how to give it one of the existing GL textures
mSurface.setOnFrameAvailableListener(this);
Surface surface = new Surface(mSurface);

My goal was that this Surface could then be used by the video scene. In the VideoScene class there is an updateViewport method that calls BufferViewport.setExternalSurfaceId here: https://github.com/googlevr/gvr-android-sdk/blob/master/samples/sdk-videoplayer/src/main/java/com/google/vr/sdk/samples/videoplayer/VideoScene.java#L90

I am wondering how to provide this new Surface to the BufferViewport(s), as there doesn't seem to be any setSurface method on BufferViewport.

Am I on the right track, or is there a better way to modify the sample program while still being able to use the videoHoleProgram and spriteProgram?

appreciate your help!

jdduke commented 6 years ago

Hi @ssaroha, apologies for the delayed response. We haven't yet exposed the more general API for supporting arbitrary Surfaces, and they still require use of async reprojection. I'll post back here if/when that changes.