Closed: olijuseju closed this issue 1 year ago
According to the discussion in a similar question, dual-screen video playback is supported by ExoPlayer; please find more details at that link. Meanwhile, let me assign this question to my colleague, in case there are follow-up questions.
I have already seen that link, but that answer doesn't fix my problem: it only displays the video in 360, so you can watch it by dragging with a finger, but it doesn't follow the gyroscope direction, and the view is not displayed in dual-screen mode.
If you modify the demo app to set `app:surface_type="spherical_gl_surface_view"` on the `PlayerView` in player_activity.xml (to enable using `SphericalGLSurfaceView` instead of a plain `SurfaceView`), the video rendering should follow the sensor direction.
However, getting the left and right eye views to render together actually requires more work (sorry for my confusing answer on the other bug, which I'll update). The part that is supported in ExoPlayer is extracting/decoding videos encoded with left/right eye views, and `SceneRenderer` is also capable of rendering the left/right eyes separately, but the missing part is a view (replacing `SphericalGLSurfaceView`) that calls through to the scene renderer to render left/right independently and compose them together. `SphericalGLSurfaceView` calls `scene.drawFrame(viewProjectionMatrix, /* rightEye= */ false);` to draw for the left eye, whereas a view supporting side-by-side playback would also need to invoke the scene renderer with `true` to draw for the right eye.
This is my XML code in home_activity.xml:
```xml
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".HomeActivity">

    <com.google.android.exoplayer2.ui.PlayerView
        android:id="@+id/videoView"
        android:layout_width="320dp"
        android:layout_height="0dp"
        android:layout_marginStart="16dp"
        android:layout_marginTop="16dp"
        app:controller_layout_id="@layout/custom_control_view"
        app:layout_constraintBottom_toTopOf="@+id/btPlayPause"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent"
        app:resize_mode="fit"
        app:show_timeout="5000"
        app:surface_type="spherical_gl_surface_view" />

</androidx.constraintlayout.widget.ConstraintLayout>
```
But the stream doesn't render following the sensor direction.
In your code, `scene.drawFrame(viewProjectionMatrix, /* rightEye= */ false);`, is `scene` an instance of ExoPlayer's `SceneRenderer`?
> But the streaming doesn't render following the sensor direction
Please try this in the demo app, to eliminate the possibility that it's an issue with your app integration.
"scene" is an instance of Exoplayer.SceneRenderer??
See https://github.com/google/ExoPlayer/blob/ac9d5337b2b51a855f3c33f3b126d9ef921ebc69/library/core/src/main/java/com/google/android/exoplayer2/video/spherical/SphericalGLSurfaceView.java#L318 (it's an instance of `SceneRenderer` in that package).
Hi, I've tested the demo app with `app:surface_type="spherical_gl_surface_view"` and it works with the phone's sensors, but in my app, if I replicate the same activity, the stream doesn't render following the sensor direction and the user has to drag with a finger. Is there any library, Java version, or Android version that I'm missing?
Here is my build.gradle (Module: app):
```groovy
plugins {
    id 'com.android.application'
    id 'org.jetbrains.kotlin.android'
}

android {
    compileSdk 31

    defaultConfig {
        applicationId "com.example.jjpeajar.jors_1"
        minSdk 21
        targetSdk 31
        versionCode 1
        versionName "1.0"
        multiDexEnabled true

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        vectorDrawables {
            useSupportLibrary true
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
    kotlinOptions {
        jvmTarget = '1.8'
    }
    buildFeatures {
        compose true
    }
    composeOptions {
        kotlinCompilerExtensionVersion '1.3.2'
    }
    packagingOptions {
        resources {
            excludes += '/META-INF/{AL2.0,LGPL2.1}'
        }
    }
}

dependencies {
    // Required for all Google VR apps
    implementation 'com.google.vr:sdk-base:1.190.0'
    implementation 'androidx.appcompat:appcompat:1.4.0'
    implementation 'com.google.android.material:material:1.4.0'
    implementation 'androidx.annotation:annotation:1.4.0'
    implementation 'androidx.constraintlayout:constraintlayout:2.1.2'
    implementation 'com.google.android.exoplayer:exoplayer:2.17.1'
    implementation "com.google.android.exoplayer:extension-rtmp:2.17.1"
    implementation "com.google.android.exoplayer:exoplayer-core:2.17.1"
    implementation "com.google.android.exoplayer:exoplayer-dash:2.17.1"
    implementation "com.google.android.exoplayer:exoplayer-ui:2.17.1"
    implementation 'com.google.android.exoplayer:extension-ima:2.17.1'
    implementation "androidx.multidex:multidex:2.0.1"
    implementation 'org.videolan.android:libvlc-all:3.1.12'
    implementation("com.squareup.okhttp3:okhttp:4.10.0")
    implementation 'com.pierfrancescosoffritti.androidyoutubeplayer:core:10.0.0'
    implementation 'androidx.lifecycle:lifecycle-runtime-ktx:2.3.1'
    implementation 'androidx.activity:activity-compose:1.5.1'
    implementation platform('androidx.compose:compose-bom:2022.10.00')
    implementation 'androidx.compose.ui:ui'
    implementation 'androidx.compose.ui:ui-graphics'
    implementation 'androidx.compose.ui:ui-tooling-preview'
    implementation 'androidx.compose.material3:material3'
    androidTestImplementation platform('androidx.compose:compose-bom:2022.10.00')
    androidTestImplementation 'androidx.compose.ui:ui-test-junit4'
    debugImplementation 'androidx.compose.ui:ui-tooling'
    debugImplementation 'androidx.compose.ui:ui-test-manifest'
}
```
And this is my project's build.gradle:
```groovy
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://jitpack.io/' }
    }
    dependencies {
        classpath "com.android.tools.build:gradle:7.0.2"
        classpath 'org.jetbrains.kotlin:kotlin-gradle-plugin:1.7.20'
        // The Google VR NDK requires experimental version 0.9.3 or higher.
        // classpath 'com.android.tools.build:gradle-experimental:0.9.3'
        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
```
Please check you are calling `onResume` + `onPause` on the `PlayerView` (or on `SphericalGLSurfaceView` if you're not using `PlayerView`) at the right time. I think that's the most likely thing to be missing. See https://github.com/androidx/media/blob/2fc189d6a40f116bd54da69ab9a065219f6973e7/demos/main/src/main/java/androidx/media3/demo/main/PlayerActivity.java#L151.
OK, thank you, it worked. But now, to have a separate view for each eye, what should I do? Modify the ExoPlayer library, modify `SphericalGLSurfaceView`, or create a new Java class that shows the two views and joins them in the same player? Do you have an example of how I could do it?
I would use `SphericalGLSurfaceView` as a starting point and try to modify it to render both views. It looks like the existing scene renderer can help with this, as per my comment above. Going into much more detail or investigating this further is beyond the scope of support we can provide here.
Hi, sorry for the late response, but I can't modify the `SphericalGLSurfaceView` class directly. Do I have to create my own class based on `SphericalGLSurfaceView` and try to make ExoPlayer use it instead? How can I make ExoPlayer work with this other class?
You'd need to fork the file and any files it depends on that you need to modify. Beyond that I'm afraid we can't help further, as it's not related to ExoPlayer/Media3.
OK, thank you. Do you know where I can find this kind of information?
You could try asking on Stack Overflow or a similar general Q&A site.
Hi, I'm trying to develop an app in Android Studio to watch a 360 stream using ExoPlayer, but I need to watch the stream like a VR video (Cardboard), with a separate view for each eye, and to control the player with the phone's motion sensors. Is there any way to do this? Thanks