epiciskandar opened 3 weeks ago
This problem can be worked around with a "double commit", like this:
```kotlin
val callback = object : CameraCaptureSession.StateCallback() {
    override fun onConfigured(session: CameraCaptureSession) {
        this@MainActivity.session = session
        val rr = Runnable {
            session.setRepeatingRequest(
                requestBuilder.build(),
                cameraCaptureCallback,
                cameraHandler
            )
        }
        // The delay must not be too small, and the request must be posted twice.
        cameraHandler.postDelayed(rr, 0)
        cameraHandler.postDelayed(rr, 100)
    }
}
```
but I don't know why this works.
Edit: this only works on Samsung. On the Pixel 7, the double commit stops the main (logical) camera from receiving data, with this error log:

```
2024-08-23 16:33:44.464 7194-7376 gesture com.gesture E [SurfaceTexture-0-7194-0] detachFromContext: SurfaceTexture is not attached to a GL context
2024-08-23 16:33:44.464 7194-7376 ArCore-TextureStore com.gesture E attachTexImage: calling detachFromGLContext failed: Error during detachFromGLContext (see logcat for details)
```
But adding more delay makes it work on the Pixel 7 too, like this:

```kotlin
cameraHandler.postDelayed(rr, 100)
cameraHandler.postDelayed(rr, 300)
```
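To make this less fragile, the two posts can be wrapped in a small helper with device-specific delays (a sketch only; the helper name and defaults are my own choice, and the delay values are empirical, not documented behavior):

```kotlin
import android.hardware.camera2.CameraAccessException
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CaptureRequest
import android.os.Handler
import android.util.Log

// Hypothetical helper wrapping the "double commit" workaround.
// Delay values are device-dependent guesses from my testing:
// 0/100 ms worked on Samsung, 100/300 ms on the Pixel 7.
fun startRepeatingWithRetries(
    session: CameraCaptureSession,
    request: CaptureRequest,
    captureCallback: CameraCaptureSession.CaptureCallback?,
    handler: Handler,
    delaysMs: List<Long> = listOf(100L, 300L)
) {
    val submit = Runnable {
        try {
            session.setRepeatingRequest(request, captureCallback, handler)
        } catch (e: CameraAccessException) {
            Log.w("SharedCamera", "setRepeatingRequest failed", e)
        }
    }
    // Post the same request once per delay; the later submissions are what
    // seem to make the stream stick on the devices tested.
    delaysMs.forEach { delay -> handler.postDelayed(submit, delay) }
}
```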
SPECIFIC ISSUE ENCOUNTERED
My goal: use ARCore to get the camera's 3D position and orientation data while simultaneously recording an individual physical back camera.

When I use ARCore with the Camera2 API to enable additional recording/preview of an individual physical camera, only the main (logical) camera produces images that can be rendered; the physical cameras do not. If I do the same thing with the Camera2 API alone, it works.
VERSIONS USED

adb shell pm dump com.google.ar.core | findstr /i "packages: versionName"
On macOS, use: adb shell pm dump com.google.ar.core | egrep -i versionName\|packages:

adb shell getprop ro.build.fingerprint:
samsung/b0qzcx/b0q:13/TP1A.220624.014/S9080ZCU4CWH1:user/release-keys

STEPS TO REPRODUCE THE ISSUE
Sorry, it's inconvenient for me to provide a full demo. The steps are:
1. In `onSurfaceCreated`, create the ARCore session with a shared camera: `Session(this, setOf(Session.Feature.SHARED_CAMERA))`, configured with `DepthMode.AUTOMATIC` and `UpdateMode.LATEST_CAMERA_IMAGE`.
2. Call `cameraManager.openCamera` with the wrapped callback.
3. In `CameraDevice.StateCallback.onOpened`: call `setCameraTextureName` for the OpenGL rendering of the main video stream, and build a `List<OutputConfiguration>` from the surfaces in `sharedCamera.arCoreSurfaces`, alongside surfaces that come from the Jetpack Compose UI, each pinned to a specific physical camera via `OutputConfiguration.setPhysicalCameraId`.
4. Call `cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)`.
5. Create the capture session with `SessionConfiguration(SessionConfiguration.SESSION_REGULAR, outputConfigs, ...)` and the wrapped `CameraCaptureSession.StateCallback`.
6. In `CameraCaptureSession.StateCallback.onConfigured`, call `session.setRepeatingRequest` with the surfaces saved previously; I am sure that `createCaptureRequest` and `setRepeatingRequest` have the same target/surface config.

Now the logical camera renders to the `GLSurfaceView` correctly, but the `SurfaceView`s that should show the physical camera previews only get one frame. Video below:
https://github.com/user-attachments/assets/8631bd04-f251-4768-ba43-7853f02e1228
In the video, the first half shows calling ARCore and Camera2 together; notice that the sub-previews only get updated for one frame. At the end, the video shows how the previews should appear (the main preview currently has a bug, not related here) when using Camera2 only.
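For reference, the steps above condensed into one sketch (not my full demo; error handling and threading are omitted, and identifiers like `cameraTextureId`, `physicalPreviewSurface`, and `physicalCameraId` are placeholders from my project):

```kotlin
// Sketch of the setup above; assumes an Activity context, a GL texture id,
// and a Surface for the physical-camera preview already exist.
val sharedSession = Session(this, setOf(Session.Feature.SHARED_CAMERA))
sharedSession.configure(sharedSession.config.apply {
    depthMode = Config.DepthMode.AUTOMATIC
    updateMode = Config.UpdateMode.LATEST_CAMERA_IMAGE
})
val sharedCamera = sharedSession.sharedCamera
sharedSession.setCameraTextureName(cameraTextureId) // main-stream GL texture

// ARCore's own surfaces plus one surface pinned to a physical camera.
val outputConfigs =
    sharedCamera.arCoreSurfaces.map { OutputConfiguration(it) } +
        OutputConfiguration(physicalPreviewSurface).apply {
            setPhysicalCameraId(physicalCameraId) // e.g. the back ultrawide id
        }

cameraManager.openCamera(
    sharedSession.cameraConfig.cameraId,
    sharedCamera.createARDeviceStateCallback(deviceStateCallback, cameraHandler),
    cameraHandler
)
// In onOpened: cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD),
// add the same surfaces as targets, then create the capture session with
// SessionConfiguration(SessionConfiguration.SESSION_REGULAR, outputConfigs, ...)
// using sharedCamera.createARSessionStateCallback(sessionStateCallback, cameraHandler).
```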
WORKAROUNDS (IF ANY)
See the "double commit" described at the top of this issue.
ADDITIONAL COMMENTS
The ARCore update is driven by `sharedSession.update()` in `GLSurfaceView.Renderer.onDrawFrame`, and I get the camera position data from `frame.camera.pose`.
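Concretely, the render loop looks roughly like this (a sketch; the logging is just for illustration):

```kotlin
// Sketch of the renderer callback that drives ARCore and reads the pose.
override fun onDrawFrame(gl: GL10?) {
    val frame = sharedSession.update()      // drives the ARCore update
    if (frame.camera.trackingState == TrackingState.TRACKING) {
        val pose = frame.camera.pose        // camera position + orientation
        val t = pose.translation            // FloatArray(3): x, y, z in meters
        val q = pose.rotationQuaternion     // FloatArray(4): x, y, z, w
        Log.d("ArPose", "t=${t.joinToString()} q=${q.joinToString()}")
    }
}
```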