GeorgeAdamon closed this issue 5 years ago
I don't know about using the RealSense and the SDK's Unity wrapper with Quest, but there was a developer who created a custom version of the wrapper to support an HTC Vive point cloud viewer that relied on the SteamVR plugin in Unity.
https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647
It does not look as though the RealSense camera was plugged into the headset in that case, though; rather, it was connected to the PC that was driving the headset over a cable.
https://www.youtube.com/watch?v=dotQGfINt3w&feature=youtu.be
Thanks for your reply @MartyG-RealSense ! From my understanding, they just customized some RealSense scripts in order to stream multiple cameras simultaneously to Unity. Indeed, it looks like the RealSense is connected directly to the PC.
I went over all the information that you provided carefully. As I am not a specialist in using Librealsense with Android though, I will tag into this conversation the guy who works on the Android implementation of the SDK to seek his input. @matkatz
MarshalDirectiveException: Cannot marshal type 'System.Object'
and related errors are most probably caused by building with Unity's IL2CPP scripting backend, which isn't supported by the librealsense wrapper.
Please try switching to the Mono/.NET backend instead.
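For reference, the backend can be switched under Edit > Project Settings > Player > Other Settings > Configuration, or from an editor script. A minimal editor-only sketch (menu path is made up):

```csharp
// Editor-only sketch: switch the Android build's scripting backend
// from IL2CPP to Mono. Equivalent to changing it by hand under
// Player Settings > Other Settings > Configuration > Scripting Backend.
using UnityEditor;

public static class BackendSwitcher
{
    [MenuItem("Tools/Use Mono Backend (Android)")]
    static void UseMono()
    {
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android,
                                           ScriptingImplementation.Mono2x);
    }
}
```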
Thanks for the advice @ogoshen , I wasn't aware that IL2CPP isn't compatible with librealsense. I will try that and post an update.
@matkatz @ogoshen I built the app again using the Mono scripting backend and started the app with the RS D435i connected to the Quest. The app froze for almost 30 seconds right after the Unity splash screen, and after that it played as before, without any signs that the D435i is active/streaming. At this point, the Logcat output reported (once) the following error under the Unity tag:
ExternalException: rs2_pipeline_start_with_config(pipe:0xcd957360, config:0xcd9575d8)
Rethrow as Exception: No device connected
at Intel.RealSense.ErrorMarshaler.MarshalNativeToManaged (System.IntPtr pNativeData) [0x000a7] in <ba30e24c78874cfe9ca3e2cba0dbd673>:0
at (wrapper managed-to-native) Intel.RealSense.NativeMethods.rs2_pipeline_start_with_config(intptr,intptr,object&)
at Intel.RealSense.Pipeline.Start (Intel.RealSense.Config cfg) [0x0000d] in <ba30e24c78874cfe9ca3e2cba0dbd673>:0
at RsDevice.OnEnable () [0x00017] in <d4a4188e91f246d3bca7964430ee0b34>:0
I then started to filter the logs with other tags, specifically InputDeviceManager and UsbPortManager and I started plugging-unplugging the RS D435i to see what happens. These are the messages I got, as soon as the RealSense was connected to the Quest: InputDeviceManager
InputDeviceManager: ovrInputDeviceManager::Initialize
InputDeviceManager: ovrInputDeviceManager::SetRemoteConnected - hardwareID = cf5afef8776d4f99, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::CreateDevice - hardwareID = cf5afef8776d4f99, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::AllocDevice - hardwareID = cf5afef8776d4f99, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::SetRemoteConnected - hardwareID = d7b77de328b90488, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::CreateDevice - hardwareID = d7b77de328b90488, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::AllocDevice - hardwareID = d7b77de328b90488, modelID = 4000
InputDeviceManager: ovrInputDeviceManager::Shutdown
UsbPortManager
UsbPortManager: USB port changed: port=UsbPort{id=otg_default, supportedModes=dual}, status=UsbPortStatus{connected=true, currentMode=dfp, currentPowerRole=source, currentDataRole=host, supportedRoleCombinations=[source:host, sink:device]}, canChangeMode=true, canChangePowerRole=false, canChangeDataRole=false
Any thoughts?
I confirmed that this error ExternalException: rs2_pipeline_start_with_config(pipe:0xcd957360, config:0xcd9575d8) Rethrow as Exception: No device connected
appears whether I connect the RS D435i to the Quest or not, and it's the same error that I get on Unity's console if I try to run the app without connecting the RS D435i to my PC.
Right, so you've passed the first hurdle and got everything compiled and running. The second (and last 🤞) is a result of targeting Android from Unity; it's an undocumented limbo, but I can assure you it'll work.
If you look at the Android examples you'll find that you need to call
RsContext.init
before anything else to get the device recognized.
The following script should take care of camera permissions, and call the proper java method:
```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class AndroidPermissions : MonoBehaviour
{
#if UNITY_ANDROID && !UNITY_EDITOR
    void Awake()
    {
        // Ask for the Android camera permission if it hasn't been granted yet.
        if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(UnityEngine.Android.Permission.Camera))
        {
            UnityEngine.Android.Permission.RequestUserPermission(UnityEngine.Android.Permission.Camera);
        }

        // Call the Java-side RsContext.init(context) so librealsense can
        // enumerate USB devices through the current Android activity.
        using (var javaUnityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var currentActivity = javaUnityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var rsContext = new AndroidJavaClass("com.intel.realsense.librealsense.RsContext"))
        {
            Debug.Log(rsContext);
            rsContext.CallStatic("init", currentActivity);
        }
    }
#endif
}
```
Thank you @ogoshen! Getting there!
The second step was indeed successful, but unfortunately it's not the last one. I attached the script above to a GameObject and rebuilt with the same settings. When I run the app, the D435i starts projecting the IR dot pattern ( hooray !!) but the app itself shows no point-cloud. The logcat now keeps repeating the same error for every frame:
OPENGL NATIVE PLUG-IN ERROR: GL_INVALID_OPERATION: Operation illegal in current state
(Filename: /Users/builduser/buildslave/unity/build/Runtime/GfxDevice/opengles/GfxDeviceGLES.cpp Line: 346)
I understand that this is an infamous error that various people have for various reasons. [1][2]
I'm using Unity 2019.1.4f1 with OpenGLES3 as my Graphics API, and .NET 4.x as my Scripting Runtime Version. Switching to OpenGLES2 and/or .NET 3.5 didn't solve the problem. These things seemed to relate to this problem for other people, so I'm just mentioning them.
Glad to help.
So this issue doesn't seem related to librealsense and I suggest reporting it to Unity. Have you tried turning antialiasing off as suggested in this post?
Does this happen only with the point cloud renderer? Can you get depth/color texture streaming to work?
I am happy to report that I got it working !!!
The issue with the PointCloudDepthAndColor scene from the Unity samples was that the PointCloudMat material assigned to the PointCloudRenderer component was using by default the Custom/PointCloudGeom shader, which is a geometry shader.
Apparently the Oculus Quest doesn't like geometry shaders, which I confirmed with a different project using geometry shaders. I believe this was the reason for the OPENGL NATIVE PLUG-IN ERROR: GL_INVALID_OPERATION: Operation illegal in current state error.
Switching the shader of this material to the simple Custom/PointCloud shader worked like a charm! I had the joy of walking through a real-time point cloud of my room!
Also textures (Depth/Color) work perfectly.
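For anyone wanting to script the same fix, a sketch of swapping the material's shader at runtime (assumes the sample's Custom/PointCloud shader is referenced somewhere in the build so Shader.Find can locate it):

```csharp
// Sketch: replace the geometry shader on the point-cloud material with
// the simple vertex/fragment "Custom/PointCloud" shader at runtime.
// Attach to the object that carries the PointCloudRenderer's material.
using UnityEngine;

public class PointCloudShaderFix : MonoBehaviour
{
    void Start()
    {
        var rend = GetComponent<Renderer>();
        var simple = Shader.Find("Custom/PointCloud");
        if (rend != null && simple != null)
            rend.material.shader = simple;
        else
            Debug.LogWarning("PointCloudShaderFix: renderer or shader not found.");
    }
}
```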
Thanks for all your help @ogoshen
@GeorgeAdamon Congratulations! I will make a post on Intel Support's forum linking to this discussion so that others can benefit from this knowledge. :)
@GeorgeAdamon I could see the RealSense's point cloud from the Oculus Quest thanks to your post! Thank you!
I just opened a repo, documenting the process, I will upload a sample project in the following days! https://github.com/GeorgeAdamon/quest-realsense
@GeorgeAdamon Thanks so much for the effort you have gone to in sharing the experience with other community members in an easy to follow format!
For anyone who comes here from Google when searching "Oculus Quest Geometry Shader", I can say that Geometry shaders will work on the Quest, but only in Multi-Pass stereo.
In single-pass stereo you just get the classic GL_INVALID_OPERATION every frame.
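If you do need geometry shaders, the stereo mode can also be forced from an editor script (the setting lives under Player Settings as well); a sketch, menu path made up:

```csharp
// Editor-only sketch: force multi-pass stereo rendering so geometry
// shaders work on the Quest. Single-pass stereo triggers the
// GL_INVALID_OPERATION error every frame.
using UnityEditor;
using UnityEngine;

public static class StereoModeFix
{
    [MenuItem("Tools/Use Multi-Pass Stereo")]
    static void UseMultiPass()
    {
        PlayerSettings.stereoRenderingPath = StereoRenderingPath.MultiPass;
    }
}
```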
I just wanted to add that I came across a 3D printing blueprint for a 400 Series mounter for Oculus Quest.
@GeorgeAdamon @MartyG-RealSense Hi Guys
I know this thread is getting a bit dated, but I am trying to get my D435i to run on my Quest (first gen) and am having a heck of a time. Does anyone have information on which JDK/NDK/Android SDK/Gradle versions I should have for building the .aar from Android Studio? Firstly, I don't see a task for assembleRelease inside android/librealsense/Tasks/build. I can run it by typing it via Execute Gradle Task, but it fails. This error seems to be a common theme:
AAPT: error: style attribute 'android:attr/dialogCornerRadius' not found.
The source of the error seems to come from my .gradle folder, so I've tried many different Gradle versions and the build keeps failing. I've also tried running some of the other tasks like buildDependents, buildNeeded, clean, and cleanBuildCache; they all fail too.
There was a point in time a couple of weeks ago when I was able to build the .aar file, but when I put it into my Unity project as instructed, and added the permissions script to a game object, I couldn't get the RealSense WebCamTexture to show up on a plane object in my scene while running stand-alone on the Quest. The Quest asked for permission once and I haven't seen the request since, even though I later altered the permission script to request it even if it has already been granted. After making tons of adjustments to various NDK/SDK/JDK/Gradle versions and still not getting results, I decided to restart the process in @GeorgeAdamon's documentation. However, like I mentioned, I can't produce the .aar file anymore and I have no idea why.
Getting this camera streaming into my unity application is huge milestone for my project. I would really appreciate some help getting this sorted out! I have essentially zero experience with android, all of my knowledge is in Unity 3D/C#. Let me know what information you need for troubleshooting. Thanks!
Hi @Kukewilly Another RealSense Android user also had problems with a missing assembleRelease task today in https://github.com/IntelRealSense/librealsense/issues/9978
A guide that I wrote about installing the Android wrapper advises in https://github.com/IntelRealSense/librealsense/issues/9753#issuecomment-919903084 about going to the File > Settings menu option of Android Studio and selecting the Experimental category, then unticking the setting 'Do not build Gradle task list during Gradle sync' and clicking the Apply and OK buttons. Have you tried this action, please?
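If the task still doesn't appear in the Gradle panel, it can usually be invoked directly from a terminal in the wrapper's android folder; a sketch (the :librealsense module name is an assumption based on the repo layout):

```
./gradlew :librealsense:assembleRelease
```

The resulting .aar should then land under the module's build/outputs/aar/ directory, which is the standard output location for Android library modules.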
Hey @MartyG-RealSense thanks for the quick response! I do have the option unticked, but still no assembleRelease task is present. I am able to run the task as described in the thread you linked using "Execute Gradle Task".
I wiped everything I could of Realsense and Android Studio off my pc, then redownloaded/installed. I was able to build the AAR file again! :D
I placed the librealsense-release.aar file into the RealSense SDK plug-in folder as per https://github.com/GeorgeAdamon/quest-realsense then placed an android permission object in the scene with the script provided to request permission and initialize rsContext. I then uninstalled the previous .apk from my Quest, built the new .apk in unity and reloaded it into my Quest. When I booted my quest with the realsense connected I got the permission request for both the realsense, and for the app to take photos and video!
However, the 3D plane is still not displaying the color feed from the RealSense. It displays it when the RealSense is connected to my PC. I am starting to wonder if my Unity scripts are not referencing the WebCamTexture of the camera correctly. I'm using OpenCV, and here is the script responsible for finding the camera device and displaying it on the plane GameObject:
EDIT: The code went in sloppy. The script is WebCamTextureToMatHelper.cs in OpenCVForUnity; I will follow up with the script when I find a better way to link it.
Perhaps this isn't in your wheelhouse anymore but any help is appreciated. Thanks!
Luke
Here is how OpenCV for Unity locates the device and assigns the webcam texture. Do you see anything that the RealSense/Android won't jive with?
```csharp
// Creates the camera
var devices = WebCamTexture.devices;
if (!String.IsNullOrEmpty(requestedDeviceName))
{
    int requestedDeviceIndex = -1;
    if (Int32.TryParse(requestedDeviceName, out requestedDeviceIndex))
    {
        if (requestedDeviceIndex >= 0 && requestedDeviceIndex < devices.Length)
        {
            webCamDevice = devices[requestedDeviceIndex];

            if (avoidAndroidFrontCameraLowLightIssue && webCamDevice.isFrontFacing == true)
                requestedFPS = 15f;

            if (requestedFPS < 0)
            {
                webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight);
            }
            else
            {
                webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight, (int)requestedFPS);
            }
        }
    }
    else
    {
        for (int cameraIndex = 0; cameraIndex < devices.Length; cameraIndex++)
        {
            if (devices[cameraIndex].name == requestedDeviceName)
            {
                webCamDevice = devices[cameraIndex];

                if (avoidAndroidFrontCameraLowLightIssue && webCamDevice.isFrontFacing == true)
                    requestedFPS = 15f;

                if (requestedFPS < 0)
                {
                    webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight);
                }
                else
                {
                    webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight, (int)requestedFPS);
                }
                break;
            }
        }
    }

    if (webCamTexture == null)
        Debug.Log("Cannot find camera device " + requestedDeviceName + ".");
}

if (webCamTexture == null)
{
    // Checks how many and which cameras are available on the device
    for (int cameraIndex = 0; cameraIndex < devices.Length; cameraIndex++)
    {
        if (devices[cameraIndex].kind != WebCamKind.ColorAndDepth && devices[cameraIndex].isFrontFacing == requestedIsFrontFacing)
        {
            webCamDevice = devices[cameraIndex];

            if (avoidAndroidFrontCameraLowLightIssue && webCamDevice.isFrontFacing == true)
                requestedFPS = 15f;

            if (requestedFPS < 0)
            {
                webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight);
            }
            else
            {
                webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight, (int)requestedFPS);
            }
            break;
        }
    }
}

if (webCamTexture == null)
{
    if (devices.Length > 0)
    {
        // Fall back to the first available camera.
        webCamDevice = devices[0];

        if (avoidAndroidFrontCameraLowLightIssue && webCamDevice.isFrontFacing == true)
            requestedFPS = 15f;

        if (requestedFPS < 0)
        {
            webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight);
        }
        else
        {
            webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight, (int)requestedFPS);
        }
    }
    else
    {
        isInitWaiting = false;
        initCoroutine = null;

        if (onErrorOccurred != null)
            onErrorOccurred.Invoke(ErrorCode.CAMERA_DEVICE_NOT_EXIST);

        yield break;
    }
}
```
Hi @Kukewilly In the example code in Unity's WebCamTexture documentation in the link below, they put the type name WebCamTexture in front of the custom webcam texture variable name.
https://docs.unity3d.com/ScriptReference/WebCamTexture-deviceName.html
So I wonder whether you need to add WebCamTexture onto the start of the relevant lines so that Unity knows that webCamTexture is of the component type WebCamTexture. For example:
WebCamTexture webCamTexture = new WebCamTexture(webCamDevice.name, requestedWidth, requestedHeight);
@MartyG-RealSense Thanks for the reply. Higher up in the same script, webCamTexture is declared as a WebCamTexture, and it runs great from my PC. Sorry, I didn't paste it all because it came in messy.
I'm reading a lot about external USB cameras on Android being more complex than calling the native camera on a device. I'm hoping to hear from @GeorgeAdamon on how he actually accessed the camera feed in Unity with C#, because it appears I've reproduced everything in his instructions correctly up until step 4. But I'm under the impression that step doesn't apply to me, because all I need is the WebCamTexture from the RealSense so I can use it for OpenCV. I'm not actually trying to run any of the RealSense scenes or view the point cloud/depth.
I also came across this asset (https://assetstore.unity.com/packages/tools/integration/usb-camera-for-unity-android-151744#description) which claims to be able to solve my problem, but his test .apk doesn't launch on my Quest. Emailed him and am waiting to hear back.
If you can think of anything else worth trying let me know :)
Thanks
I researched the Android dialogCornerRadius error and found suggestions about resolving it in the link below.
I got wireless debugging set up with logcat and these are realsense related messages I'm getting when I plug in the realsense and accept permissions:
11-23 13:54:09.158: D/UsbHostManager(1038): Added device UsbDevice[mName=/dev/bus/usb/001/002,mVendorId=32902,mProductId=2874,mClass=239,mSubclass=2,mProtocol=1,mManufacturerName=Intel(R) RealSense(TM) Depth Camera 435i,mProductName=Intel(R) RealSense(TM) Depth Camera 435i,mVersion=50.1215,mSerialNumberReader=com.android.server.usb.UsbSerialReader@9c3723e,mConfigurations=[
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=0,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Depth,mClass=14,mSubclass=1,mProtocol=0,mEndpoints=[
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=1,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Depth,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=2,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Y,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=3,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i RGB,mClass=14,mSubclass=1,mProtocol=0,mEndpoints=[]
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=4,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i RGB,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.158: D/UsbHostManager(1038): UsbInterface[mId=5,mAlternateSetting=0,mName=Intel(R) RealSense(TM) HID,mClass=3,mSubclass=0,mProtocol=0,mEndpoints=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): device: UsbDevice[mName=/dev/bus/usb/001/002,mVendorId=32902,mProductId=2874,mClass=239,mSubclass=2,mProtocol=1,mManufacturerName=Intel(R) RealSense(TM) Depth Camera 435i,mProductName=Intel(R) RealSense(TM) Depth Camera 435i,mVersion=50.1215,mSerialNumberReader=android.hardware.usb.IUsbSerialReader$Stub$Proxy@f1fcf4d,mConfigurations=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=0,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Depth,mClass=14,mSubclass=1,mProtocol=0,mEndpoints=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=1,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Depth,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=2,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i Y,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=3,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i RGB,mClass=14,mSubclass=1,mProtocol=0,mEndpoints=[]
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=4,mAlternateSetting=0,mName=Intel(R) RealSense(TM) Depth Camera 435i RGB,mClass=14,mSubclass=2,mProtocol=0,mEndpoints=[
11-23 13:54:09.193: I/librs UsbUtilities(10882): UsbInterface[mId=5,mAlternateSetting=0,mName=Intel(R) RealSense(TM) HID,mClass=3,mSubclass=0,mProtocol=0,mEndpoints=[
11-23 13:54:09.208: I/librs MessagesHandler(10882): handleMessage: realsense device attached
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.DeviceWatcher.addDevice(DeviceWatcher.java:115)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.DeviceWatcher.invalidateDevices(DeviceWatcher.java:78)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.DeviceWatcher.access$000(DeviceWatcher.java:16)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.DeviceWatcher$1.onDeviceAttach(DeviceWatcher.java:44)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.Enumerator.notifyOnAttach(Enumerator.java:102)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.Enumerator.access$200(Enumerator.java:18)
11-23 13:54:09.213: E/UsbManager(10882): at com.intel.realsense.librealsense.Enumerator$MessagesHandler.handleMessage(Enumerator.java:144)
11-23 13:54:11.169: I/ActivityTaskManager(1038): START u0 {act=android.hardware.usb.action.USB_DEVICE_ATTACHED flg=0x10000000 cmp=com.DefaultCompany.MetaplexTest/com.intel.realsense.librealsense.DeviceWatcherActivity (has extras)} from uid 10074
11-23 13:54:11.505: I/librs MessagesHandler(10882): handleMessage: realsense device attached
These are my Debug.Log messages from Unity, as seen in logcat:
11-23 13:59:40.318: I/Unity(13133): Cannot find camera device Intel(R) RealSense(TM) Depth Camera 435i RGB.
11-23 13:59:40.329: I/Unity(13133): OnWebCamTextureToMatHelperErrorOccurred CAMERA_DEVICE_NOT_EXIST
It seems unable to find the device by name. I also tried changing the device number manually to index 3, because it looked like the RGB camera's mId is 3, but it didn't work either. Are you able to make anything of this?
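One quick check worth doing (a hypothetical diagnostic, not from the wrapper) is to dump everything Unity itself enumerates, so logcat shows exactly which names WebCamTexture.devices contains on the Quest:

```csharp
// Hypothetical diagnostic: log every camera Unity can enumerate.
// If this prints zero devices on the Quest while logcat shows the
// D435i attaching, the camera is visible to Android but not to
// Unity's WebCamTexture layer.
using UnityEngine;

public class CameraLister : MonoBehaviour
{
    void Start()
    {
        var devices = WebCamTexture.devices;
        Debug.Log("WebCamTexture devices found: " + devices.Length);
        foreach (var d in devices)
            Debug.Log("Camera: " + d.name + " frontFacing=" + d.isFrontFacing);
    }
}
```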
@Kukewilly I thoroughly researched use of OpenCV For Unity but did not find many helpful leads except for an old RealSense Unity wrapper project created by a RealSense user in the link below that made use of OpenCV For Unity.
In the using statements at the top of this script, it imports:
```csharp
using Intel.RealSense;
using OpenCVForUnity;
```
@MartyG-RealSense I know it's pretty sparse out there in terms of examples. I really appreciate your help either way. I imported both of those at the tops of my permission script and the script that determines the webcam device I'm using; the WebCamDevice list is still showing up as empty in Unity's logcat output.
The RealSense and its scenes work great when the camera is plugged into my PC. But I confirmed with Debug.Log(devices.Length) that the device list is empty when running the RealSense from the Oculus USB port. So for some reason, although Android is able to see the RealSense and its components when I plug it in (as per the debug messages above from logcat), the RealSense is not being added to the WebCamTexture.devices list in Unity.
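One possible explanation: on Android, WebCamTexture only lists cameras exposed through the Android camera service, and external UVC devices like the D435i generally aren't. If that is what's happening here, a workaround is to skip WebCamTexture entirely and copy the Texture2D that the RealSense wrapper's own RsStreamTextureRenderer produces into an OpenCV Mat. A sketch (namespaces follow recent OpenCVForUnity versions and may differ in older ones; "rsColorTexture" is an assumed binding):

```csharp
// Sketch: bridge a RealSense-produced Texture2D into OpenCV for Unity
// instead of relying on WebCamTexture. Assumes rsColorTexture is bound
// in the Inspector to the color Texture2D that the wrapper's
// RsStreamTextureRenderer updates each frame.
using OpenCVForUnity.CoreModule;
using OpenCVForUnity.UnityUtils;
using UnityEngine;

public class RsTextureToMat : MonoBehaviour
{
    public Texture2D rsColorTexture;
    Mat rgbaMat;

    void Update()
    {
        if (rsColorTexture == null) return;
        if (rgbaMat == null)
            rgbaMat = new Mat(rsColorTexture.height, rsColorTexture.width, CvType.CV_8UC4);

        // texture2DToMat reads the texture's pixels into the Mat (RGBA).
        Utils.texture2DToMat(rsColorTexture, rgbaMat);

        // ... run OpenCV processing on rgbaMat here ...
    }
}
```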
I've read something about customizing a manifest, unfortunately I don't have the experience to know what that means or what to do with it but I will keep researching.
@Kukewilly Unity's official documentation about the Android manifest is here:
https://docs.unity3d.com/Manual/android-manifest.html
An example of a RealSense Android wrapper xml manifest - for the Camera tool - is here:
The 'Camera' tool is the source-code equivalent of the "RS Camera" RealSense app on the Google Play store.
I don't know if I'm in the right place, but I have a problem. I need to stream a good-quality image from two webcams to the Oculus Quest Pro. For the cameras, I developed a Python application that joins the two images one after the other.
For the VR part in Unity, I placed a plane, assigned the captured images to it as a material, and made the plane a child of the XR Origin's Main Camera.
The problem is that I don't know how close or how far the plane has to be to generate the 3D effect, and I can't assign one image to one eye and the other image to the other eye. Does anyone know if this is possible?
This is absolutely not the right place for it haha.
But I'll tell you anyway - the easiest way would be to create a shader that uses single-pass stereo rendering to show one texture when viewed from the right-eye camera and another texture when viewed from the left-eye camera. It's been a few years since I've needed to mess around with stereo shaders so I can't remember the specifics, but there's a Unity-provided built-in shader variable, unity_StereoEyeIndex. Some Googling or AI dialogue should get you there :)
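A rough sketch of such a shader, under the assumption that the two camera images are supplied as separate textures (the property names here are made up):

```
// Sketch: unlit shader that samples a different texture per eye using
// Unity's built-in unity_StereoEyeIndex (0 = left eye, 1 = right eye).
Shader "Unlit/PerEyeTexture"
{
    Properties
    {
        _LeftTex ("Left Eye", 2D) = "white" {}
        _RightTex ("Right Eye", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _LeftTex;
            sampler2D _RightTex;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
                UNITY_VERTEX_INPUT_INSTANCE_ID
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                UNITY_VERTEX_OUTPUT_STEREO
            };

            v2f vert(appdata v)
            {
                v2f o;
                UNITY_SETUP_INSTANCE_ID(v);
                UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
                return unity_StereoEyeIndex == 0
                    ? tex2D(_LeftTex, i.uv)
                    : tex2D(_RightTex, i.uv);
            }
            ENDCG
        }
    }
}
```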
Issue Description
Hello everybody,
I'm designing a VR performance, in which the user would interact with a real-time point cloud of the space around them. I am using an Intel RealSense D435i depth camera and an Oculus Quest, and I'm developing in Unity.
My question is: Is it technically possible for the RealSense camera to communicate with the Quest, through the USB-C port ? If not, is it because of a hardware limitation, missing Android drivers, limited Android support or something else?
So far, the example Unity project builds without errors, and when I connect the RealSense to the Quest I get the familiar sound of the Quest recognizing that something connected to it. But when the actual app runs, it does not show any point cloud at all, and the depth camera is clearly inactive (no flashing laser, no IR illumination in the room when viewed in Passthrough mode).
Here are some interesting lines from the adb logcat output of the Quest: Lines 6-8:
and later on (Lines 23-28):
and finally (Lines 32-41):
Keep in mind that I have successfully built the native Android library (librealsense.aar) as described here and placed it in the Plugins folder, alongside Intel.RealSense.dll and realsense2.dll.
This is new ground, so any help would be greatly appreciated!
Thank you!