-
Hello, I've tried a single scene, and when I start it on my HoloLens the cube is positioned 5-6 cm above and 5-6 cm closer to me than the marker. I've checked the marker size and the settings in the app, and it's…
-
If we're going to provide a simple video-mixed AR set of capabilities soon, how are we going to let programmers choose which camera to use on a multi-camera device?
For example, if I want to make a "try o…
-
First of all, thank you very much for your work; the results are very good.
In STEREO mode, multiple maps can merge when loops are detected.
But in stereo+IMU mode, multiple maps cannot merge when l…
-
### SPECIFIC ISSUE ENCOUNTERED
I am making a VR application. The app is drifting too much during play. My app will be used while cycling, and during cycling the head of the user is cont…
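Without knowing which SDK the app is built on, a generic way to bound orientation drift is to fuse gyroscope integration with an absolute reference (the accelerometer's gravity direction) through a complementary filter. The sketch below is a minimal 1-axis illustration of that idea, not the headset's actual tracking code; the `alpha` value, sample rate, and gyro bias are all invented for the example:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step of a 1-axis complementary filter.

    pitch       -- current pitch estimate (radians)
    gyro_rate   -- angular velocity from the gyroscope (rad/s)
    accel_pitch -- pitch derived from the accelerometer's gravity vector
    alpha       -- trust in the gyro; (1 - alpha) pulls the estimate
                   toward the accelerometer and bounds long-term drift
    """
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# A stationary head with a constant gyro bias of 0.01 rad/s: pure
# integration would drift by 0.01 * 0.01 * 2000 = 0.2 rad here, but the
# filter converges to a small bounded offset instead.
pitch = 0.0
for _ in range(2000):
    pitch = complementary_filter(pitch, gyro_rate=0.01, accel_pitch=0.0, dt=0.01)
print(round(pitch, 4))  # → 0.0049, not 0.2
```

The fixed point is `alpha * bias * dt / (1 - alpha)`, so lowering `alpha` trades gyro smoothness for a tighter drift bound.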
-
### Describe the bug?
When opening Device -> Trackers, the Mapped Rotation float3 input is not visible.
Please see Additional Context for possible factors.
### To Reproduce
I …
-
Hello,
I'm using the linuxtrack interface (linuxtrack_hello_word.c) and trying to create a
QWidget (with Qt Creator) where I put the camera view while tracking, but I can't
do that because linuxtrack_…
-
Hi @dihuangdh,
I am trying to run the demo you provided.
When I run `sh apps/3_background_matting.sh`, I get an "easymocap" missing error.
More information below:
python tools/preprocess_mask.py …
-
Hello,
I created an OpenVR driver and it works fine. I have tried adding a device in the driver, TrackedDeviceClass_GenericTracker; this tracker device receives 3rd-party positioning data via UDP, …
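For reference, a sketch of the parsing side of such a UDP feed. The wire format below (a uint32 device id followed by position x/y/z and quaternion w/x/y/z as little-endian doubles) is a made-up example, since the post doesn't specify what the 3rd-party sender actually transmits:

```python
import struct

# Hypothetical wire format for one pose sample; the real layout is
# whatever the 3rd-party sender defines.
POSE_FMT = "<I7d"          # uint32 id + 7 doubles, little-endian
POSE_SIZE = struct.calcsize(POSE_FMT)  # 60 bytes

def unpack_pose(payload: bytes):
    """Decode one fixed-size pose packet into (id, position, quaternion)."""
    dev_id, px, py, pz, qw, qx, qy, qz = struct.unpack(POSE_FMT, payload)
    return dev_id, (px, py, pz), (qw, qx, qy, qz)

# Round-trip a sample packet (in the driver this payload would come
# from sock.recvfrom(POSE_SIZE)).
packet = struct.pack(POSE_FMT, 3, 0.1, 1.5, -0.2, 1.0, 0.0, 0.0, 0.0)
dev_id, pos, quat = unpack_pose(packet)
print(dev_id, pos)  # → 3 (0.1, 1.5, -0.2)
```

The decoded pose would then be copied into the fields of the driver's pose struct on each update.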
-
### Version
Release 2.10
### Hardware
1. Jetson Orin Nano devkit with JetPack 5.1.3
2. Stereolabs ZED-M camera
### Issue
1. Setting 'enable_imu_fusion' to True and running it shows the following error…
-
### Describe what you want to implement and what the issue & the steps to reproduce it are:
Hi,
I'm using a blaze ToF camera and pypylon to process point clouds, but the resulting point clouds see…
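When point clouds look wrong, one sanity check is to back-project a few depth pixels by hand with the pinhole model and compare against the SDK's output. A minimal sketch; the intrinsics below are placeholders for the example, not the blaze's actual calibration:

```python
def deproject(u, v, z, fx, fy, cx, cy):
    """Map a depth pixel (u, v) with depth z to a 3D camera-space point
    using the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the principal point always maps straight down the optical
# axis, whatever the focal length is.
print(deproject(320.0, 240.0, 1.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0))
# → (0.0, 0.0, 1.0)
```

If hand-computed points disagree with the SDK's cloud, the intrinsics or the depth-unit scaling (e.g. millimeters vs. meters) are the usual suspects.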