-
Hi,
I have downloaded the ARFoundation samples to test the collaborative session with Unity and ARKit 3.
I have the latest macOS Catalina/Xcode 11 and an iPad Pro on iOS 13. The build is successful, but I can't get…
-
Hello @tdmowrer, @jimmy-jam,
I am trying to use _WorldTracking_ and _FaceTracking_ at the same time with the rear camera feed from ARKit 3 via ARFoundation.
The goal is to get eye pose (relative to f…
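For reference, this is roughly the setup I'm attempting; a minimal sketch assuming ARFoundation 4's `ARCameraManager.requestedFacingDirection`, with component and field names of my own:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: request the world-facing (rear) camera while an ARFaceManager is active,
// which I assume ARKit 3 maps to ARWorldTrackingConfiguration with userFaceTrackingEnabled.
public class WorldPlusFaceSetup : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;
    [SerializeField] ARFaceManager faceManager;

    void Start()
    {
        cameraManager.requestedFacingDirection = CameraFacingDirection.World;
        faceManager.enabled = true; // face anchors should still arrive from the TrueDepth camera
    }
}
```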
-
I would like to ask whether ARFoundation 4 already supports ARKit 3.5, which takes advantage of the LiDAR sensor in the 2020 iPad Pro. I didn't find any information in the documentation. Please…
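If it is supported, I would expect it to show up as scene meshing; below is a minimal sketch of what I imagine hooking up, assuming an `ARMeshManager` with a `meshesChanged` event (the component name is mine):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: listen for the meshes produced by ARKit 3.5 scene reconstruction
// on LiDAR devices, assuming ARFoundation 4 exposes them via ARMeshManager.
[RequireComponent(typeof(ARMeshManager))]
public class MeshLogger : MonoBehaviour
{
    ARMeshManager meshManager;

    void OnEnable()
    {
        meshManager = GetComponent<ARMeshManager>();
        meshManager.meshesChanged += OnMeshesChanged;
    }

    void OnDisable()
    {
        meshManager.meshesChanged -= OnMeshesChanged;
    }

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // Each entry is a MeshFilter instantiated from the manager's meshPrefab.
        Debug.Log($"Meshes added: {args.added.Count}, updated: {args.updated.Count}");
    }
}
```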
-
Is it possible to track multiple bodies/skeletons on ARKit 3?
The HumanBodyTracking2D and 3D samples work great, but they only track one human at a time (if there is more than one, it just swaps …
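For reference, here is the small debug helper I use to enumerate bodies, assuming `ARHumanBodyManager` exposes its bodies through `trackables` (the component itself is mine):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Debug helper: log whatever bodies the manager currently reports.
// In my tests only a single body ever shows up here at a time.
[RequireComponent(typeof(ARHumanBodyManager))]
public class BodyCountLogger : MonoBehaviour
{
    ARHumanBodyManager bodyManager;

    void Awake() => bodyManager = GetComponent<ARHumanBodyManager>();

    void Update()
    {
        Debug.Log($"Tracked bodies this frame: {bodyManager.trackables.count}");
        foreach (ARHumanBody body in bodyManager.trackables)
            Debug.Log($"Body {body.trackableId} at {body.pose.position}");
    }
}
```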
-
It looks like ARKit natively lets you attach augmentations to objects its ML model classifies as tables, etc. Beyond just plane detection in Unity, is it possible in ARFoundation's ARKit 3 wrapper to also poll for anc…
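To clarify what I mean, here is the kind of per-plane check I'm hoping for; a minimal sketch assuming the `ARPlane.classification` property exposed by ARFoundation (the component name is mine):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: log any newly detected planes that ARKit has classified as tables.
[RequireComponent(typeof(ARPlaneManager))]
public class TablePlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            // ARKit's ML-based classification is surfaced per plane.
            if (plane.classification == PlaneClassification.Table)
                Debug.Log($"Table plane detected: {plane.trackableId}");
        }
    }
}
```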
-
I am trying to associate `ARSCNView` with the rendering of `GVRSCNARTrackingRenderer`, and when I specify `let renderer = GVRSCNARTrackingRenderer(scene: arView.scene)`, I get no result. In additio…
-
Hi, while trying to build the HumanBodyTracking3D sample scene, I got an error: `nw_connection_receive_internal_block_invoke [C6] Receive reply failed with error "Operation canceled"`.
Nothing happens …
-
It seems that this API changed in iOS 13 but should still be available (e.g., Reality Composer uses recording for Xcode debugging).
Is there still a way of accessing it?
-
The build from the generated Xcode project crashes immediately and pulls up this page in Xcode:
![image](https://user-images.githubusercontent.com/9631530/59133656-534c5580-893e-11e9-8b70-5d4572d4f…