Open aferriss opened 6 years ago
One note to add: I can't remember where off the top of my head, but I read somewhere that if the scanning functionality is turned on, some ARKit features are automatically turned off. This might be a good opportunity to test out some kind of session re-loading as well.
Yep! According to Apple:
ARObjectScanningConfiguration is for use only in development scenarios. High-fidelity spatial mapping has a high performance and energy cost, and disables ARKit features not necessary for reference object scanning.
I'm curious exactly what those features are.
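On the session re-loading idea, here's a rough sketch of what switching between a scanning session and a normal tracking session might look like. This is an assumption about how we'd structure it, not tested code; the function names are hypothetical, but `ARObjectScanningConfiguration`, `ARWorldTrackingConfiguration`, and the `run(_:options:)` reset options are real API.

```swift
import ARKit

// Hypothetical sketch: start a scanning session, then "reload" the session
// with a normal world-tracking configuration once scanning is done.
func startScanning(in session: ARSession) {
    let config = ARObjectScanningConfiguration()
    config.planeDetection = .horizontal
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}

func switchToTracking(in session: ARSession, detecting objects: Set<ARReferenceObject>) {
    let config = ARWorldTrackingConfiguration()
    config.detectionObjects = objects
    // Resetting tracking discards the scanning session's state, so this is
    // effectively a full session reload.
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Comparing which `ARFrame` data is still populated before and after the switch might be one way to find out which features get disabled.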
Implement functions to detect and track an object based on a previously made reference scan.
First, the reference object must be loaded as an ARReferenceObject. Supposedly you can pass a number of these to the session configuration's detectionObjects property. The reference objects need to be added to your asset catalog before they can be loaded. There are two loader functions: one loads from a URL, and one loads all objects in the app bundle.
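The loading step above might look something like this. The group name "ScannedObjects" and the `sceneView` outlet are assumptions for the sketch; `ARReferenceObject.referenceObjects(inGroupNamed:bundle:)`, `ARReferenceObject(archiveURL:)`, and `detectionObjects` are the real API names.

```swift
import ARKit

// Load every reference object from an asset catalog group.
// ("ScannedObjects" is a placeholder — use your catalog's group name.)
guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "ScannedObjects", bundle: Bundle.main) else {
    fatalError("Missing reference objects in asset catalog")
}

// Alternatively, load a single .arobject file from a URL:
// let obj = try ARReferenceObject(archiveURL: fileURL)

let configuration = ARWorldTrackingConfiguration()
configuration.detectionObjects = referenceObjects
sceneView.session.run(configuration)  // sceneView: an ARSCNView, assumed
```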
If ARKit recognizes an object in the scene, it adds a new type of anchor, an ARObjectAnchor. These anchors have their own transform and other properties you can inspect.
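Inspecting a detected anchor would presumably happen in the delegate callback, along these lines (assuming the class is the scene view's ARSCNViewDelegate):

```swift
import ARKit

// Sketch: when ARKit detects a scanned object, it calls this with an
// ARObjectAnchor; inspect its transform and the matched reference object.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let objectAnchor = anchor as? ARObjectAnchor else { return }
    let referenceObject = objectAnchor.referenceObject
    print("Detected \(referenceObject.name ?? "unnamed object")")
    print("Transform: \(objectAnchor.transform)")   // position/orientation in world space
    print("Extent: \(referenceObject.extent)")      // bounding-box size in meters
}
```

Anything added as a child of the `node` parameter here will track the detected object, which covers the "drawing something at its transform" example below.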
It's a little unclear to me how to display the 3D scanned object, if you even can. I did see that ARReferenceObject has a rawFeaturePoints property, so you can at least get at the point cloud data.
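One possible way to at least visualize the scan: build a node out of the feature points. This is an untested sketch; `rawFeaturePoints` is an `ARPointCloud` whose `points` array holds `simd_float3` positions relative to the reference object's own origin.

```swift
import ARKit

// Sketch: visualize a reference object's point cloud as tiny spheres,
// one per feature point, ready to attach to the detected anchor's node.
func pointCloudNode(for referenceObject: ARReferenceObject) -> SCNNode {
    let parent = SCNNode()
    let sphere = SCNSphere(radius: 0.001) // 1 mm per point
    sphere.firstMaterial?.diffuse.contents = UIColor.yellow
    for point in referenceObject.rawFeaturePoints.points {
        let node = SCNNode(geometry: sphere)
        node.simdPosition = point // relative to the object's origin
        parent.addChildNode(node)
    }
    return parent
}
```

One sphere per point is wasteful for large clouds; a single custom SCNGeometry with a point-type element would scale better, but the above is the simplest thing that could work.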
I think it would also be really good if we could provide an example of loading a reference object file into an app and drawing something at its transform. It'd be great if we could create a scan of something common that lots of people might have lying around (a banana? a can of soda?).