This Unity demo uses echo3D's 3D model streaming in combination with ManoMotion, a framework for hand tracking and gesture recognition in AR. Any number of models can be uploaded to the echo3D console and streamed into the app. You can tap on any detected horizontal plane to move the active model to that location, and use the button at the top of the screen to switch to the next model. The button can be tapped directly on the screen, or triggered by placing your hand in front of the camera and making a "click" gesture over the button, as shown below.
You can also add models to the echo3D console, either by searching the built-in library or uploading your own; they will appear as you cycle through the models in the app. Just make sure the number of models does not exceed the value of the maxModels variable on the Place on Plane script attached to the AR Session Origin.
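The model-cycling behavior described above might look something like the sketch below. This is not the actual Place on Plane implementation; only the maxModels variable name comes from the demo, and everything else (the class members, the NextModel method) is illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch of model cycling for a Place on Plane-style script.
// Only the maxModels field name is taken from the demo; other names are assumptions.
public class PlaceOnPlane : MonoBehaviour
{
    public int maxModels = 3;     // should be at least the number of models in your echo3D console
    private GameObject[] models;  // assumed to be populated elsewhere from echo3D streaming
    private int activeIndex = 0;

    // Could be wired to the on-screen button (and the ManoMotion "click" gesture).
    public void NextModel()
    {
        models[activeIndex].SetActive(false);
        activeIndex = (activeIndex + 1) % Mathf.Min(maxModels, models.Length);
        models[activeIndex].SetActive(true);
    }
}
```

If maxModels is smaller than the number of streamed models, the extra models would never become active, which is why the two values should be kept in sync.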
Build and run the AR application.
Make sure to pick the Scenes/Main scene under Scenes In Build.
Refer to our documentation to learn more about how to use Unity and echo3D.
Check out the ManoMotion SDK Community Edition Tutorial for ARFoundation.
If you want more demos of ManoMotion's technology, there are additional examples created by ManoMotion in the Manomotion/Examples folder. Although these do not incorporate echo3D, they showcase much more of the SDK's functionality. See ManoMotion's website for documentation and to get your own API key.
Feel free to reach out at support@echo3D.co or join our support channel on Slack.
Demo created by Caleb Biddulph.