Closed Simon1059770342 closed 2 years ago
Is it possible to conduct finger pose detection in an immersive-ar sample (for example, hit-testing)?
Yes, HoloLens supports AR, hand tracking, and also hit testing, so you should be able to hit the real world with your fingers.
Or how can I get the videoStream in immersive-ar sample?
If the platform supports a camera, it should be the same as usual. I believe that's how 8th Wall works.
Yes, HoloLens supports AR, hand tracking, and also hit testing, so you should be able to hit the real world with your fingers.
Thanks! What I really want is to detect finger pose in AR on an Android device, but I found there is no solution or sample for that.
And I tried getFrame(), but the result cannot be fed into model.estimateHands() as a video resource.
The code is as follows:
const predictions = await model.estimateHands(video)
If the platform supports a camera, it should be the same as usual. I believe that's how 8th Wall works.
Does this mean the immersive-ar sample runs on 8th Wall? For my part, I reckon the two belong to different platforms.
FYI, my purpose is to make a virtual object in an AR scene change its behaviour in response to my recognized finger pose, so this could be slightly different from the hand input in the WebXR Hand Input docs.
WebXR hand input vs. finger-pose detection input
To the best of my knowledge:
Hand input comes from a controller, such as an HTC Vive handheld device, or from the phone screen.
Finger-pose detection input comes from camera frames.
You don't use the camera to get the finger position in WebXR.
You get the position of the joints from .hand on the WebXR input sources, then use that information to work out what pose the user is making.
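The joint-reading flow described above can be sketched roughly as follows. This is an illustrative sketch, not code from the thread: `isPinching`, `distance`, and the 0.03 m threshold are my own assumptions, and `refSpace` is assumed to come from `session.requestReferenceSpace()`; `source.hand.get('thumb-tip')` and `frame.getJointPose()` are the actual WebXR Hand Input API.

```javascript
// Hypothetical pure helpers: distance between two 3-D points and a
// simple pinch test. The 0.03 m threshold is an assumption.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function isPinching(thumbTip, indexTip, threshold = 0.03) {
  return distance(thumbTip, indexTip) < threshold;
}

// Browser-only frame loop. Assumes the session was requested with the
// 'hand-tracking' feature and refSpace came from
// session.requestReferenceSpace('local').
function onXRFrame(time, frame, refSpace) {
  const session = frame.session;
  for (const source of session.inputSources) {
    if (!source.hand) continue; // this input source has no hand tracking
    const thumb = frame.getJointPose(source.hand.get('thumb-tip'), refSpace);
    const index = frame.getJointPose(source.hand.get('index-finger-tip'), refSpace);
    if (thumb && index &&
        isPinching(thumb.transform.position, index.transform.position)) {
      // the user is pinching: update the virtual object here
    }
  }
  session.requestAnimationFrame((t, f) => onXRFrame(t, f, refSpace));
}
```

The pure helpers can be reused with any joint pair; a library like handy-work generalizes this idea to matching whole recorded poses rather than a single distance check.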
Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work
Also, this isn't the best place to get help with WebXR; it's for discussion about the shape of the API. The WebXR Discord has a great community of developers.
Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work
Thanks! I will give it a shot!
I am sorry, Ada! I tried many ways to log in to Discord, but none worked because my Chinese phone number never received a verification message.
Do you know of any other place for discussion?
Thanks for your help!
You don't use the camera to get the finger position in WebXR.
How can I estimate the finger pose without a camera frame on an Android device?
You get the position of the joints from .hand on the WebXR input sources, then use that information to work out what pose the user is making.
Is the above available only on an AR headset (HoloLens) and not on a phone (Android)?
Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work
If there were a demo for this library that I could test myself, it would help me understand your meaning. I found it a little difficult to find anything in that library related to what I aim to do (controlling an object in an AR scene with finger gestures in front of the camera).
Thanks for your help!
As far as I know, no phone-based WebXR implementation supports hand tracking.
The best way to test the library without an AR headset is on the Oculus Quest.
Thanks!
I hope this becomes a proposal in the future!
I reckon I could work around it with hand gestures on the screen to get a similar interaction between the object and the user.
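A screen-gesture workaround like the one suggested here could look roughly like the sketch below. This is an assumption-laden illustration: `classifySwipe` and `attachGestureControls` are hypothetical helper names and the 30 px threshold is arbitrary; the `dom-overlay` WebXR feature, which keeps DOM touch events firing during an immersive-ar session on a phone, is real.

```javascript
// Hypothetical pure helper: classify a start/end touch pair as a tap
// or a swipe direction. The 30 px minimum distance is an assumption.
function classifySwipe(start, end, minDist = 30) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  if (Math.hypot(dx, dy) < minDist) return 'tap';
  if (Math.abs(dx) >= Math.abs(dy)) return dx > 0 ? 'right' : 'left';
  return dy > 0 ? 'down' : 'up';
}

// Browser-only wiring (hypothetical helper). Assumes the session was
// requested with the real 'dom-overlay' feature, e.g.:
// navigator.xr.requestSession('immersive-ar',
//   { requiredFeatures: ['dom-overlay'], domOverlay: { root: overlayEl } });
function attachGestureControls(overlayEl, onGesture) {
  let touchStart = null;
  overlayEl.addEventListener('touchstart', (e) => {
    const t = e.changedTouches[0];
    touchStart = { x: t.clientX, y: t.clientY };
  });
  overlayEl.addEventListener('touchend', (e) => {
    if (!touchStart) return;
    const t = e.changedTouches[0];
    onGesture(classifySwipe(touchStart, { x: t.clientX, y: t.clientY }));
    touchStart = null;
  });
}
```

Usage might look like `attachGestureControls(overlayEl, (g) => { if (g === 'left') rotateObject(); });`, mapping each recognized gesture to a behaviour of the virtual object.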
There is nothing in the spec stopping a web browser from adding it as a feature, but none have yet.
Is it possible to conduct finger pose detection in an immersive-ar sample (for example, hit-testing)?
Or how can I get the videoStream in immersive-ar sample?
That detection might be doable if I can acquire a video stream.
Many thanks for your assistance!