immersive-web / webxr-hand-input

A feature repo for working on hand input support in WebXR. Feature lead: Manish Goregaokar
https://immersive-web.github.io/webxr-hand-input/

AR fingerPose detection #115

Closed Simon1059770342 closed 2 years ago

Simon1059770342 commented 2 years ago

Is it possible to conduct finger pose detection in an immersive-ar sample (for example, hit-testing)?

Or how can I get the videoStream in an immersive-ar sample?

That detection might be doable if I can acquire a video stream.

MANY THANKS FOR YOUR ASSISTANCE!

cabanier commented 2 years ago

Is it possible to conduct finger pose detection in an immersive-ar sample (for example, hit-testing)?

Yes, HoloLens supports AR, hand tracking, and also hit testing, so you should be able to hit the real world with your fingers.

Or how can I get the videoStream in an immersive-ar sample?

If the platform supports a camera, it should be the same as usual. I believe that's how 8th wall works.
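A minimal sketch of a session request that opts into both capabilities, assuming a device and browser that actually expose them (the 'hand-tracking' and 'hit-test' feature strings come from the WebXR Hand Input and Hit Test specs; whether either is granted must be checked at runtime):

// Sketch: request an immersive-ar session that opts into hand tracking and hit testing.
// Both are optional features, so check inputSource.hand before relying on joint data.
const session = await navigator.xr.requestSession('immersive-ar', {
  optionalFeatures: ['hand-tracking', 'hit-test'],
});

session.addEventListener('inputsourceschange', () => {
  for (const inputSource of session.inputSources) {
    if (inputSource.hand) {
      console.log('Articulated hand joints available for', inputSource.handedness);
    }
  }
});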

Simon1059770342 commented 2 years ago

Yes, HoloLens supports AR, hand tracking, and also hit testing, so you should be able to hit the real world with your fingers.

THX! What I really want is to detect finger poses in AR on an Android device, but I haven't found any solution or sample for that.

I tried getFrame(), but the result cannot be fed into model.estimateHands() as a video source.

Code as follows:

const predictions = await model.estimateHands(video)
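For context, this call matches the TensorFlow.js handpose model, whose estimateHands() accepts a video, image, canvas, or tensor. A minimal sketch of how it is normally fed, assuming that model and a regular getUserMedia-backed video element rather than anything obtained from the XR session:

// Sketch (assumptions: @tensorflow-models/handpose and a getUserMedia-backed <video>;
// estimateHands() does not accept an XRFrame).
import '@tensorflow/tfjs-backend-webgl';
import * as handpose from '@tensorflow-models/handpose';

const video = document.querySelector('video');
video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
await video.play();

const model = await handpose.load();
const predictions = await model.estimateHands(video);
console.log(predictions[0]?.landmarks); // 21 [x, y, z] keypoints per detected hand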

If the platform supports a camera, it should be the same as usual. I believe that's how 8th wall works.

Does this mean the immersive-ar sample runs on 8th Wall? As far as I know, the two belong to different platforms.

Simon1059770342 commented 2 years ago

FYI, my purpose is to make a virtual object in an AR scene change its behaviour based on my recognized finger pose, so this could be slightly different from the hand input described in the WebXR hand input docs.

WebXR hand input vs. Fingerpose detection input

To the best of my knowledge:

Hand input comes from a controller, like an HTC Vive handheld device, or the phone screen.

Finger pose detection input comes from the camera frame.

AdaRoseCannon commented 2 years ago

You don't use the camera to get the finger position in WebXR.

You get the positions of the joints from .hand on the WebXR input sources, then use that information to work out what pose the user is making.

Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work
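A minimal sketch of that approach inside the frame loop, assuming an XRReferenceSpace named refSpace has already been obtained and using an illustrative ~2 cm pinch threshold (the joint names and getJointPose() come from the WebXR Hand Input spec):

// Sketch: per-frame joint lookup via the WebXR Hand Input API (no camera frame involved).
// refSpace: a previously obtained XRReferenceSpace (assumed here).
function onXRFrame(time, frame) {
  const session = frame.session;
  for (const inputSource of session.inputSources) {
    if (!inputSource.hand) continue; // controller or screen input: no joints
    const tip = frame.getJointPose(inputSource.hand.get('index-finger-tip'), refSpace);
    const thumb = frame.getJointPose(inputSource.hand.get('thumb-tip'), refSpace);
    if (tip && thumb) {
      const dx = tip.transform.position.x - thumb.transform.position.x;
      const dy = tip.transform.position.y - thumb.transform.position.y;
      const dz = tip.transform.position.z - thumb.transform.position.z;
      const pinching = Math.hypot(dx, dy, dz) < 0.02; // ~2 cm, illustrative threshold
      // ...use `pinching`, or feed all joint poses to a pose library such as handy-work
    }
  }
  session.requestAnimationFrame(onXRFrame);
}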

AdaRoseCannon commented 2 years ago

Also, this isn't the best place to get help with WebXR; it's for discussion about the API shape. The WebXR Discord has a great community of developers.

Simon1059770342 commented 2 years ago

Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work

THX! I will give it a shot!

Simon1059770342 commented 2 years ago

I am sorry, Ada! I tried many ways to log in to Discord, but none worked because my Chinese phone number never received a verification message.

Do you know if there is any other place for discussion?

THX for your help!

Simon1059770342 commented 2 years ago

You don't use the camera to get the finger position in WebXR.

How can I estimate the finger pose without a camera frame on an Android device?

You get the positions of the joints from .hand on the WebXR input sources, then use that information to work out what pose the user is making.

Is the above available only on an AR headset (HoloLens) and not on a phone (Android)?

Here is a library you can use to do that: https://github.com/AdaRoseCannon/handy-work

A demo of this library that I can test on my own would help me understand your meaning better; I found it a little difficult to find anything related to what I aim to do (control an object in an AR scene via finger gestures in front of the camera) in that lib.

THX for your help!

AdaRoseCannon commented 2 years ago

As far as I know, no phone-based WebXR implementation supports hand tracking.

The best way to test the library without an AR headset is on the Oculus Quest.

Simon1059770342 commented 2 years ago

THX!

I hope this becomes a proposal in the future!

I reckon I could work around it with hand gestures on the screen to get a similar interaction between the object and the user.

AdaRoseCannon commented 2 years ago

There is nothing in the spec stopping a web browser from adding it as a feature, but none have yet.