nosy-b / holography

A small demo using WebXR and Deep Learning to create Holograms of people
https://nosy-b.github.io/holography

Samsung S9 issue #2

Closed Omodaka9375 closed 4 years ago

Omodaka9375 commented 4 years ago

Hi, I have AR support on my S9 and I see the button, but nothing happens when I tap the screen (tried both a photo and a real person). Chrome is r81, too. Also, when I get the info screen and try the camera, only my rear camera works; the front camera shows a black screen.

Omodaka9375 commented 4 years ago

I just looked at the console and the CDN URL is not resolving. That's why it reports that the device is supported but then fails when you actually try to use it. I guess local copies of body-pix and tf would solve the problem.

Omodaka9375 commented 4 years ago

Another error after warning above.

Javascript DOMException: The associated Track is in an invalid state

Looks like one of these conditions isn't met when grabFrame() is performed: `!(imageCapture.track.readyState != 'live' || !imageCapture.track.enabled || imageCapture.track.muted)`

I haven't worked with ImageCapture much, so this is as far as I can get.
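For reference, that precondition can be wrapped in a small helper; this is only a sketch, and `isTrackGrabbable` is a hypothetical name (the real `ImageCapture.grabFrame()` performs this check internally and rejects with an InvalidStateError when it fails):

```javascript
// Hypothetical helper mirroring grabFrame()'s precondition: the track must be
// live, enabled, and not muted, or grabFrame() rejects.
function isTrackGrabbable(track) {
  return track.readyState === 'live' && track.enabled && !track.muted;
}

// Browser usage sketch (ImageCapture is a browser-only API):
// const track = stream.getVideoTracks()[0];
// if (isTrackGrabbable(track)) {
//   const bitmap = await new ImageCapture(track).grabFrame();
// }
```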

revolunet commented 4 years ago

Hi, trying to debug the same issue on a Motorola G7.

Looks like the `DOMException: The associated Track is in an invalid state` is raised because the track is muted.

It starts unmuted but becomes muted at some point:

*(screenshot: Capture d’écran 2020-05-01 à 15 44 50)*

Any idea ?

Omodaka9375 commented 4 years ago

If you only ask for a video stream with no audio, e.g. `getUserMedia({video: true})`, a muted track should be OK.

revolunet commented 4 years ago

Tried it, even with `audio: false`, but something still makes the video track "muted".

From MDN: "The mute event is sent to a MediaStreamTrack when the track's source is temporarily unable to provide media data."
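One way to see exactly when the track flips is to attach listeners before entering AR. A minimal sketch (the `watchTrackMute` helper and its `onChange` callback are our names, not part of any API; `MediaStreamTrack` is an `EventTarget`, so this works on the track from `getUserMedia`):

```javascript
// Attach mute/unmute listeners to a MediaStreamTrack (any EventTarget works)
// so you can log the moment the AR session silences the camera feed.
function watchTrackMute(track, onChange) {
  track.addEventListener('mute', () => onChange('mute'));
  track.addEventListener('unmute', () => onChange('unmute'));
}

// Browser usage sketch:
// const track = stream.getVideoTracks()[0];
// watchTrackMute(track, (state) => console.log('track is now', state));
```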

Omodaka9375 commented 4 years ago

When you first receive the MediaStreamTrack it's unmuted. But once the 'Start AR' button is pressed, I believe it mutes the track, or corrupts it in some way, because grabFrame() then fails with the "invalid track state" error. I tried explicitly setting audio to true, but the track still got muted.

revolunet commented 4 years ago

confirmed

revolunet commented 4 years ago

tried to replace the grabFrame code with this :

```javascript
// Workaround: draw the <video> frame to a canvas instead of using
// ImageCapture.grabFrame(), which fails once the track is muted.
var canvas = document.createElement('canvas');
var ctx = canvas.getContext('2d');
canvas.width = video.videoWidth;
canvas.height = video.videoHeight;
ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

// Use the canvas as a three.js texture and feed it to the segmentation step.
textureOriginal = new THREE.Texture(canvas);
textureOriginal.needsUpdate = true;

video.hidden = true;
loadAndPredict(textureOriginal.image);
```

But for some reason the segmentation doesn't seem to work (`allPoses` is empty) and so no particles are generated :/

```
segmentation {height: 640, width: 480, data: Uint8Array(307200), allPoses: Array(0)}
Particles count:  0
```
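One way to sanity-check a result like that: `segmentation.data` is a per-pixel 0/1 mask, so you can count foreground pixels and bail out before generating particles. A sketch (`countMaskPixels` is a hypothetical helper, not part of body-pix):

```javascript
// Count foreground (1) pixels in a BodyPix segmentation mask.
// segmentation.data is a Uint8Array with one 0/1 entry per pixel.
function countMaskPixels(segmentation) {
  let count = 0;
  for (const v of segmentation.data) count += v;
  return count;
}

// Sketch of a guard before spawning particles:
// if (segmentation.allPoses.length === 0 || countMaskPixels(segmentation) === 0) {
//   console.warn('BodyPix found no person; skipping particle generation');
// }
```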
Omodaka9375 commented 4 years ago

So you solved the Invalid Track error?

revolunet commented 4 years ago

Yes, the image is grabbed differently, but something still doesn't work.

Omodaka9375 commented 4 years ago

*(screenshot: Screenshot-20200501195502)*

Did you get the model to finish downloading?

Omodaka9375 commented 4 years ago

I still can't bypass the Invalid Track error. Is there anything else you have changed, maybe?

Omodaka9375 commented 4 years ago

Finally managed to get it to work. Used a script to download the minified files without sourceMaps, and also the workaround without grabFrame().

revolunet commented 4 years ago

Cool! What does your particle count say?

Omodaka9375 commented 4 years ago

*(screenshot)* Works fine after some tuning for performance. I will optimize all the code and post it.

revolunet commented 4 years ago

Mine just generates a few particles for some reason. Can't wait to test your version :)

Omodaka9375 commented 4 years ago

Try it out and let me know: https://hologram-webxr.herokuapp.com/

Just point the camera at a photo or a person right away; the "hologram" will spawn just above it.

revolunet commented 4 years ago

Nice, I get some particles but still not enough.

I didn't mention it, but for some reason my phone doesn't expose zoom capabilities in the getCapabilities() result, so when staring at a picture it appears blurry. I guess TensorFlow cannot recognize the shape because of this. I'll try to fix that.

In your example, only the first "shot" is used to create particles. If I take another shot at some other picture, it is always the first one that is reproduced.

Omodaka9375 commented 4 years ago

I just uploaded an example of how it works on my phone.

https://youtu.be/VkoQecLPkS4

You be the judge on particle number :)

Omodaka9375 commented 4 years ago

What I did was:

revolunet commented 4 years ago

nice result 👏

Omodaka9375 commented 4 years ago

If you have problems with body-pix, try setting the multiplier to 0.50 in loadModel(). It is recommended for mobile phones: it's faster but less accurate, and maybe it will help your case.
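For reference, the multiplier is part of the MobileNetV1 config passed to `bodyPix.load()`. A sketch of mobile-friendly settings; only the 0.50 multiplier comes from this thread, the other values are the commonly suggested defaults, so treat them as assumptions:

```javascript
// Mobile-friendly BodyPix model config: multiplier 0.50 shrinks the MobileNet,
// trading segmentation accuracy for load time and inference speed.
const net = await bodyPix.load({
  architecture: 'MobileNetV1',
  outputStride: 16,
  multiplier: 0.50,
  quantBytes: 2,
});
```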

lossless commented 4 years ago

I tried the reworked version, and it works great as long as I am looking at the target photo with the camera before I enter AR, and it will only work once. Due to the way WebXR blocks camera access while in AR mode, I believe you cannot get access to the back camera's live feed once you are in AR.

I'd be curious if this works for you: (1) Point your camera at a random object. (2) Enter AR. (3) Point the camera at a photo of a person, and tap. Nothing should happen because the video frame provided to BodyPix is the last frame of the frozen video stream - the random object. Another test is to enter AR while looking at one person, then point at a different person and tap; you should get particles of the wrong person.

I'm using Pixel 2 XL with Chrome 81. If the above 2 tests work for you, I would be very happy to be wrong as I'd love to get access to the live feed while in WebXR AR mode.

This also means I can't explain how the video the original project creator tweeted is working. He clearly makes holograms of 3 different people.

Omodaka9375 commented 4 years ago

I believe it has to do with this: https://developer.mozilla.org/en-US/docs/Web/API/XRReferenceSpaceType

Not all phones provide options for depth/distance tracking. In AR.js, the OP set the reference space to 'local'; I updated it to 'viewer', because both are device-agnostic, while 'local-floor' and the others are provided per device.
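Since supported reference-space types vary per device, a fallback chain is one way to handle this. A sketch (the helper name is ours; `session.requestReferenceSpace` is the real WebXR call, which rejects for unsupported types):

```javascript
// Try XRReferenceSpaceType values in order of preference, falling back when a
// device rejects one (requestReferenceSpace rejects with NotSupportedError).
async function requestFirstSupportedRefSpace(session, types = ['local-floor', 'local', 'viewer']) {
  for (const type of types) {
    try {
      return { type, space: await session.requestReferenceSpace(type) };
    } catch (err) {
      // This device lacks that tracking mode; try the next one.
    }
  }
  throw new Error('No supported XRReferenceSpaceType among: ' + types.join(', '));
}
```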

A 'local' refSpace means you get fixed depth, unless you load a depthTexture like the OP did in his code, but the texture is missing from the original code.

```javascript
var depthMapName = "depthMapFriends.png";
```

Then this part is commented out:

```javascript
new THREE.TextureLoader().load(depthMapName);
```

Omodaka9375 commented 4 years ago

Or to put it shortly: room scanning/3D mapping doesn't work on all devices, hence no depth from the camera sensors, so a depth texture is used instead.

Omodaka9375 commented 4 years ago

Here is the code for my version: https://github.com/Omodaka9375/Holography

Thanks to @nosy-b and @revolunet we got it somewhat working on our phones :1st_place_medal:

revolunet commented 4 years ago

thanks for sharing @Omodaka9375 !

nosy-b commented 3 years ago

@lossless, I'm coming a bit late, haha. I'm trying to build something new using part of this code on my new Pixel 5, and it seems to behave as you said: WebXR cuts off access to the camera. That was not the case on my LG G7! I could take pictures while in WebXR mode. I'll try to see if there is a solution; it's very strange that it only works on specific phones. Same for the ImageCapture functions, which don't work either.

nosy-b commented 3 years ago

Could it be because ARCore uses a different back camera on my LG G7 for AR, so it doesn't block the other rear camera?

"LG | G7 ThinQ | ARCore uses the wide angle fixed focus rear facing camera for AR tracking" https://developers.google.com/ar/discover/supported-devices