Closed · jallamsetty1 closed this issue 4 years ago
Is the wasm backend supported for the BodyPix model? Can you please let me know if this is a missing implementation or a bug? Thanks!
BodyPix is a large model; on a MacBook Pro, a single inference takes ~190 ms on WASM vs. ~77 ms on WebGL.
Can you share your hardware/environment? Also, does the tab work if you remove the BokehEffect but keep the call to segmentPerson?
Thank you for responding. I have a powerful machine, here is my system config:
MacBook Pro (15-inch, 2019)
Processor: 2.3 GHz Intel Core i9
Memory: 32 GB 2400 MHz DDR4
Graphics: Radeon Pro 560X 4 GB / Intel UHD Graphics 630 1536 MB
I still see the same behavior even when I remove the BokehEffect and just keep the call to segmentPerson.
I am trying to implement background blur for a videoconferencing solution. I wait for segmentPerson to complete before calling it again, and by doing so I get about 12-13 fps for video on powerful laptops like mine using the WebGL backend. On slower machines it drops to 4-5 fps, which is not usable. I was trying the WASM backend to see if it speeds up inference. Do you have any suggestions that would help me achieve a good frame rate with background blur? Many thanks!
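For reference, the "wait for segmentPerson before calling it again" pattern described above can be sketched as a non-overlapping loop, where a slow backend lowers the achieved frame rate instead of queueing work. The names `segmentFrame` and the FPS bookkeeping here are illustrative, not part of the BodyPix API:

```javascript
// Non-overlapping segmentation loop: each iteration awaits the previous
// inference before starting the next. `segmentFrame` stands in for something
// like `() => net.segmentPerson(video)` (hypothetical names, not the real setup).
async function segmentLoop(segmentFrame, shouldContinue) {
  let frames = 0;
  const start = Date.now();
  while (shouldContinue()) {
    await segmentFrame(); // e.g. run BodyPix, then draw the bokeh/blur effect
    frames++;
  }
  const seconds = Math.max((Date.now() - start) / 1000, 1e-6);
  return { frames, fps: frames / seconds }; // achieved frame rate
}
```

With a ~77 ms WebGL inference plus drawing time per iteration, this loop tops out around the 12-13 fps reported above.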
WASM will likely not help here, since the WASM backend is 3-4X slower than the WebGL backend for "medium-sized" models like BodyPix, PoseNet and MobileNet.
The WASM backend can be faster than the WebGL backend for ultra-lite models (1-3MB, 20-60M multiply-adds) like Face Detector.
I suspect the unresponsiveness is due to the WASM computation taking too long. Have you tried measuring the time of a single inference on WASM (calling it only once)?
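Measuring a single inference, as suggested above, only needs a timestamp around one awaited call. A minimal sketch (the `net.segmentPerson(video)` usage in the comment assumes a loaded BodyPix model and a video element, as in this thread):

```javascript
// Time one async call. `performance.now()` is available in browsers and in
// recent Node versions; substitute Date.now() where it is not.
async function timeOnce(fn) {
  const t0 = performance.now();
  const result = await fn();
  const elapsedMs = performance.now() - t0;
  return { result, elapsedMs };
}

// Usage sketch (assumes a loaded BodyPix `net` and a `video` element):
// const { elapsedMs } = await timeOnce(() => net.segmentPerson(video));
// console.log(`single inference: ${elapsedMs.toFixed(1)} ms`);
```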
@dsmilkov, sorry about the delay in getting back to you. I was able to give this a try today and saw that a single inference with WASM was taking around 460ms, while it was taking around 2140ms with the WebGL backend. So WASM does seem to be much faster than the WebGL backend, but I can't seem to figure out why the tab becomes unresponsive.
In my experience, TF.js blocks the thread it runs in during inference, even with the WebGL backend. I moved the TF.js usage into a Web Worker to mitigate that, and it fixed the UI blocking (it was most apparent during the ~5 s warmup, but each inference also blocks the UI thread when I intentionally make inference more expensive). The downside is that I can now only support Chrome, because inside a Web Worker the WebGL backend is only available through OffscreenCanvas. This approach is common in the community:
https://erdem.pl/2020/02/making-tensorflow-js-work-faster-with-web-workers
https://medium.com/@wl1508/webworker-in-tensorflowjs-49a306ed60aa
https://towardsdatascience.com/tensorflow-js-using-javascript-web-worker-to-run-ml-predict-function-c280e966bcab
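The main-thread side of the worker approach in those posts usually comes down to a request/response protocol over postMessage. A minimal sketch, assuming a `channel` that stands in for a real `Worker` (whose events wrap the message in `event.data`); the message shape and names here are assumptions, not a BodyPix or TF.js API:

```javascript
// Id-keyed pending map so each posted frame resolves with its own reply,
// e.g. a segmentation mask computed in the worker.
function makeWorkerClient(channel) {
  let nextId = 0;
  const pending = new Map();
  channel.onmessage = (ev) => {
    // A real Worker delivers { data: ... }; a bare object is accepted too.
    const { id, payload } = ev.data !== undefined ? ev.data : ev;
    const resolve = pending.get(id);
    if (resolve) {
      pending.delete(id);
      resolve(payload);
    }
  };
  return {
    // Send one frame (e.g. an ImageBitmap) and resolve with the worker's reply.
    request(payload) {
      const id = nextId++;
      return new Promise((resolve) => {
        pending.set(id, resolve);
        channel.postMessage({ id, payload });
      });
    },
  };
}
```

The worker side would load BodyPix, run segmentPerson on each received frame, and post the mask back with the same id; the inference still blocks, but only the worker thread.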
@dsmilkov @nsthorat @tylerzhu-github @rthadur I'm wondering, do you think there could be another way to avoid blocking the UI/main thread while still supporting more browsers and using the WebGL backend? Thanks, and sorry for mentioning you all.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
Closing as stale. Please @mention us if this needs more attention.
TensorFlow.js version
1.5.1
Browser version
Chrome - 79.0.3945.130
Describe the problem or feature request
I am using the bodyPix model for doing person segmentation and then using the segmentation data to draw the bokeh effect on an HTML canvas. The Chrome tab becomes unresponsive when I try to set the backend to 'wasm'. There are no issues when I use the 'webgl' backend.
Code to reproduce the bug / link to feature request