Open wimvanhenden-tool opened 9 months ago
Since WebGPU is released, we are trying to keep investment in WebGL at a minimum. Would you be interested in trying WebGPU? I just tried YOLOv8 and it works fine, and performance will be a lot better. Only 2 lines need to change: include ort.webgpu.min.js instead of ort.min.js:
<script src="https://cdn.jsdelivr.net/npm/onnxruntime-web@1.17.1/dist/ort.webgpu.min.js"> </script>
and use the webgpu execution provider:
{ executionProviders: ['webgpu'] }
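Putting the two changes together, a minimal sketch of the session setup (the model URL and the `wasm` fallback are my own assumptions, not part of the original comment):

```javascript
// Sketch: session creation after swapping ort.min.js for ort.webgpu.min.js.
// Assumes onnxruntime-web 1.17.1 is loaded via the script tag above;
// the model path 'yolov8n.onnx' is hypothetical.
function sessionOptions() {
  // Keep 'wasm' as a fallback in case WebGPU initialization fails at runtime.
  return { executionProviders: ['webgpu', 'wasm'] };
}

async function loadModel() {
  return ort.InferenceSession.create('yolov8n.onnx', sessionOptions());
}
```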
@guschmue this seems to work! It does not generate any errors.
I will further implement a fully working version.
Thank you!
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
I am facing the exact same error as the author of that issue. I am using v1.17.1. I have tried changing the backend to use WebGPU, but I am facing the following errors:
My code can be found in the following repo https://github.com/huats/yolov8_inference_video_javascript
Any feedback or help is really welcome
Bumping this issue - I'm seeing the same thing when trying to use the WebGL provider (`Error: resize (packed) does not support mode: 'nearest'`) and the same errors as posted by @huats if I try the WebGPU provider. I'm trying to get this running on mobile, and according to the docs WebGPU is not supported on mobile anyway.
I've tried exporting the model with various opsets hoping one would work, but none has seemed promising. Is there possibly a setting at export time we can use to get around this?
> Since webgpu is released, we are trying to keep investments in webgl at a minimum.
I was wondering if there is any data on how wide WebGPU support is. Obviously the browser needs to support it (which is easy to check), but the hardware also needs to support it. IIRC, in my tests a couple of months ago, quite a few laptops from the early 2020s would not run WebGPU on the latest Chrome, but would run WebGL (and considerably faster than WASM). This was using YOLOv8 on TensorFlow.js.
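A runtime check can cover both layers: `navigator.gpu` existing only means the browser ships WebGPU, while `requestAdapter()` resolving to null means the GPU/driver does not back it (the early-2020s-laptop case above). A hedged sketch - the function name and fallback order are illustrative, not from onnxruntime-web:

```javascript
// Pick an execution-provider list based on what is actually supported.
// Takes the navigator object as a parameter so the logic can be exercised
// outside a browser; names and fallback order are my own choices.
async function pickProviders(nav) {
  if (nav.gpu) {
    // Browser ships WebGPU; now check whether the hardware/driver backs it.
    const adapter = await nav.gpu.requestAdapter();
    if (adapter) return ['webgpu', 'wasm'];
  }
  // No usable WebGPU: fall back (WebGL could be tried before wasm too).
  return ['wasm'];
}
```

In the browser you would call `pickProviders(navigator)` and pass the result as `executionProviders` to `ort.InferenceSession.create`.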
WebGPU is based on native graphics APIs: D3D12, Vulkan, and Metal. So in theory, wherever these APIs are supported, we have WebGPU. Below is some data shared by Google; more details can be found here.
- Windows: 34% of Chrome users do not have D3D12
- Android: 23% of Android users do not have Vulkan 1.1 (15% do not have 1.0)
- ChromeOS: 50%* of users do not have Vulkan (estimated)
These native APIs are actually supported by very old devices. For example, Intel's Gen7 GPUs, released in 2013, also support D3D12 (https://www.intel.com/content/www/us/en/support/articles/000058971/processors/intel-core-processors.html).
So for the laptops from 2020, if they're not on Linux, there may be a bug preventing WebGPU support. You're always welcome to file an issue with Chrome via https://issues.chromium.org/issues/new and attach the chrome://gpu info.
Describe the issue
Trying to run a YOLOv8 .onnx file with the WebGL backend in JavaScript fails.
```
Uncaught (in promise) Error: resize (packed) does not support mode: 'nearest'
    at Tb (resize-packed.ts:81:15)
    at Object.get (resize-packed.ts:28:24)
    at Qn.executeProgram (inference-handler.ts:64:100)
    at Qn.run (inference-handler.ts:82:36)
    at Object.zi [as impl] (resize-packed.ts:24:39)
    at execution-plan.ts:99:61
    at zn.event (instrument.ts:337:17)
    at execution-plan.ts:98:48
```
This is basically the code I am running to test:
Thank you for your help.
To reproduce
Download Yolov8n onnx model here MODEL
Run this HTML page in a webserver (e.g. Live Server in Visual Studio Code):
<!DOCTYPE html>