microsoft / webnn-developer-preview

MIT License

[Fix] Temporarily always return true from the fp16 check + beginning of better error handling #18

Closed jgw96 closed 1 month ago

jgw96 commented 1 month ago

Problem: Transformers.js currently reports float16 as unsupported for other ML accelerators because it checks the GPU's capability. That check isn't really correct: the GPU is an unrelated device here, and a workstation may have no GPU at all, or the GPU may be disabled.
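The workaround described in the PR title can be sketched roughly as follows. The function name `isFp16Supported` and the commented-out GPU probe are illustrative assumptions, not the repo's actual identifiers:

```javascript
// Hypothetical sketch of the fp16 capability check this PR changes.
// Probing the GPU is misleading when inference runs on a different
// accelerator (e.g. an NPU), or when no GPU is present at all.
async function isFp16Supported() {
  // Temporary workaround: assume fp16 is supported, since the GPU's
  // capabilities say nothing about the actual target device.
  return true;

  /* Previous behavior (roughly): query the WebGPU adapter for fp16 support.
  const adapter = await navigator.gpu?.requestAdapter();
  return adapter?.features.has("shader-f16") ?? false;
  */
}
```

With this in place, model loading no longer bails out on fp16 models just because the GPU check fails; the proper device-specific capability query is left for follow-up work.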

Note: I am not fully happy with the error handling here. Transformers.js seems to swallow errors in a way that prevents us from handling this specific error well in our code. However, I want to address that in a separate PR so as not to slow things down.

fdwr commented 1 month ago

@Adele101 : FYI, merging.