Problem: Transformers.js currently reports float16 as unsupported on other ML accelerators because it probes the GPU's capabilities. That check is misleading: the GPU is an unrelated device from the accelerator's point of view, and a workstation may have no GPU at all, or the GPU may be disabled.
Always return true from the fp16 check for now
Add a new error message, since this check could cause an error on deny-listed GPUs
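The change above can be sketched roughly as follows. This is illustrative only: the function name `fp16Supported` and its shape are assumptions, not the actual Transformers.js API. The idea is to stop treating a GPU probe as authoritative for an unrelated accelerator, always report fp16 as supported, and surface a clearer message if the probe itself throws (e.g. on a deny-listed GPU).

```javascript
// Hypothetical replacement for the fp16 capability check (names are
// illustrative, not the real Transformers.js internals).
async function fp16Supported() {
  try {
    // The old check asked the WebGPU adapter for the 'shader-f16'
    // feature. That answer is irrelevant when the model runs on a
    // different ML accelerator, and the adapter may not exist at all
    // (no GPU, or GPU disabled). Optional chaining keeps this safe in
    // environments without WebGPU.
    await globalThis.navigator?.gpu?.requestAdapter?.();
  } catch (err) {
    // On deny-listed GPUs the probe itself can throw; emit a clearer
    // message instead of letting the error be swallowed silently.
    console.warn(`WebGPU probe failed (possibly a deny-listed GPU): ${err}`);
  }
  // Always report fp16 as supported for now; let the actual backend
  // fail with a meaningful error if it truly cannot handle fp16.
  return true;
}
```

In Node or in a browser without WebGPU, the probe is skipped via optional chaining and the function still resolves to `true`, which matches the "always return true for now" behavior described above.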
Note: I am not fully happy with the error handling here. Transformers.js seems to swallow errors in a way that means we can't handle this specific error very well in our code. However, I want to address that in a separate PR so as not to slow things down.