SthPhoenix / InsightFace-REST

InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.
Apache License 2.0

ValueError when using the CPU build via Docker #116

Open mnts-i opened 1 year ago

mnts-i commented 1 year ago

I am running Docker on Ubuntu and used the provided CPU Dockerfile to build the image. The image builds successfully, but the app keeps crashing with the following error:

ValueError: This ORT build has ['AzureExecutionProvider', 'CUDAExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession

After some digging I found a similar issue in a different project: https://github.com/Gourieff/sd-webui-reactor/issues/108. Following the comments there, I edited requirements.txt, bumping onnx from 1.13.0 to 1.14.0 and adding onnxruntime==1.15.0, and the app started working again. Just posting this here for anyone who encounters the same error.

SthPhoenix commented 11 months ago

Hi! That's weird, I have built the image from scratch and got no errors, but I'll bump the versions anyway, just in case. Thanks!