neuralchen / SimSwap

An arbitrary face-swapping framework on images and videos with one single trained model!

Problem ValueError with SimSwap #445

Closed. planettich closed this issue 11 months ago.

planettich commented 11 months ago

It gives me this error at the end:

ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)

Can anyone help me? Thank you very much to anyone who can answer.
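
For reference, the change the error message is asking for looks roughly like the sketch below: passing an explicit providers list when creating an onnxruntime.InferenceSession. The model path here is a placeholder, and in SimSwap the session is actually created inside insightface rather than in the notebook, so this only illustrates the API the message refers to.

import onnxruntime as ort

# Placeholder path -- in SimSwap the session is created inside insightface, not the notebook.
model_path = "path/to/model.onnx"

# Since ORT 1.9 the providers argument must be set explicitly;
# get_available_providers() returns whatever this ORT build supports.
providers = ort.get_available_providers()
session = ort.InferenceSession(model_path, providers=providers)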

woctezuma commented 11 months ago

See:

TransAmMan commented 11 months ago

See:

Can someone help me implement this fix? I use a hosted GPU runtime, and the fix assumes a local runtime.
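
One possible workaround for a hosted runtime, since the failing InferenceSession call lives inside the insightface package rather than in the notebook, is to monkey-patch onnxruntime before the SimSwap cells run, so that any session created without a providers argument gets one. This is only a sketch of that idea, not an official fix from the repo:

import onnxruntime as ort

# Keep a reference to the original class before patching.
_OriginalInferenceSession = ort.InferenceSession

class PatchedInferenceSession(_OriginalInferenceSession):
    # Supplies a providers list whenever the caller (e.g. insightface) omits one.
    def __init__(self, path_or_bytes, sess_options=None, providers=None, **kwargs):
        if not providers:
            providers = ort.get_available_providers()
        super().__init__(path_or_bytes, sess_options=sess_options,
                         providers=providers, **kwargs)

# Apply the patch before importing or using insightface so its internal calls pick it up.
ort.InferenceSession = PatchedInferenceSession

Run this in a cell before the cell that builds Face_detect_crop; it does not change which providers are installed, it only avoids the ValueError about the missing providers argument.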

neuralchen commented 11 months ago

This bug is caused by timm. We have updated the installation instructions; please reinstall timm using the pinned version: timm==0.5.4

TransAmMan commented 11 months ago

Thanks for your reply. I installed timm==0.5.4 as you suggested and tried placing it in different spots in the installation cell:

!pip install timm==0.5.4

!pip install insightface==0.2.1 onnxruntime moviepy
!pip install timm==0.5.4
!pip install googledrivedownloader
!pip install imageio==2.4.1

!pip install timm==0.5.4

I used the released version of SimSwap:

https://colab.research.google.com/github/neuralchen/SimSwap/blob/main/SimSwap%20colab.ipynb#scrollTo=Y5K4au_UCkKn

I also disconnected and deleted the runtime each time I tried. All attempts resulted in the same error:

ValueError                                Traceback (most recent call last)
in <cell line: 30>()
     28 ## model.eval()
     29
---> 30 app = Face_detect_crop(name='antelope', root='./insightface_func/models')
     31 app.prepare(ctx_id= 0, det_thresh=0.6, det_size=(640,640),mode=mode)
     32

5 frames
/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py in _create_inference_session(self, providers, provider_options, disabled_optimizers)
    449         if not providers and len(available_providers) > 1:
    450             self.disable_fallback()
--> 451             raise ValueError(
    452                 f"This ORT build has {available_providers} enabled. "
    453                 "Since ORT 1.9, you are required to explicitly set "

ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
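
As an aside, the provider list in that traceback (['AzureExecutionProvider', 'CPUExecutionProvider']) suggests the CPU-only onnxruntime wheel ended up in the Colab environment. A quick check like the following shows which build and providers are actually present; the onnxruntime-gpu wheel would be needed for CUDAExecutionProvider on a GPU runtime:

import onnxruntime as ort

# Inspect the installed ONNX Runtime build and its available execution providers.
print(ort.__version__)
print(ort.get_available_providers())
# Only CPU/Azure providers listed -> CPU wheel; CUDAExecutionProvider requires onnxruntime-gpu.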