warmshao / FasterLivePortrait

Bring portraits to life in real time! ONNX/TensorRT support! Real-time portrait driving!

Problem with Onnxruntime #14

Closed nldhuyen0047 closed 3 months ago

nldhuyen0047 commented 3 months ago

I could not run the command "git checkout liqun/ImageDecoder-cuda" because there is no liqun/ImageDecoder-cuda directory. Could you please help me with this problem?

warmshao commented 3 months ago

> I could not run the command "git checkout liqun/ImageDecoder-cuda" because there is no liqun/ImageDecoder-cuda directory. Could you please help me with this problem?

cd into the onnxruntime directory first, then run the git checkout.
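
For reference, a minimal sketch of those steps, assuming onnxruntime is cloned from the upstream microsoft/onnxruntime repository (adjust the URL if you build from a fork):

```bash
# Clone onnxruntime (assumed upstream URL) and switch to the branch named in this thread
git clone https://github.com/microsoft/onnxruntime.git
cd onnxruntime
git checkout liqun/ImageDecoder-cuda   # a branch, not a directory
```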

nldhuyen0047 commented 3 months ago

Yes, I did that, and I checked the onnxruntime GitHub. Is there any problem with the liqun branch?

warmshao commented 3 months ago

> Yes, I did that, and I checked the onnxruntime GitHub. Is there any problem with the liqun branch?

liqun/ImageDecoder-cuda is a branch, not a directory; use git branch to verify.
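
To double-check that the branch exists and is actually checked out, something like this should do it:

```bash
cd onnxruntime
git fetch origin                       # make sure remote branches are visible locally
git branch -a | grep -i ImageDecoder   # liqun/ImageDecoder-cuda should appear in this list
git checkout liqun/ImageDecoder-cuda
git branch --show-current              # should print liqun/ImageDecoder-cuda
```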

nldhuyen0047 commented 3 months ago

Thank you so much, I got it.

nldhuyen0047 commented 3 months ago

But I have hit this error, do you know about it?

onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running GridSample node. Name:'/dense_motion_network/GridSample' Status Message: Only 4-D tensor is supported

warmshao commented 3 months ago

> But I have hit this error, do you know about it?
>
> onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running GridSample node. Name:'/dense_motion_network/GridSample' Status Message: Only 4-D tensor is supported

Tell me about your runtime environment and how you compiled onnxruntime.
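
A quick way to gather those details (assuming a CUDA GPU setup; the commands below only report information):

```bash
# Which onnxruntime build Python actually loads, and which execution providers it offers
python -c "import onnxruntime as ort; print(ort.__version__, ort.get_available_providers())"
# GPU, driver, and CUDA toolkit versions
nvidia-smi --query-gpu=name,driver_version --format=csv
nvcc --version
```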

nldhuyen0047 commented 3 months ago

I created a virtual environment and installed the packages. I followed your "ONNX Inference" instructions to run with the GPU on Linux.

warmshao commented 3 months ago

> I created a virtual environment and installed the packages. I followed your "ONNX Inference" instructions to run with the GPU on Linux.

How about using Docker? It's much simpler that way
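
If you do try Docker, a hypothetical invocation would look like the following; the real image name and tag come from the project's README, so treat the placeholder as an assumption:

```bash
# <faster-liveportrait-image> is a placeholder; use the image published in the README
docker run -it --gpus all \
    -v "$(pwd)":/workspace \
    --name faster_liveportrait \
    <faster-liveportrait-image>:latest \
    /bin/bash
```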

nldhuyen0047 commented 3 months ago

This is the traceback:

Traceback (most recent call last):
  File "/home/code/FasterLivePortrait/run.py", line 63, in <module>
    dri_crop, out_crop, out_org = pipe.run(frame, img_src)
  File "/home/code/FasterLivePortrait/src/pipelines/faster_live_portrait_pipeline.py", line 329, in run
    out_crop = self.model_dict["warping_spade"].predict(f_s, x_s, x_d_i_new)
  File "/home/code/FasterLivePortrait/src/models/warping_spade_model.py", line 34, in predict
    preds = self.predictor.predict(feature_3d, kp_driving, kp_source)
  File "/home/code/FasterLivePortrait/src/models/predictor.py", line 211, in predict
    results = self.onnx_model.run(None, input_feeds)
  File "/home/anaconda3/envs/live-portrait/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running GridSample node. Name:'/dense_motion_network/GridSample' Status Message: Only 4-D tensor is supported

warmshao commented 3 months ago

> But I have hit this error, do you know about it?
>
> onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running GridSample node. Name:'/dense_motion_network/GridSample' Status Message: Only 4-D tensor is supported

I suspect you haven't switched to the liqun/ImageDecoder-cuda branch.
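
If the build really did come from another branch, one typical way to rebuild and reinstall the CUDA wheel from liqun/ImageDecoder-cuda is sketched below; the CUDA and cuDNN paths are assumptions and must match your machine:

```bash
cd onnxruntime
git checkout liqun/ImageDecoder-cuda
./build.sh --config Release --parallel --build_wheel \
    --use_cuda --cuda_home /usr/local/cuda --cudnn_home /usr/local/cuda \
    --skip_tests
# Install the freshly built wheel over whatever onnxruntime the environment currently has
pip install --force-reinstall build/Linux/Release/dist/*.whl
```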

nldhuyen0047 commented 3 months ago

Hmmm, I'm sure I did. I switched to the liqun/ImageDecoder-cuda branch on GitHub and downloaded it so I could run on the GPU, but it still produced errors.

warmshao commented 3 months ago

An alpha Windows build using onnxruntime-gpu as the backend is now available. You can help by trying it out: after extracting, just run app.bat directly and let me know if there are any issues. Please note that it assumes you have already installed CUDA and cuDNN and set up the environment variables properly.

https://drive.google.com/file/d/1ijqDlMAYqAVlqwqlXDpjBS5i3A6R_f7M/view?usp=sharing

warmshao commented 3 months ago

I have verified that transferring the Python virtual environment to another Windows computer does indeed cause some issues. I am still working on resolving this. Thank you for your feedback.

nldhuyen0047 commented 3 months ago

> > I created a virtual environment and installed the packages. I followed your "ONNX Inference" instructions to run with the GPU on Linux.
>
> How about using Docker? It's much simpler that way

Because I want to run it locally, I haven't tried Docker. Hmmm, but I think I will give it a try.

nldhuyen0047 commented 3 months ago

I tested it on Linux; if you have any solutions, please let me know. Thank you so much!

nldhuyen0047 commented 3 months ago

@warmshao I think it's better now because I'm using onnxruntime-gpu version 1.17 instead of version 1.18, but it hits an error like this:

Traceback (most recent call last):
  File "/home/code/FasterLivePortrait/run.py", line 65, in <module>
    dri_crop = cv2.resize(dri_crop, (512, 512))
cv2.error: OpenCV(4.10.0) /io/opencv/modules/imgproc/src/resize.cpp:4152: error: (-215:Assertion failed) !ssize.empty() in function 'resize'

Everything was OK up to uploading the image; then I chose dri_video 0 and ran it.

nldhuyen0047 commented 3 months ago

Hi @warmshao, the issue was that I was using onnxruntime version 1.18 instead of version 1.17.
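
For anyone hitting the same thing, pinning the wheel is one way to stay on 1.17, assuming onnxruntime-gpu is installed from PyPI rather than from a local build:

```bash
# Remove any existing onnxruntime wheels, then install a pinned 1.17.x GPU build
pip uninstall -y onnxruntime onnxruntime-gpu
pip install "onnxruntime-gpu==1.17.*"
```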

warmshao commented 3 months ago

> Hi @warmshao, the issue was that I was using onnxruntime version 1.18 instead of version 1.17.

Can you run it successfully?

warmshao commented 3 months ago

Hi guys, an install-free, extract-and-play Windows package with TensorRT support is now available! Please check the changelog; it's really fast!!!

nldhuyen0047 commented 3 months ago

> > Hi @warmshao, the issue was that I was using onnxruntime version 1.18 instead of version 1.17.
>
> Can you run it successfully?

Yeah, I can run it successfully.

warmshao commented 3 months ago

enjoy~