KwaiVGI / LivePortrait

Bring portraits to life!
https://liveportrait.github.io

Apple Silicon (M1) macOS Support? #65

Closed yukiarimo closed 1 week ago

yukiarimo commented 2 weeks ago

Would like to run this on my Mac ;)

rprosenc commented 2 weeks ago

+1

mc9625 commented 2 weeks ago

+1 :)

In the meantime, just for testing purposes, you can refer to the discussion here: https://github.com/KwaiVGI/LivePortrait/issues/41

There is a CPU fork that lets you run it on Apple Silicon (just a couple of changes in requirements.txt, mainly onnxruntime==1.18.0 instead of onnxruntime-gpu==1.18.0).

It's very slow so far (of course this is pure CPU, no Metal at all). The 3-second demo (inference.py) took almost 13 minutes on my M1 Pro with 16 GB of RAM.

Now I will try to optimise it a little for Metal, but I am not a skilled dev; just for the sake of science ;)
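
The swap mentioned above is roughly this change in requirements.txt (a sketch; the fork's exact pins may differ):

```
# requirements.txt: use the CPU build of ONNX Runtime instead of the GPU one
# onnxruntime-gpu==1.18.0
onnxruntime==1.18.0
```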

p6002 commented 2 weeks ago

It would be nice if it used the AI cores (Neural Engine) in the M1 processor.

FurkanGozukara commented 2 weeks ago

I just published a cloud tutorial. Mac users can run it on cloud services: MassedCompute, RunPod, and free Kaggle.

https://github.com/KwaiVGI/LivePortrait/issues/78

https://youtu.be/wG7oPp01COg

LivePortrait: No-GPU Cloud Tutorial - RunPod, MassedCompute & Free Kaggle Account - Animate Images


Zhang-Hailan commented 2 weeks ago

This raises an error: `AssertionError: Torch not compiled with CUDA enabled`

yukiarimo commented 2 weeks ago

> This raises an error: `AssertionError: Torch not compiled with CUDA enabled`

Try running `export PYTORCH_ENABLE_MPS_FALLBACK=1` in the terminal.
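
If the environment variable alone doesn't help, the assertion usually means the code is still requesting a CUDA device explicitly (e.g. calling `.cuda()`). A minimal device-selection sketch of the kind a Mac port needs (illustrative only, not the repo's actual code):

```python
import torch
import torch.nn as nn

# Pick the best available backend: MPS on Apple Silicon, else CUDA, else CPU.
# (PYTORCH_ENABLE_MPS_FALLBACK=1 additionally lets individual ops that MPS
# doesn't implement fall back to the CPU instead of crashing.)
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

# Toy stand-in for a LivePortrait module: move model and inputs with .to(device)
# instead of calling .cuda(), which raises this assertion on CPU/MPS-only builds.
model = nn.Linear(8, 8).to(device)
x = torch.randn(1, 8, device=device)
print(model(x).device)
```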

Zhang-Hailan commented 2 weeks ago

> This raises an error: `AssertionError: Torch not compiled with CUDA enabled`

> Try running `export PYTORCH_ENABLE_MPS_FALLBACK=1` in the terminal.

Same issue: `AssertionError: Torch not compiled with CUDA enabled`

hamseHussein commented 2 weeks ago

> Would like to run this on my Mac ;)

There's a simple Mac fork already available: https://github.com/Grant-CP/ComfyUI-LivePortraitKJ-MPS

Zhang-Hailan commented 2 weeks ago

> Would like to run this on my Mac ;)

> There's a simple Mac fork already available: https://github.com/Grant-CP/ComfyUI-LivePortraitKJ-MPS

Thank you, bro.

yukiarimo commented 2 weeks ago

> Would like to run this on my Mac ;)

> There's a simple Mac fork already available: https://github.com/Grant-CP/ComfyUI-LivePortraitKJ-MPS

Thanks, but is it only available for ComfyUI?

mc9625 commented 2 weeks ago

OK, after many hours of coding (I am definitely NOT a skilled dev) using Claude 3.5 (...well, I told you!!!), I was finally able to get some results on macOS. Basically I merged the CPU version and the ComfyUI version (plus a lot of trial-and-error code!), and I can now at least run python inference.py and get a result. It took roughly less than 2 minutes on my MacBook Pro M1 with 16 GB. For comparison, the CPU version takes more or less 13 minutes to run the same test.

Gradio still doesn't run; I need to tweak it a little. I also suspect something isn't working as expected, because the final result is the cropped version instead of the full one. I can't share the code because it is basically a mess, but if any skilled dev wants to work on a macOS port, I can confirm that with MPS support things get a lot better!! BTW, for comparison it would be interesting to know how long the python inference.py demo takes on a CUDA-powered device.
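
If anyone wants to post comparable numbers across backends, a rough timing sketch along these lines might help (a toy network standing in for the real pipeline; synchronize before reading the clock so GPU/MPS work has actually finished):

```python
import time
import torch
import torch.nn as nn

def time_forward(device_str: str, iters: int = 20) -> float:
    """Average forward-pass time (seconds) for a toy conv net on one backend."""
    device = torch.device(device_str)
    net = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(64, 3, 3, padding=1)).to(device).eval()
    x = torch.randn(1, 3, 256, 256, device=device)
    with torch.no_grad():
        for _ in range(3):  # warm-up iterations
            net(x)
        if device_str == "mps":
            torch.mps.synchronize()
        elif device_str == "cuda":
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(iters):
            net(x)
        if device_str == "mps":
            torch.mps.synchronize()
        elif device_str == "cuda":
            torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

print("cpu:", time_forward("cpu"))
if torch.backends.mps.is_available():
    print("mps:", time_forward("mps"))
```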

zzzweakman commented 2 weeks ago

Hey there,

I wanted to point out that there has been some impressive work on CPU inference with the ONNX models over in https://github.com/KwaiVGI/LivePortrait/issues/126. They've managed to get it running on the M1 CPU. You might want to follow their progress as well. Great work is happening there :) @yukiarimo @rprosenc @mc9625 @p6002 @hamseHussein @Zhang-Hailan
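
For a rough idea of what the ONNX route looks like, a minimal sketch (the model path and input shape here are hypothetical; the real model files and pre/post-processing live in that issue's fork):

```python
import numpy as np
import onnxruntime as ort

# The plain CPUExecutionProvider works on Apple Silicon with the stock
# `onnxruntime` wheel; a CoreML provider may be available depending on the build.
session = ort.InferenceSession(
    "liveportrait_module.onnx",  # hypothetical path, for illustration only
    providers=["CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy = np.random.randn(1, 3, 256, 256).astype(np.float32)  # shape is illustrative
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```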

zzzweakman commented 1 week ago

Thank you for your patience, everyone. We are excited to inform you that LivePortrait now supports macOS with Apple Silicon! You can find more details here.

@yukiarimo @rprosenc @mc9625 @p6002 @hamseHussein @Zhang-Hailan

warmshao commented 1 week ago

Can someone tell me about the speed of using PyTorch on Mac M1/M2? I've been working on some optimizations recently and will be using Onnxruntime Silicon, and I'd like to compare the speeds.

cleardusk commented 1 week ago

> Can someone tell me about the speed of using PyTorch on Mac M1/M2? I've been working on some optimizations recently and will be using Onnxruntime Silicon, and I'd like to compare the speeds.

The M1 is roughly 20x slower than an RTX 4090; that's not an exact figure. @warmshao

warmshao commented 1 week ago

> Can someone tell me about the speed of using PyTorch on Mac M1/M2? I've been working on some optimizations recently and will be using Onnxruntime Silicon, and I'd like to compare the speeds.

> The M1 is roughly 20x slower than an RTX 4090; that's not an exact figure. @warmshao

Thanks.