aszc-dev / ComfyUI-CoreMLSuite

A set of custom nodes for ComfyUI that allow you to use Core ML models in your ComfyUI workflows.
GNU General Public License v3.0

MLX? #34

Open rovo79 opened 7 months ago

rovo79 commented 7 months ago

@aszc-dev Anything in here that might lend itself to CoreMLSuite efforts? MLX: An array framework for Apple silicon

https://github.com/ml-explore/mlx

MLX is an array framework for machine learning on Apple silicon, brought to you by Apple machine learning research.

From the MLX README:

MLX is designed by machine learning researchers for machine learning researchers. The framework is intended to be user-friendly, but still efficient to train and deploy models. The design of the framework itself is also conceptually simple. We intend to make it easy for researchers to extend and improve MLX with the goal of quickly exploring new ideas.

The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire.
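(For context, a minimal sketch of MLX's NumPy-like, lazily evaluated API; this assumes the `mlx` package is installed and is not taken from the project docs verbatim.)

```python
# Minimal MLX sketch: NumPy-style arrays with lazy evaluation on Apple silicon.
import mlx.core as mx

a = mx.random.normal((4, 4))  # arrays live in unified memory
b = mx.ones((4, 4))
c = a @ b + 1.0               # builds a lazy computation graph
mx.eval(c)                    # force evaluation (runs on the GPU by default)
print(c)
```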

BuildBackBuehler commented 7 months ago

I checked this out earlier and got it to work. As far as I can tell... 300%! I could be wrong, but it looks like we can now run ComfyUI natively!!!

I'm not much of a technical person, but as far as I can tell MLX can replace TorchSDE, which is the last of ComfyUI's base requirements that wasn't native.

TorchSDE is only used here: https://github.com/comfyanonymous/ComfyUI/blob/248d9125b0821851ea4b7c749df20a040f5ebe57/comfy/k_diffusion/sampling.py#L6
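(For reference, the torchsde import there backs the Brownian-tree noise used by the SDE samplers. Below is a hypothetical, simplified stand-in to show what would need reimplementing; it is not the actual ComfyUI/k-diffusion code, and it drops BrownianTree's guarantee of consistent noise across different step schedules.)

```python
import torch

def brownian_increment(x, t0, t1, seed=None):
    """Hypothetical stand-in for a torchsde-backed noise sampler: returns noise
    shaped like x with variance |t1 - t0|, i.e. a single Brownian increment."""
    gen = torch.Generator()
    if seed is not None:
        gen.manual_seed(seed)
    noise = torch.randn(x.shape, generator=gen, dtype=torch.float32)
    return noise.to(device=x.device, dtype=x.dtype) * abs(t1 - t0) ** 0.5
```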

I'm currently trying to patch things in, but it's a mess. Maybe I'll be able to do it with the Suite's help. I was running into some issues because I was trying to mesh ComfyUI with coreml-stable-diff.

Edit: Eh, I'm not so certain I'll be able to get it all together, but I imagine someone can. The attention.py module was tripping up my run; I need to mesh Core ML's with ComfyUI's.

Edit 2: As for the commonly needed extras, I can say... I know there's an onnxruntime-silicon. I was using Miniforge, so I just installed "onnxruntime" and was able to get it. There's also onnx-coreml (in lieu of onnx). Torchaudio is available via the nightly channel: conda install torchaudio -c pytorch-nightly

escoolioinglesias commented 7 months ago

@BuildBackBuehler Thank you for your efforts! Have you been able to get any improvements on inference? Really excited to know :)

BuildBackBuehler commented 7 months ago

> @BuildBackBuehler Thank you for your efforts! Have you been able to get any improvements on inference? Really excited to know :)

Like I said, I didn't get close to making it work. However, I've been picking up enough to understand the basics. It seems there is a way to simplify and semi-automate converting all the Torch references to MLX references; someone actually in the ML field could probably do this with ease. When I was working with LLMs, it seemed MLX, PyTorch, and TensorFlow all have a common dictionary that allows one to convert framework-to-framework. It's a total shot in the dark, but I presume the conversion would log all the inconsistencies/unsupported definitions that the target dictionary has no entry for.
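(There isn't an official cross-framework dictionary that I know of, but weight conversion at least is mechanical. A minimal, hypothetical sketch of mapping a PyTorch state dict to MLX arrays; the checkpoint name is a placeholder and it assumes every entry in the checkpoint is a tensor.)

```python
import torch
import mlx.core as mx

def torch_state_dict_to_mlx(path):
    """Load a PyTorch checkpoint and convert each tensor to an MLX array."""
    state = torch.load(path, map_location="cpu")
    return {name: mx.array(t.detach().numpy()) for name, t in state.items()}

# Hypothetical usage; "unet.pt" is a placeholder checkpoint name.
weights = torch_state_dict_to_mlx("unet.pt")
```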

But the Apple team posted some raw SD performance numbers here, and as you can see, no PyTorch necessary!

cchance27 commented 6 months ago

Never mind my previous comment. I forgot how god-awful SDXL is on Mac; I've been using 1.5 and forgot how slow it is. Yeah, MLX is a step in the right direction, lol.

cchance27 commented 6 months ago

A few notes: there currently seems to be a lack of safetensors support in MLX (they're working on adding it), and there's actually no Core ML support either that I can tell...

I still think it would be possible to do, but it's a lot of work, I think, since we couldn't rely on the base model references. It would likely need to be a separate undertaking, like I said, since I don't think it works with Core ML.
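(On the safetensors point: as a possible workaround until native support lands, a minimal sketch of loading a .safetensors file through NumPy and handing the arrays to MLX; assumes the `safetensors` and `mlx` packages are installed, and the file name is a placeholder.)

```python
import mlx.core as mx
from safetensors.numpy import load_file

def load_safetensors_as_mlx(path):
    """Read a .safetensors checkpoint via NumPy and convert to MLX arrays.
    Note: bfloat16 weights would need extra handling, since NumPy lacks that dtype."""
    tensors = load_file(path)  # dict of name -> numpy.ndarray
    return {name: mx.array(arr) for name, arr in tensors.items()}

# Hypothetical usage; "model.safetensors" is a placeholder file name.
weights = load_safetensors_as_mlx("model.safetensors")
```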

JTZ18 commented 6 months ago

Does anyone know if there's an online community of MLX enthusiasts looking to integrate MLX into different AI applications like Automatic1111, ComfyUI, Fooocus, Ollama, etc.?