ghost opened this issue 5 months ago
Hello, @alphamemematic
Thank you for the kind words. When this node was developed, it was intended as an adapter so that you could connect a Core ML model to a regular KSampler. To achieve that, I wrapped the Core ML object into something that resembles ComfyUI's MODEL just enough to make that work, but not much more. Because of that, many advanced workflows that require MODEL as an input are not supported (most of the time it won't even be possible to handle such use cases). I'm not familiar with this particular node, but if I find some time, I can take a look and see whether anything can be done.
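To give a rough idea of what that wrapping looks like, here is a purely illustrative Python sketch (not the node's actual code; the class name, input names, and output name are assumptions and depend entirely on how the model was converted):

```python
import numpy as np
import coremltools as ct  # Apple's Core ML tooling


class CoreMLDenoiserSketch:
    """Illustrative only: exposes a callable that looks just enough like a
    diffusion denoiser for a sampler to call it with latents and timesteps."""

    def __init__(self, mlpackage_path):
        self.mlmodel = ct.models.MLModel(mlpackage_path)

    def __call__(self, sample, timestep, encoder_hidden_states):
        # The feature names below are assumptions; they have to match whatever
        # names (and shapes) were baked in when the model was converted.
        outputs = self.mlmodel.predict({
            "sample": np.asarray(sample, dtype=np.float32),
            "timestep": np.asarray([timestep], dtype=np.float32),
            "encoder_hidden_states": np.asarray(encoder_hidden_states, dtype=np.float32),
        })
        return outputs["noise_pred"]  # output name is also an assumption
```

Anything a workflow tries to feed in that isn't part of that fixed input signature (for example extra conditioning injected by another node) simply has nowhere to go, which is why such workflows fail.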
A converted model will only accept samples of the shapes that were used during conversion, so if you converted the model to work with 512x512 images, you cannot use latent images of any other size. This is what I meant by the expected inputs of the model; sorry if it wasn't worded clearly enough. I cannot offer any more tips without going into use-case-specific details. As I mentioned, I might be able to look into InstantID, but I can't promise anything. If you don't feel like debugging or implementing the solution yourself, there's not much you can do if the node doesn't work out of the box (assuming you're using the correct image sizes).
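If you're not sure what shapes your converted model actually expects, one way to check (just a sketch; the file path is a placeholder, and it assumes the inputs are multi-arrays) is to inspect the model spec with coremltools:

```python
import coremltools as ct

# Placeholder path; point it at your converted .mlpackage
mlmodel = ct.models.MLModel("unet.mlpackage")
spec = mlmodel.get_spec()

# Each input lists the exact array shape it was converted with; e.g. the latent
# "sample" for 512x512 images typically has a 64x64 spatial grid (512 / 8).
for inp in spec.description.input:
    print(inp.name, list(inp.type.multiArrayType.shape))
```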
I’m sorry if that’s not what you wanted to hear. I hope you still find this repo useful despite its limitations.
Cheers, Adrian
What is this error about?
Hi! I really like your repo and am trying to use the Core ML Adapter (Experimental) node. When I try to link your Core ML suite into the workflow from this repo, https://github.com/cubiq/ComfyUI_InstantID/blob/main/examples/InstantID_basic.json, I get errors such as:
```
    assert y.shape[0] == x.shape[0]
AttributeError: 'NoneType' object has no attribute 'shape'
```
So my question is: how do I pay attention to the expected inputs of the model, like you suggest?
Any tips!? Thanks in advance!!!