m4a1carbin4 closed this issue 7 months ago
I found the cause of the problem in this comment:
// for some reason v1.4 takes int64 as timestep input. ideally we should get input dtype from the model
// but currently onnxruntime-node does not give out types, only input names
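Since onnxruntime-node only exposes input names (not dtypes), the timestep has to be wrapped as int64 manually. A minimal sketch of that workaround, assuming the usual onnxruntime-node `Tensor` API (the `timestepToInt64` helper name is hypothetical):

```typescript
// Hypothetical helper: wrap a scheduler timestep as int64 data for a
// v1.4-style UNet whose "timestep" input expects int64.
function timestepToInt64(t: number): BigInt64Array {
  // Round first: BigInt() throws on fractional numbers.
  return BigInt64Array.from([BigInt(Math.round(t))]);
}

// Usage with onnxruntime-node (assumed API, not verified against this repo):
//   import * as ort from 'onnxruntime-node';
//   const timestep = new ort.Tensor('int64', timestepToInt64(981), [1]);
//   const results = await session.run({ ...otherInputs, timestep });

console.log(timestepToInt64(981)[0]); // 981n
```

If the exported model instead declares a float timestep (some Optimum exports do), feeding int64 data like this is exactly what triggers a dtype-mismatch error, which is why hardcoding the type is fragile.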
Did you make it work @m4a1carbin4?
I found an ONNX model that causes the same error. I haven't dug too deeply into the issue yet, but I assume we either have to wait for an update, or patch onnxruntime ourselves so it reports the dtype and we can act accordingly?
In my case it was just a mistake of mine.
link: this link will help you. It's just a matter of settings.
Ok... so I just want to use my Stable Diffusion model combined with my own LoRA + LCM_LoRa.
I used Optimum to convert the model [SD1.5 base model + my LoRA + LCM_LoRa] to ONNX.
But I think this repo doesn't support ORTStableDiffusionPipeline.
So I tested it with LatentConsistencyModelPipeline, and with StableDiffusionPipeline + LCMScheduler (converting DiffusionPipeline.ts), but I always get the same error.
Well, I don't know what is causing this error... can someone help, please?
My ONNX repo: https://huggingface.co/WGNW/chamcham_v1_checkpoint_onnx