I'm getting an error on var output = unetSession.Run(input); in StableDiffusion.ML.OnnxRuntime.dll!StableDiffusion.ML.OnnxRuntime.UNet.Inference(string prompt, StableDiffusion.ML.OnnxRuntime.StableDiffusionConfig config) at Line 110 (C#):
Microsoft.ML.OnnxRuntime.OnnxRuntimeException: '[ErrorCode:RuntimeException] Non-zero status code returned while running MultiHeadAttention node. Name:'MultiHeadAttention_0' Status Message: D:\a_work\1\s\onnxruntime\core\providers\dml\DmlExecutionProvider\src\MLOperatorAuthorImpl.cpp(2448)\onnxruntime.DLL!00007FFCD3CC5645: (caller: 00007FFCD3CC4CDA) Exception(3) tid(7d60) 80070057 The parameter is incorrect.
The inputs to Run are all FP16 tensors. Any suggestions?
I am using the DirectML FP16 branch on an RTX 3070, with a Microsoft Olive-optimized base SD 1.5 model that I downloaded from here: https://huggingface.co/axodoxian/stable_diffusion_onnx
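For context, this is roughly how I can dump what the UNet session expects, to compare against the FP16 tensors I pass in (a minimal sketch using the standard Microsoft.ML.OnnxRuntime C# API; the model path here is illustrative, not my actual path):

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

class InspectInputs
{
    static void Main()
    {
        // Illustrative path -- point this at the Olive-optimized UNet model.
        using var session = new InferenceSession(@"models\unet\model.onnx");

        // Print each expected input's name, element type, and shape so they
        // can be checked against the tensors actually passed to Run().
        foreach (var (name, meta) in session.InputMetadata)
        {
            Console.WriteLine($"{name}: {meta.ElementType} [{string.Join(",", meta.Dimensions)}]");
        }
    }
}
```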