AngeloD2022 opened 1 year ago
Unfortunately, this yields the following error when using nnhash.py:
```
2023-03-12 17:41:18.591325 [W:onnxruntime:, graph.cc:3490 CleanUnusedInitializersAndNodeArgs] Removing initializer '219'. It is not used by any node and should be removed from the model.
2023-03-12 17:41:18.591347 [W:onnxruntime:, graph.cc:3490 CleanUnusedInitializersAndNodeArgs] Removing initializer '223'. It is not used by any node and should be removed from the model.
2023-03-12 17:41:22.612642 [E:onnxruntime:, sequential_executor.cc:494 ExecuteKernel] Non-zero status code returned while running FusedConv node. Name:'' Status Message: X num_dims does not match W num_dims. X: {1,1280,1,1} W: {500}
Traceback (most recent call last):
  File "/Users/angelodeluca/miniforge3/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 200, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running FusedConv node. Name:'' Status Message: X num_dims does not match W num_dims. X: {1,1280,1,1} W: {500}
```
Here is a copy of the latest model: neuralhash.zip
Looks like the conversion of inner_product layers is not correct. You can fix it by implementing the same _fp16 weight-handling logic used for the convolution layers.
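A minimal sketch of what that fp16 handling could look like, assuming the converter receives Core ML-style weight parameters where fp16 weights arrive as raw bytes (`float16Value`) rather than as a float list (`floatValue`). The `WeightParams` stub and `decode_weights` helper below are hypothetical stand-ins for illustration, not the actual converter code:

```python
import numpy as np
from dataclasses import dataclass, field

# Hypothetical stand-in for the weight-parameters message the converter
# sees; the real object would come from the model's protobuf definition.
@dataclass
class WeightParams:
    floatValue: list = field(default_factory=list)
    float16Value: bytes = b""

def decode_weights(wp: WeightParams) -> np.ndarray:
    """Return weights as float32, decoding fp16 byte storage if present."""
    if len(wp.float16Value) > 0:
        # fp16 weights are packed as raw little-endian 2-byte values;
        # reinterpret the buffer, then upcast to float32 for the ONNX graph
        return np.frombuffer(wp.float16Value, dtype=np.float16).astype(np.float32)
    return np.asarray(wp.floatValue, dtype=np.float32)

# Example: an inner_product weight blob stored as fp16 bytes
fp16_blob = np.array([0.5, -1.25], dtype=np.float16).tobytes()
weights = decode_weights(WeightParams(float16Value=fp16_blob))
```

The key point is the branch: when the fp16 byte field is non-empty, the fp32 list is empty, so a converter that only reads the float list produces a malformed (or empty) weight tensor, which is consistent with the shape-mismatch error above.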
Hi! Have you made any progress on this?
Haven't looked into it since.
I managed to get a model converted using a modified version of the conversion script: