ghost opened this issue 3 years ago
When running `pip install -r requirements.txt` for keras-yolov3-modelset, I get an error for coremltools. It says something like "Could not find a version that satisfies the requirement tensorflow>=1.5,<=1.14 (from tfcoremltools -r requirements.txt) (from versions: 2.2.0, 2.2.1, 2.2.2, ... 2.7.0rc0, 2.7.0rc1, ...)". Can someone help me with this? I also have a doubt: can we use Ubuntu 20.04, CUDA 11.7 and cuDNN 8.4.0 for this project, or does it only work with Ubuntu 18.04 and CUDA 10.0? Please help me with this; I have little time on my hands.
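For reference, here is a minimal sanity check (just a sketch; I'm assuming the pip failure may be related to the interpreter version, since as far as I know TF 1.x wheels on PyPI only exist for Python 3.7 and earlier):

```python
# Sketch: check whether this interpreter can even get a TF 1.x wheel from pip.
# (Assumption: TF 1.x wheels are only published for Python <= 3.7, which would
# explain why the resolver only lists 2.x versions here.)
import sys

print("Python:", sys.version.split()[0])

try:
    import tensorflow as tf
    print("TensorFlow:", tf.__version__)
except ImportError:
    print("TensorFlow: not installed")

if sys.version_info >= (3, 8):
    print("Python >= 3.8: pip cannot resolve tensorflow>=1.5,<=1.14 here; "
          "a Python 3.6/3.7 virtualenv or conda env would be needed.")
```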
I'm following this Xilinx tutorial about implementing a U-Net on the ZCU104 Evaluation Board, and I have run into an error during the compilation step.
I trained a U-Net in MATLAB 2020b, exported it to Keras via onnx2keras, and followed the steps of the tutorial without any errors:
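Roughly, the conversion step looked like this (a sketch; the file names and the ONNX input node name are placeholders for my setup):

```python
# MATLAB 2020b -> ONNX -> Keras (placeholder file names and input name).
import onnx
from onnx2keras import onnx_to_keras

onnx_model = onnx.load('unet_from_matlab.onnx')   # model exported from MATLAB as ONNX
k_model = onnx_to_keras(onnx_model, ['input'])    # 'input' = name of the ONNX input node
k_model.save('unet_keras.h5')                     # this .h5 then goes into the freeze/quantize flow
```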
The full error message is:
At first, I thought that the compiler might not support certain layers such as Conv2DTranspose (a way of upsampling images), especially since the documentation says the TensorFlow version needs to be higher than 2.0 and I'm using 1.15.2. However, the tutorial's U-Net is built from those same layers and I compiled it without any problem, so I don't think that's the issue.
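Just to be clear about the layer I mean (an illustrative snippet, not my actual decoder):

```python
# Conv2DTranspose as a learned 2x upsampling step (shapes are illustrative only).
from tensorflow.keras import Input, Model, layers

x_in = Input(shape=(64, 64, 128))
x = layers.Conv2DTranspose(64, kernel_size=2, strides=2, padding='same')(x_in)  # 64x64 -> 128x128
model = Model(x_in, x)
model.summary()
```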
Then, I decided to compare the two networks both after freezing and after quantization, to try to find some information that is present in the tutorial's U-Net but missing in mine.
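The comparison itself was done roughly like this (a sketch; the .pb paths are placeholders for my frozen and quantized graphs):

```python
# Count op types in two frozen GraphDefs and report the differences
# (runs on TF 1.15; the paths are placeholders).
from collections import Counter
import tensorflow as tf

def op_counts(pb_path):
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    return Counter(node.op for node in graph_def.node)

mine = op_counts('my_unet_frozen.pb')
tutorial = op_counts('tutorial_unet_frozen.pb')

print('ops only in my model      :', sorted(set(mine) - set(tutorial)))
print('ops only in tutorial model:', sorted(set(tutorial) - set(mine)))
```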
Inspection results after freezing. Op types used (my U-Net --> tutorial U-Net):
There are differences between the two freezing processes, since the two U-Nets are different modified versions of the original one. However, as I see it, I don't think LeakyRelu, Pad, AddV2 or Sub (the ops that appear in my model but not in the tutorial's model) are related to the error.
Similarly, these are the differences after quantization. Op types used (my U-Net --> tutorial U-Net):
I don't know exactly where the error comes from, so any kind of help would be highly appreciated.
Thanks in advance,
Jon.