experiencor / keras-yolo2

Easy training on custom datasets. Various backends (MobileNet and SqueezeNet) supported. A YOLO demo that detects raccoons, running entirely in the browser, is accessible at https://git.io/vF7vI (not on Windows).
MIT License

Unable to convert keras-yolo2 model to CoreML due to lambda layer #361

Open · visionscaper opened this issue 5 years ago

visionscaper commented 5 years ago

Hello!

Your Keras Yolo2 model contains a lambda layer that doesn't seem to do much:

output = Lambda(lambda args: args[0])([output, self.true_boxes])

Why is it in there?

This layer stops me from converting the model into a CoreML model ...

How can I work around this issue?

Edit: I see that "Full Yolo" also has a Lambda layer. According to https://github.com/hollance/YOLO-CoreML-MPSNNGraph/issues/13#issuecomment-341572843, could I remove it and retrain the whole model from scratch?

rodrigo2019 commented 5 years ago

The Lambda layer is only there for training; for inference you can remove it.
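For example, here is a minimal sketch of building such an inference model, assuming the keras-yolo2 layout where the pass-through Lambda is the last layer (the function name make_inference_model is just illustrative, not part of the repo):

from keras.models import Model

def make_inference_model(train_model):
    # The final Lambda simply forwards its first input, so take that tensor
    # as the real detection output and drop the true_boxes input entirely.
    detection_output = train_model.layers[-1].input[0]
    image_input = train_model.inputs[0]
    # Weights are shared with the training model, so nothing needs reloading.
    return Model(image_input, detection_output)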

anuar12 commented 5 years ago

I have exactly the same issue when converting to an ONNX model using onnxmltools. @visionscaper did you manage to find a solution? I tried naively getting rid of the Lambda layer (both at train time and at test time), but it didn't help. There is also an additional Lambda in backend.py.

Filed an issue here: https://github.com/onnx/onnxmltools/issues/159
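For reference, a minimal sketch of the failing conversion call, assuming a keras-yolo2 training model loaded as yolo.model (the attribute name and output path here are illustrative):

import onnxmltools

# Conversion of the full training graph, including the pass-through Lambda,
# which onnxmltools cannot handle out of the box.
onnx_model = onnxmltools.convert_keras(yolo.model)
onnxmltools.utils.save_model(onnx_model, 'yolo.onnx')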

visionscaper commented 5 years ago

@anuar12 I didn't look at onnxmltools, but you need to make an "inference" version of the YOLO model that uses neither true_boxes nor the Lambda layer.

Then, if you use Tiny YOLO as the backend, you can convert the model to a CoreML model. For the Lambda layer in other backends you need to make a custom CoreML layer, but I haven't looked into that yet.
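For reference, a rough sketch of that conversion with coremltools' Keras converter, assuming an inference model without true_boxes/Lambda (like the one sketched above) and assuming the backend normalizes pixels to [0, 1]; the input/output names and file name are illustrative:

import coremltools

coreml_model = coremltools.converters.keras.convert(
    inference_model,               # Keras model without true_boxes / Lambda
    input_names=['image'],
    image_input_names=['image'],   # expose the input as an image in CoreML
    output_names=['grid'],         # raw YOLO grid; boxes are decoded on-device
    image_scale=1.0 / 255.0)       # assumes the backend scales pixels to [0, 1]
coreml_model.save('TinyYOLO.mlmodel')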

rodrigo2019 commented 5 years ago

@visionscaper, you can get the function for building the inference model here

anuar12 commented 5 years ago

@visionscaper and @rodrigo2019 thank you for the quick replies! Really appreciate it. I just tried your suggestion. Unfortunately I got the following error:

Traceback (most recent call last):
  File "predict.py", line 179, in <module>
    _main_(args)
  File "predict.py", line 85, in _main_
    onnx_model = onnxmltools.convert_keras(yolo_inf)
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/main.py", line 38, in convert_keras
    target_opset, targeted_onnx, custom_conversion_functions, custom_shape_calculators)
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/keras/convert.py", line 43, in convert
    topology.compile()
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/common/_topology.py", line 624, in compile
    self._infer_all_types()
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/common/_topology.py", line 500, in _infer_all_types
    operator.infer_types()
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/common/_topology.py", line 101, in infer_types
    _registration.get_shape_calculator(self.type)(self)
  File "/home/cheng/miniconda2/envs/tensorflow/lib/python2.7/site-packages/onnxmltools/convert/common/_registration.py", line 68, in get_shape_calculator
    raise ValueError('Unsupported shape calculation for operator %s' % operator_name)
ValueError: Unsupported shape calculation for operator identity

I think it's just because there is a Lambda layer in backend.py for FullYolo. There isn't one in Tiny YOLO =).

Now thinking how to get rid of that Lambda and space_to_depth thing.
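One possible workaround is to replace the Lambda(space_to_depth) in the Full YOLO backend with plain Reshape/Permute layers that both converters understand. Below is a sketch under the assumption of a 416x416 input, where the skip-connection tensor is 26x26x512; adjust the shapes for other input sizes, and note that converter support for 5-D Reshape/Permute still needs to be verified:

from keras.layers import Reshape, Permute

def space_to_depth_x2(x):
    # Equivalent to tf.space_to_depth(x, block_size=2) on a 26x26x512 tensor;
    # the channel ordering matches, so trained weights should still line up.
    x = Reshape((13, 2, 13, 2, 512))(x)   # split each spatial dim into (blocks, offset)
    x = Permute((1, 3, 2, 4, 5))(x)       # move the 2x2 block offsets next to the channels
    return Reshape((13, 13, 2048))(x)     # fold (2, 2, 512) into 2048 channels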