apple / coremltools

Core ML tools contain supporting tools for Core ML model conversion, editing, and validation.
https://coremltools.readme.io
BSD 3-Clause "New" or "Revised" License

TF2 SSD MobileNet v2 - Cast: Provided destination type bool not supported (ver 6.0b2) #1586

Open · apivovarov opened this issue 2 years ago

apivovarov commented 2 years ago

I get the error "Cast: Provided destination type bool not supported" when converting the TF2 SSD MobileNet v2 320x320 model with coremltools 6.0b2.
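
For context, the saved_model directory used in both examples below is the one extracted from the TF2 Detection Zoo archive for SSD MobileNet v2 320x320. A minimal sketch of how it can be fetched (the download URL and archive name are assumptions based on the zoo listing, not details taken from this issue):

import tarfile
import urllib.request

# assumed TF2 Detection Zoo URL for SSD MobileNet v2 320x320 (COCO 2017)
URL = ("http://download.tensorflow.org/models/object_detection/tf2/20200711/"
       "ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz")
archive = "ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz"
urllib.request.urlretrieve(URL, archive)
with tarfile.open(archive) as tar:
    tar.extractall()
# the extracted folder contains a saved_model/ directory; the examples below
# are run from inside that folder so the relative path "saved_model" resolves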

Example 1 (convert the saved_model directory):

import coremltools as ct

# image input matching the model's expected 1x320x320x3 input
image_input = ct.ImageType(shape=(1, 320, 320, 3))
model = ct.convert("saved_model", inputs=[image_input])

Example 2 (convert the "serving_default" concrete function):

import tensorflow as tf
import coremltools as ct

# load the SavedModel and pull out its "serving_default" concrete function
m = tf.saved_model.load("saved_model")
f = m.signatures["serving_default"]

image_input = ct.ImageType(shape=(1, 320, 320, 3))
model = ct.convert([f], inputs=[image_input])

Both examples fail with the following stack trace:

Converting TF Frontend ==> MIL Ops:  97%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▋    | 4528/4671 [00:05<00:00, 765.76 ops/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/_converters_entry.py", line 463, in convert
    specification_version=specification_version,
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/converter.py", line 193, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/converter.py", line 225, in _mil_convert
    **kwargs
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/converter.py", line 283, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/converter.py", line 105, in __call__
    return tf2_loader.load()
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow/load.py", line 86, in load
    program = self._program_from_tf_ssa()
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow2/load.py", line 208, in _program_from_tf_ssa
    return converter.convert()
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow/converter.py", line 475, in convert
    self.convert_main_graph(prog, graph)
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow/converter.py", line 398, in convert_main_graph
    outputs = convert_graph(self.context, graph, self.output_names)
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow/convert_utils.py", line 189, in convert_graph
    add_op(context, node)
  File "/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/tensorflow/ops.py", line 1395, in Cast
    "supported.".format(types.get_type_info(node.attr["DstT"]))
NotImplementedError: Cast: Provided destination type bool not supported.
TobyRoseman commented 2 years ago

Can you give us a minimal example to reproduce this problem (i.e. something which doesn't use an external model)?

apivovarov commented 2 years ago

I think I provided a fairly popular model from the tensorflow/models project, along with the code to reproduce the issue. I think someone from the coremltools team should investigate it.
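
For what it's worth, here is a minimal sketch that should hit the same code path, assuming the TF2 frontend rejects any Cast op whose destination type is bool (the function and its input shape are made up for illustration, not taken from the SSD model):

import tensorflow as tf
import coremltools as ct

# toy graph containing a single Cast to bool
@tf.function(input_signature=[tf.TensorSpec(shape=(1, 4), dtype=tf.float32)])
def cast_to_bool(x):
    return tf.cast(x, tf.bool)

# converting the concrete function is expected to raise the same
# "Cast: Provided destination type bool not supported" NotImplementedError
model = ct.convert([cast_to_bool.get_concrete_function()])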