ValueError: Op "complex_stft_0_promoted" (op_type: cast) Input x="complex_stft_0" expects tensor or scalar of dtype from type domain ['fp16', 'fp32', 'int32', 'bool'] but got tensor[1,1025,250,complex64] #2212
Stack Trace
```
mlmodel = ct.convert(
  File "<python>/lib/python3.9/site-packages/coremltools/converters/_converters_entry.py", line 581, in convert
    mlmodel = mil_convert(
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 188, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 212, in _mil_convert
    proto, mil_program = mil_convert_to_proto(
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 288, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 108, in __call__
    return load(*args, **kwargs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 82, in load
    return _perform_torch_convert(converter, debug)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 116, in _perform_torch_convert
    prog = converter.convert()
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 581, in convert
    convert_nodes(self.context, self.graph)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 86, in convert_nodes
    raise e  # re-raise exception
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 81, in convert_nodes
    convert_single_node(context, node)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 134, in convert_single_node
    add_op(context, node)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 1549, in pow
    x, y = promote_input_dtypes(inputs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/ops/defs/_utils.py", line 456, in promote_input_dtypes
    input_vars[i] = _promoted_var(var, promoted_dtype)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/ops/defs/_utils.py", line 441, in _promoted_var
    x = mb.cast(
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/ops/registry.py", line 182, in add_op
    return cls._add_op(op_cls_to_add, **kwargs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/builder.py", line 182, in _add_op
    new_op = op_cls(**kwargs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/operation.py", line 191, in __init__
    self._validate_and_set_inputs(input_kv)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/operation.py", line 504, in _validate_and_set_inputs
    self.input_spec.validate_inputs(self.name, self.op_type, input_kvs)
  File "<python>/lib/python3.9/site-packages/coremltools/converters/mil/mil/input_type.py", line 163, in validate_inputs
    raise ValueError(msg.format(name, var.name, input_type.type_str,
ValueError: Op "complex_stft_0_promoted" (op_type: cast) Input x="complex_stft_0" expects tensor or scalar of dtype from type domain ['fp16', 'fp32', 'int32', 'bool'] but got tensor[1,1025,250,complex64]
```
🐞Describing the bug
Converting a PyTorch model that uses `torch.stft` fails: the STFT output is a `complex64` tensor, and the MIL `cast` op inserted during dtype promotion (triggered by a `pow` op) only accepts `fp16`, `fp32`, `int32`, and `bool`, so conversion raises the `ValueError` in the stack trace above.
To Reproduce
I am modifying this script to convert the model to an MLModel: https://github.com/RVC-Boss/GPT-SoVITS/blob/main/GPT_SoVITS/onnx_export.py. I have rewritten
to
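A minimal sketch of where the `complex64` tensor comes from (the STFT parameters below are assumptions, chosen only to reproduce the `[1, 1025, 250]` / `complex64` tensor from the error message; they are not taken from the GPT-SoVITS code):

```python
import torch

# Hypothetical illustration: torch.stft with return_complex=True yields a
# complex64 tensor, which the Core ML MIL cast op cannot promote.
x = torch.randn(1, 249 * 512)  # dummy mono audio batch (length chosen for 250 frames)
spec = torch.stft(
    x,
    n_fft=2048,                # 2048 // 2 + 1 = 1025 frequency bins
    hop_length=512,
    window=torch.hann_window(2048),
    return_complex=True,
)
print(spec.shape, spec.dtype)  # torch.Size([1, 1025, 250]) torch.complex64

# One common workaround before export is to keep only real-valued tensors,
# e.g. take the magnitude (or split real/imag via torch.view_as_real):
mag = spec.abs()
print(mag.dtype)               # torch.float32, within the supported type domain
```

Whether the magnitude (or a real/imag split) is an acceptable substitute depends on how the downstream GPT-SoVITS ops consume the spectrogram.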
System environment:
Additional context