Closed ThuyyTran closed 1 year ago
Coremltools 6.2 is quite old. Please try with the most recent version of coremltools.
I used coremltools 6.3, but the conversion output required the .mlpackage format, not .mlmodel.
If you want to save it in the `.mlmodel` format, you need to convert it for the `neuralnetwork` backend. This can be done directly by adding `convert_to="neuralnetwork"` to your `ct.convert` call. Although, a better approach would probably be to use the `minimum_deployment_target` argument. See the `ct.convert` documentation for details.
Also note that currently 7.0, not 6.3, is the latest version of coremltools.
@TobyRoseman I tried updating coremltools to version 7.0 and adding the `convert_to="neuralnetwork"` argument, like this:

```python
import torch
import torchvision
import coremltools as ct

torch_model = torchvision.models.resnet50(pretrained=True)
torch_model.eval()

example_input = torch.rand(1, 3, 224, 224)
traced_model = torch.jit.trace(torch_model, example_input)
out = traced_model(example_input)

image_input = ct.ImageType(name="input_1", shape=example_input.shape)
model = ct.convert(
    traced_model,
    inputs=[image_input],
    compute_units=ct.ComputeUnit.CPU_ONLY,
    convert_to="neuralnetwork",
)
model.save("restnet50_v7.mlmodel")
```

But the result still looks bad @@
Since your model works in Python, this is not an issue with coremltools. It must be a problem with your Swift code.
Take a look at the Classifying Images with Vision and Core ML tutorial. That should include the functionality you need.
If you still have questions after that, I suggest asking for help in the developer forum: https://developer.apple.com/forums/
@TobyRoseman Sorry, but I don't know whether this is the reason my result is NaN or not. I worked around the error by changing the code at line 4377 of `/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/models/neural_network/builder.py` (the `add_unary` function, reported at line 4373 in the traceback):
Old:

```python
spec_layer_params.epsilon = epsilon
spec_layer_params.alpha = alpha
spec_layer_params.shift = shift
spec_layer_params.scale = scale
```

Change:

```python
if type(alpha) is numpy.ndarray:
    val_alpha = alpha[0]
else:
    val_alpha = alpha
spec_layer_params.epsilon = epsilon
spec_layer_params.alpha = val_alpha
spec_layer_params.shift = shift
spec_layer_params.scale = scale
```
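The workaround above hard-codes `alpha[0]`, which only handles 1-element arrays of dimension one. A slightly more defensive version of the same idea (a sketch with a hypothetical `to_scalar` helper, not the actual coremltools code) coerces any single-element NumPy array, including 0-d arrays, to a plain Python scalar:

```python
import numpy as np

def to_scalar(value):
    """Coerce a 0-d or 1-element NumPy array to a plain Python scalar.

    Protobuf scalar fields (like spec_layer_params.alpha) accept only
    int/float, so a value such as array([3.2220182], dtype=float32)
    must be unwrapped before assignment.
    """
    if isinstance(value, np.ndarray):
        if value.size != 1:
            raise ValueError(f"expected a scalar, got shape {value.shape}")
        return value.item()  # works for both 0-d and 1-element arrays
    return value

print(to_scalar(np.array([3.2220182], dtype=np.float32)))  # plain float
print(to_scalar(2.5))                                      # passed through
```

Using `.item()` instead of `[0]` avoids an IndexError on 0-d arrays and always returns a native Python type.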
The error I got when converting the model:
```
Traceback (most recent call last):
  File "ConvertSolar2Coreml.py", line 218, in <module>
    coreml_model = ct.convert(traced_model, inputs=[input_tensor], source='pytorch', convert_to="neuralnetwork")
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/_converters_entry.py", line 564, in convert
    use_default_fp16_io=use_default_fp16_io,
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/converter.py", line 188, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/converter.py", line 217, in _mil_convert
    **kwargs
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/converter.py", line 304, in mil_convert_to_proto
    out = backend_converter(prog, **kwargs)
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/converter.py", line 119, in __call__
    return load(*args, **kwargs)
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/backend/nn/load.py", line 264, in load
    prog.functions["main"].outputs,
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/backend/nn/op_mapping.py", line 49, in convert_ops
    mapper(const_context, builder, op)
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/backend/nn/op_mapping.py", line 1127, in pow
    _add_elementwise_binary(const_context, builder, op, "pow")
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/backend/nn/op_mapping.py", line 768, in _add_elementwise_binary
    alpha=op.y.val,
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/converters/mil/backend/nn/op_mapping.py", line 683, in _add_elementwise_unary
    **kwargs
  File "/home/anlab/anaconda3/envs/testconvertmodel/lib/python3.7/site-packages/coremltools/models/neural_network/builder.py", line 4373, in add_unary
    spec_layer_params.alpha = alpha
TypeError: array([3.2220182], dtype=float32) has type numpy.ndarray, but expected one of: int, float
```
@ThuyyTran - I'm confused. You say your change fixed the error, but you also say you can no longer convert the model. How is the issue fixed if you can't even convert your model?
@TobyRoseman Sorry for that. I meant that in the model conversion step I got the error above. I tried to fix it by editing the code in the coremltools library, and after that the conversion succeeded. What I'm wondering is whether my edit to the library is the reason my result is NaN @@
Since you can get correct predictions in Python, I don't think this is an issue with coremltools. I think this is an issue with your Swift code.
@TobyRoseman But can you explain to me the error I get when converting the model?
It must be related to your local change.
@TobyRoseman Editing the library directly like that won't cause any issues, will it?
When I run the Core ML model in Python, the result is good:

```
{'var_840': array([[-8.15439941e+02, 2.88793579e+02, -3.83110474e+02, -8.95208740e+02, -3.53131561e+02, -3.65339783e+02, -4.94590851e+02, 6.24686813e+01, -5.92614822e+01, -9.67470627e+01, -4.30247498e+02, -9.27047348e+01, 2.19661942e+01, -2.96691345e+02, -4.26566772e+02........
```

But when I run it in Xcode, the result looks like:

```
[-inf,inf,nan,-inf,nan,nan,nan,nan,nan,-inf,-inf,-inf,-inf,-inf,-inf,nan,-inf,-inf,nan,-inf,nan,nan,-inf,nan,-inf,-inf,-inf,nan,nan,nan,nan,nan,nan,nan,nan,nan,nan,-inf,nan,nan,nan,nan,-inf,nan,-inf .......
```
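When comparing the Python and on-device outputs, a quick numeric sanity check makes the difference concrete. A small helper (a sketch assuming NumPy; `report_non_finite` is a name invented here) counts the nan/inf entries that appear in the Xcode run but not in Python:

```python
import numpy as np

def report_non_finite(arr):
    """Count nan/inf/finite entries in a model output.

    Useful when a Core ML model produces finite values in Python
    but nan/-inf on device: a nonzero nan/inf count pinpoints
    which run is broken.
    """
    arr = np.asarray(arr, dtype=np.float64)
    return {
        "nan": int(np.isnan(arr).sum()),
        "inf": int(np.isinf(arr).sum()),
        "finite": int(np.isfinite(arr).sum()),
    }

print(report_non_finite([-815.4, 288.8, -383.1]))           # Python-style output: all finite
print(report_non_finite([float("-inf"), float("nan"), 1]))  # Xcode-style output: non-finite
```

If the Python output is entirely finite, the divergence happened on the device side (e.g. input preprocessing or pixel-format mismatch in the Swift code) rather than during conversion.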
Step 1: Convert ResNet50 to Core ML:
Step 2: Test the Core ML model in Python:
Step 3: Test the Core ML model in Xcode:
System environment (please complete the following information):