XiaoMi / mace-models

Mobile AI Compute Engine Model Zoo
Apache License 2.0

Problems when trying to convert inception-v1 #6

Closed summerspringwei closed 6 years ago

summerspringwei commented 6 years ago

When I try to convert inception-v1 from the TensorFlow model zoo (https://github.com/tensorflow/models/tree/master/research/slim), the following errors occur:

```
Traceback (most recent call last):
  File "/home/xcw/software/mace/bazel-bin/mace/python/tools/converter.runfiles/mace/mace/python/tools/converter.py", line 319, in <module>
    main(unused_args=[sys.argv[0]] + unparsed)
  File "/home/xcw/software/mace/bazel-bin/mace/python/tools/converter.runfiles/mace/mace/python/tools/converter.py", line 194, in main
    memory_optimizer.optimize_gpu_memory(output_graph_def)
  File "/home/xcw/software/mace/bazel-bin/mace/python/tools/converter.runfiles/mace/mace/python/tools/memory_optimizer.py", line 300, in optimize_gpu_memory
    mem_optimizer.optimize()
  File "/home/xcw/software/mace/bazel-bin/mace/python/tools/converter.runfiles/mace/mace/python/tools/memory_optimizer.py", line 148, in optimize
    op.output_shape[i].dims)
  File "/home/xcw/software/mace/bazel-bin/mace/python/tools/converter.runfiles/mace/mace/python/tools/memory_optimizer.py", line 243, in get_op_mem_block
    raise Exception('output shape dim size is not 2 or 4.')
Exception: output shape dim size is not 2 or 4.

Traceback (most recent call last):
  File "tools/converter.py", line 1730, in <module>
    flags.func(flags)
  File "tools/converter.py", line 820, in convert_func
    convert_model(configs)
  File "tools/converter.py", line 745, in convert_model
    ",".join(model_config.get(YAMLKeyword.graph_optimize_options, [])))
  File "/home/xcw/software/mace/tools/sh_commands.py", line 532, in gen_model_code
    _fg=True)
  File "/home/xcw/anaconda2/lib/python2.7/site-packages/sh.py", line 1413, in __call__
    raise exc
sh.ErrorReturnCode_1:

RAN: /home/xcw/anaconda2/bin/python bazel-bin/mace/python/tools/converter -u --platform=tensorflow --model_file=/home/xcw/datasets/tf_models/google_inception/frozen_inception_v1.pb --weight_file= --model_checksum=d75481a8bcaedafdf217bb2bf9ee5a3da750b4b0fd7a76b571fdf818e0cd4709 --weight_checksum= --input_node=input --output_node=InceptionV1/Logits/Conv2d_0c_1x1/convolution --runtime=gpu --template=mace/python/tools --model_tag=mobilenet_v1 --input_shape=1,224,224,3 --dsp_mode=0 --embed_model_data=False --winograd=0 --quantize=0 --quantize_range_file= --obfuscate=0 --output_dir=mace/codegen/models/mobilenet_v1 --model_graph_format=file --data_type=fp16_fp32 --graph_optimize_options=

STDOUT:

STDERR:
```

By inserting a print statement at memory_optimizer.py:242, I found that the problematic op is Eltwise and its output shape is [64].
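For context, the check that raises here behaves roughly like the following. This is a minimal sketch, not MACE's actual source: the GPU memory optimizer maps only 2-D and 4-D output shapes onto OpenCL image blocks (the exact image-shape formulas below are illustrative assumptions), so a 1-D shape such as `[64]` falls through to the exception seen in the traceback.

```python
def get_op_mem_block(output_shape):
    """Sketch of the dim-size check in MACE's memory_optimizer.py.

    Maps a tensor shape onto a 2-D OpenCL image block; the width/height
    formulas here are illustrative, but the 2-or-4 dims restriction is
    what the real code enforces.
    """
    dims = list(output_shape)
    if len(dims) == 2:
        # [batch, channels] -> image of width ceil(channels / 4), height batch
        return [(dims[1] + 3) // 4, dims[0]]
    elif len(dims) == 4:
        # [batch, height, width, channels] ->
        # width ceil(channels / 4) * width, height batch * height
        return [(dims[3] + 3) // 4 * dims[2], dims[0] * dims[1]]
    # A 1-D shape like [64] (e.g. from an Eltwise op) lands here.
    raise Exception('output shape dim size is not 2 or 4.')
```

So any graph whose converted form contains an op with a 1-D output shape will abort GPU memory optimization with exactly this message.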

Does anybody know how to solve this problem? Thanks!

summerspringwei commented 6 years ago

It seems that MACE does not support the frozen_inception_v1.pb that I froze using TensorFlow r1.4. When I freeze the graph using TensorFlow r1.9 and the models from https://github.com/tensorflow/models/tree/master/research/slim, it works!

llhe commented 6 years ago

Closing this since it's resolved.