ctuning / ck-mlperf

This repository is outdated! Join the open MLPerf workgroup to participate in the development of the next generation of automation workflows for MLPerf benchmarks:
https://bit.ly/mlperf-edu-wg
BSD 3-Clause "New" or "Revised" License

Generalise and document TF -> TFLite conversion #4

Open psyhtest opened 5 years ago

psyhtest commented 5 years ago

@bellycat77 has managed to convert the TF ResNet50 v1.5 model used in MLPerf Inference to TFLite with the following script:

import tensorflow as tf

# Frozen TF graph of ResNet50 v1.5 and the names of its input/output tensors.
graph_def_file = "resnet50_v1.pb"
input_arrays = ["input_tensor"]
output_arrays = ["softmax_tensor"]

# Convert the frozen graph to a TFLite flatbuffer (TF 1.x contrib API).
converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
  graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()

# Write the converted model to disk.
with open("resnet50_v1.tflite", "wb") as f:
  f.write(tflite_model)

We should generalise and automate this via a CK script: the input file can come from a dependency on a TF model package, while the input and output tensor names can be specified in the model's metadata, as sketched below.
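A minimal sketch of such a script, assuming the CK dependency resolves the frozen graph path and tensor names into environment variables (the CK_ENV_TENSORFLOW_MODEL_* names below are illustrative, not the actual CK keys):

import os
import tensorflow as tf

# Hypothetical environment variables set by the CK dependency on a TF model;
# the real keys would come from the model package's metadata.
graph_def_file = os.environ["CK_ENV_TENSORFLOW_MODEL_FROZEN_GRAPH"]
input_arrays = os.environ["CK_ENV_TENSORFLOW_MODEL_INPUT_LAYER_NAME"].split(",")
output_arrays = os.environ["CK_ENV_TENSORFLOW_MODEL_OUTPUT_LAYER_NAME"].split(",")
output_file = os.environ.get("CK_ENV_TENSORFLOW_MODEL_TFLITE_FILE", "model.tflite")

# Same conversion as above, but fully driven by the model's metadata.
converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
  graph_def_file, input_arrays, output_arrays)
with open(output_file, "wb") as f:
  f.write(converter.convert())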

psyhtest commented 5 years ago

If a model requires special conversion steps (e.g. SSD-MobileNet), that is all the more reason to generalise.

psyhtest commented 5 years ago

Conversion steps for SSD-based models are documented here.
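For reference, a minimal sketch of the SSD-MobileNet flow as described in the TF Object Detection API documentation (it assumes the intermediate tflite_graph.pb has already been produced by export_tflite_ssd_graph.py; the tensor names and input shape below are taken from that documentation, not from this repository):

import tensorflow as tf

# tflite_graph.pb is produced by the Object Detection API's
# export_tflite_ssd_graph.py script, which appends the detection
# postprocessing op to the frozen graph.
converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
  "tflite_graph.pb",
  input_arrays=["normalized_input_image_tensor"],
  output_arrays=["TFLite_Detection_PostProcess",
                 "TFLite_Detection_PostProcess:1",
                 "TFLite_Detection_PostProcess:2",
                 "TFLite_Detection_PostProcess:3"],
  input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]})
# The postprocessing op is a TFLite custom op, so custom ops must be allowed.
converter.allow_custom_ops = True
with open("ssd_mobilenet.tflite", "wb") as f:
  f.write(converter.convert())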

psyhtest commented 5 years ago

I wonder if the same conversion steps will work for SSD-ResNet.