seanshpark opened 4 years ago
Extract the files, add the bin folder to your PATH, and prepare the Python virtual environment as described in the how-to-prepare-virtualenv.txt file in the doc folder:
$ one-prepare-venv
This issue uses a TensorFlow model from https://www.tensorflow.org/lite/guide/hosted_models
Extract it and you'll get these files:
$ tree
.
├── inception_v3_2018_04_27.tgz
├── inception_v3.pb
├── inception_v3.tflite
└── labels.txt
Convert to .circle file
one-import tf \
--input_path ./inception_v3.pb \
--output_path ./inception_v3.circle \
--input_arrays input --input_shapes "1,299,299,3" \
--output_arrays InceptionV3/Predictions/Reshape_1
On success, there will be no (error) messages and two files
inception_v3.circle
inception_v3.circle.log
inception_v3.circle.log will show the internal conversion log of the commands used.
Optimize .circle file
one-optimize --all \
--input_path ./inception_v3.circle \
--output_path ./inception_v3-opt.circle
On success, there will be no (error) messages and two new files
inception_v3-opt.circle
inception_v3-opt.circle.log
inception_v3-opt.circle.log will show the optimization log.
FYI, one-optimize --help will show the currently provided optimization algorithms.
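To give a feel for what such an optimization does, here is a toy constant-folding pass in Python. This is only a conceptual sketch, not how circle2circle is implemented; the op-dictionary format is invented for illustration.

```python
# Toy illustration of one kind of graph optimization (constant folding).
# NOT circle2circle's implementation -- a conceptual sketch only.

def constant_fold(ops):
    """Replace ops whose inputs are all known constants with a 'const' op."""
    folded = []
    consts = {}  # tensor name -> constant value, collected as we walk the graph
    for op in ops:
        if op["type"] == "const":
            consts[op["name"]] = op["value"]
            folded.append(op)
        elif op["type"] == "add" and all(i in consts for i in op["inputs"]):
            # All inputs are constant, so compute the result at compile time
            value = sum(consts[i] for i in op["inputs"])
            consts[op["name"]] = value
            folded.append({"type": "const", "name": op["name"], "value": value})
        else:
            folded.append(op)
    return folded

graph = [
    {"type": "const", "name": "a", "value": 2},
    {"type": "const", "name": "b", "value": 3},
    {"type": "add", "name": "c", "inputs": ["a", "b"]},
]
print(constant_fold(graph)[-1])  # the add op becomes a const with value 5
```

A real optimizer runs many such passes (fusion, dead-code elimination, layout changes) over the circle graph until no more rewrites apply.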
Create package for runtime
one-pack -i ./inception_v3-opt.circle -o nnpack
On success, there will be no (error) messages and one new folder, nnpack
Your working directory should now have these files (shown with the tree command):
$ tree
.
├── inception_v3_2018_04_27.tgz
├── inception_v3.circle
├── inception_v3.circle.log
├── inception_v3-opt.circle
├── inception_v3-opt.circle.log
├── inception_v3-opt.pack.log
├── inception_v3.pb
├── inception_v3.tflite
├── labels.txt
└── nnpack
└── inception_v3-opt
├── inception_v3-opt.circle
└── metadata
└── MANIFEST
Let's try conversion with a normal file but inappropriate input
one-import tf \
--input_path ./inception_v3.tflite \
--output_path ./inception_v3.circle \
--input_arrays input --input_shapes "1,299,299,3" \
--output_arrays InceptionV3/Predictions/Reshape_1
This will print TensorFlow logs, hints about the error, and the error message:
ValueError: Invalid tensors 'input' were found.
Let's optimize with a normal file but inappropriate input
one-optimize --all \
--input_path ./inception_v3.pb \
--output_path ./inception_v3-opt.circle
Error messages will look something like this:
/.../bin/one-optimize: line 148: 27412 Segmentation fault (core dumped) "${DRIVER_PATH}/circle2circle" ${OPTIMIZE_OPTIONS} "${INPUT_PATH}" "${OUTPUT_PATH}" >> "${OUTPUT_PATH}.log" 2>&1
/.../bin/circle2circle --all ./inception_v3.pb ./inception_v3-opt.circle
Note:
Download model: while_3.zip
one-import tf \
--input_path ./while_3.pbtxt \
--output_path ./while_3.circle \
--input_arrays Hole,Hole_2 --input_shapes "1,1:1,1" \
--output_arrays Output
This will produce an error something like this:
tensorflow.lite.python.convert.ConverterError:
:0: error: loc("While/LoopCond"): body function result type tensor<1x2xi32> is incompatible with result type tensor<1x1xi32> at index 0
Hello, I am Rezwanul Huq Shuhan from SRBD. I'm in charge of testing NNcompiler. I see that in these sample TCs, 3 scripts were used: one-import, one-optimize, and one-pack. However, there are other scripts such as one-import-tflite, one-import-bcq, and one-quantize. Should we use them to write abnormal TCs for verification?
Hello, @rez1hawk78 , thanks for testing!
Should we use them to write abnormal TCs for verification???
I don't exactly understand what writing abnormal TCs means, but I think testing the other tools would be wonderful.
one-import-bcq is like one-import-tf but with BCQ-related nodes.
@llFreetimell, can you provide some guide for this?
And @jinevening, can you provide some guide, or share a URL if there exists a guide for one-quantize?
Hello Mr. @seanshpark, thank you for your response. I'm sorry for the confusion. By abnormal TCs I mean typical negative usage of the compiler frontend, like sample test cases 2 and 3.
Test cases for one-quantize, which is a tool to quantize a floating-point circle model to an integer model.
Let's assume we have inception_v3.circle (generated from this guide) whose weights are float32.
one-quantize performs post-training static quantization, which requires a representative dataset (hdf5 file) for calibration (see this link for a detailed explanation; you can also get a general idea about post-training static quantization here).
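As a rough illustration of what post-training static quantization computes, the sketch below (pure Python, not one-quantize's actual algorithm) derives a scale and zero-point from a calibration value range and maps float32 values to uint8:

```python
# Conceptual sketch of uint8 affine quantization -- not one-quantize's code.

def quantize_uint8(values):
    """Map float values to uint8 using a scale/zero-point from the observed range."""
    lo, hi = min(values), max(values)
    lo = min(lo, 0.0)  # range must include 0 so zero is exactly representable
    hi = max(hi, 0.0)
    scale = (hi - lo) / 255.0 or 1.0  # guard against an all-zero range
    zero_point = round(-lo / scale)
    q = [min(255, max(0, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

# Calibration data stands in for activations observed on the representative dataset
calib = [-1.0, -0.5, 0.0, 0.25, 1.0]
q, scale, zp = quantize_uint8(calib)

# Dequantizing recovers approximations of the originals: (q - zp) * scale
deq = [(x - zp) * scale for x in q]
```

The representative dataset matters because the observed min/max of each tensor determines its scale and zero-point; a range that does not match real inputs gives poor accuracy.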
The representative dataset for inception_v3.circle can be created by downloading the Imagenet dataset, preprocessing the images, and packaging some of those images into an hdf5 file.
We know this is a laborious job, so we provide a test dataset which has 10 pre-processed image data (inception_v3_test_data.zip). You will get inception_v3_test_data.h5 after unzipping the file.
(You can print the contents of inception_v3_test_data.h5 using a tool like h5dump.)
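If h5dump is not at hand, you can inspect an hdf5 file from Python with the h5py package. The snippet below is a self-contained sketch that first builds a small stand-in file; the dataset names and shapes here are assumptions for illustration, not the actual layout of inception_v3_test_data.h5.

```python
# Sketch: inspect an hdf5 file's structure with h5py (pip install h5py).
# A tiny stand-in file is created first so the example runs on its own;
# with the real inception_v3_test_data.h5 you would skip the creation step.
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "example.h5")

# Create a stand-in file with two small float32 datasets (assumed names/shapes)
with h5py.File(path, "w") as f:
    f.create_dataset("sample_0", data=np.zeros((1, 299, 299, 3), dtype=np.float32))
    f.create_dataset("sample_1", data=np.ones((1, 299, 299, 3), dtype=np.float32))

# Walk the file and print every dataset's name, shape, and dtype
with h5py.File(path, "r") as f:
    def show(name, obj):
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)
    f.visititems(show)
```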
With the representative dataset, you can quantize the model.
./one-quantize \
--input_dtype float32 \
--quantized_dtype uint8 \
--input_path ./inception_v3.circle \
--input_data ./inception_v3_test_data.h5 \
--output_path ./inception_v3.quantized.circle
This will generate inception_v3.quantized.circle, whose tensors are quantized to uint8 values.
Currently we only support quantization from float32 to uint8 models. If you run one-quantize with other data types like this,
./one-quantize \
--input_dtype float64 \
--quantized_dtype uint8 \
--input_path ./inception_v3.circle \
--input_data ./inception_v3_test_data.h5 \
--output_path ./inception_v3.quantized.circle
You will see this message.
ERROR: Unsupported input type. List of supported input type: float32
If you give a data type other than uint8 to --quantized_dtype, you will see the below message.
ERROR: Unsupported output type. List of supported output type: uint8
The input shape of inception_v3.circle is [1, 299, 299, 3]. If you give a representative dataset with a different shape, you will see the below message.
ERROR: Input shape mismatch.
You can test this case by running the below command with mobilenet_test_data.zip, which is a dataset with image size [1, 224, 224, 3].
./one-quantize \
--input_dtype float32 \
--quantized_dtype uint8 \
--input_path ./inception_v3.circle \
--input_data ./mobilenet_test_data.h5 \
--output_path ./inception_v3.quantized.circle
The following are test cases for one-import-bcq, which is a tool to generate a BCQ-applied circle model.
The bcq.pb file is in bcq.pb.zip.
The following script is a typical positive case; it will generate a bcq.circle file which includes BCQ information nodes.
./one-import-bcq \
--input_path ./bcq.pb \
--output_path ./bcq.circle \
--input_arrays Placeholder \
--output_arrays MatMul
When the input tensor is wrong, an error will occur.
./one-import-bcq \
--input_path ./bcq.pb \
--output_path ./bcq.circle \
--input_arrays Placeholder_null \
--output_arrays MatMul
./bcq.circle.log
(omit)
...
ValueError: Invalid tensors 'Placeholder_null' were found.
When the output tensor is wrong, an error will occur.
./one-import-bcq \
--input_path ./bcq.pb \
--output_path ./bcq.circle \
--input_arrays Placeholder \
--output_arrays MatMul_null
./bcq.circle.log
(omit)
...
ValueError: Invalid tensors 'MatMul_null' were found.
When the input model file does not exist, an error will occur.
./one-import-bcq \
--input_path ./bcq_null.pb \
--output_path ./bcq.circle \
--input_arrays Placeholder \
--output_arrays MatMul
Error: input model not found
This issue is to describe the compiler frontend's typical positive usage and negative usage.
This issue assumes
Similar issue: #1179