alexanderkroner / saliency

Contextual Encoder-Decoder Network for Visual Saliency Prediction [Neural Networks 2020]
MIT License

Converting to TFLite #2

Closed · NinZine closed this issue 4 years ago

NinZine commented 4 years ago

I am trying to convert the model to TFLite, but I ran into some problems. How did you convert it to TensorFlow.js? If I could get some pointers on the TensorFlow.js conversion, I could probably get the TFLite conversion working as well.

alexanderkroner commented 4 years ago

Hey, sorry for the late reply!

I tried to convert a model to TFLite with the following code and it seemed to work:

import tensorflow as tf

# Convert the frozen SALICON graph to TFLite, fixing the input shape
# to the resolution used during training.
converter = tf.lite.TFLiteConverter.from_frozen_graph("model_salicon_cpu.pb",
                                                      ["input"], ["output"],
                                                      {"input": [1, 240, 320, 3]})
tflite_model = converter.convert()

# Write the serialized TFLite flatbuffer to disk.
with open("model_salicon_cpu.tflite", "wb") as f:
    f.write(tflite_model)
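Note that from_frozen_graph belongs to the TF 1.x converter API. If you are on TensorFlow 2.x, the same call should still be reachable through the compatibility module, e.g.:

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    "model_salicon_cpu.pb", ["input"], ["output"],
    {"input": [1, 240, 320, 3]})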

All of the models can be downloaded from here. The conversion works best if the input dimensions match those used during training, i.e. [1, 240, 320, 3] for SALICON, [1, 360, 360, 3] for MIT1003, and [1, 216, 384, 3] for CAT2000.
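If you want to sanity-check the converted file, a minimal inference sketch using the standard tf.lite.Interpreter API could look like the following (the dummy input and the float32 assumption are mine, not part of the repository code; replace them with a properly preprocessed image):

import numpy as np
import tensorflow as tf

# Load the converted SALICON model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="model_salicon_cpu.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input at the SALICON training resolution; check
# input_details[0]["dtype"] for the expected input type.
dummy_input = np.random.rand(1, 240, 320, 3).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

# The output tensor holds the predicted saliency map.
saliency_map = interpreter.get_tensor(output_details[0]["index"])
print(saliency_map.shape)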

Let me know if this worked for you and feel free to report any other problems!