margaretmz / esrgan-e2e-tflite-tutorial

ESRGAN E2E TFLite Tutorial
Apache License 2.0

Regarding using the distilled version of the original model #2

Open sayakpaul opened 4 years ago

sayakpaul commented 4 years ago

Hi @margaretmz. Thanks for setting this repo up!

Is it possible to use the distilled version of the ESRGAN model, which is way lighter (only 33 KB)? If you look at the notebook, it's demonstrated under the "Inference with the distilled version of the model (33 KB)" section.
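For anyone who wants to try the distilled model outside the notebook, the TFLite invocation pattern is roughly as sketched below. The distilled model file itself isn't bundled here, so the snippet converts a tiny stand-in Keras model to TFLite just to stay self-contained; the filename in the comment and the 50x50 input size are assumptions, not verified details of the distilled model.

```python
import numpy as np
import tensorflow as tf

# Stand-in: convert a tiny Keras model to TFLite so this snippet is
# self-contained. With the real distilled model you would instead do:
#   interpreter = tf.lite.Interpreter(model_path="esrgan_distilled.tflite")
# (that filename is hypothetical).
stand_in = tf.keras.Sequential(
    [tf.keras.layers.Conv2D(3, 3, padding="same", input_shape=(50, 50, 3))]
)
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(stand_in).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Preprocess: add a batch axis and cast to float32 before invoking.
lr_image = np.random.randint(0, 256, size=(1, 50, 50, 3)).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], lr_image)
interpreter.invoke()
sr_image = interpreter.get_tensor(output_details[0]["index"])
```

The same `set_tensor` / `invoke` / `get_tensor` pattern applies whether the model comes from `model_path` or `model_content`.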

A quick note regarding hosting the models:

We might want to host the models under GitHub releases as it makes it easier to consume them. WDYT?

margaretmz commented 4 years ago

Thanks! I will take a look at the distilled version.

We might want to host the models under GitHub releases as it makes it easier to consume them. WDYT?

I don't quite understand your comment about "under GitHub releases". Could you rephrase, please? Are you referring to our project repo?

sayakpaul commented 4 years ago

@margaretmz see if this screencast helps: https://www.loom.com/share/74905ebc1cc94e75a0270d6ab26b9e66.

margaretmz commented 4 years ago

@sayakpaul thanks for the detailed screencast. :) My question was more about where to host the models with GitHub releases. Anyway, I don't think we need to be so formal about versioning the models in a tutorial project, although more formal versioning makes sense if you are hosting them on TF Hub.

What is helpful is a file naming convention, and also indicating the version (I include a date stamp) in the TFLite metadata itself.
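A minimal sketch of that date-stamp convention (the base name and extension here are just illustrative, not the repo's actual naming scheme):

```python
from datetime import date

def versioned_filename(base: str, extension: str = "tflite") -> str:
    """Append today's date stamp as a lightweight version marker."""
    return f"{base}_{date.today().isoformat()}.{extension}"

# e.g. "esrgan-distilled_2021-01-15.tflite" (date depends on when you run it)
print(versioned_filename("esrgan-distilled"))
```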

By the way I tried the compressed version on Android and only got a black screen.

sayakpaul commented 4 years ago

@margaretmz the distilled model will be hosted by Adrish (who generated the model) himself in a few days. We can host the other one on TF Hub (which also happens to be populated with metadata, correct?).

If we can host the model under the "Releases" of this project repo, we could reuse it for the TF Hub publication as well. Let me know your thoughts.

By the way I tried the compressed version on Android and only got a black screen.

This is very strange. I suspect there might be some differences in the preprocessing and postprocessing steps between what's followed in this notebook and the Android code. I often run into this kind of issue when I miss the pixel-scaling step.
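To illustrate why a missed scaling step can show up as a black screen: if the model emits values in [0, 1] and the app casts them straight to uint8 pixels without rescaling, every pixel lands at 0 or 1 out of 255. A small NumPy sketch (hypothetical postprocessing, not the actual Android code):

```python
import numpy as np

def to_displayable(output: np.ndarray, rescale: bool) -> np.ndarray:
    """Convert model output to uint8 pixels, optionally rescaling from [0, 1]."""
    pixels = output * 255.0 if rescale else output
    return np.clip(pixels, 0, 255).astype(np.uint8)

# Suppose the model emits values in [0, 1]:
model_output = np.random.uniform(0.0, 1.0, size=(4, 4, 3)).astype(np.float32)

correct = to_displayable(model_output, rescale=True)
wrong = to_displayable(model_output, rescale=False)  # scaling step missed

# Without rescaling, the brightest possible pixel is 1/255: an almost-black image.
print(wrong.max(), correct.max())
```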

margaretmz commented 4 years ago

@sayakpaul sounds good to host the tflite models under "releases" of this project repo. Let's also include links to the various models on a README.md under /ml so that it's easier to find them.

I fixed the metadata issue with the TFLite model here, which had resulted in a reddish image.

I will look into the issue where the distilled model results in a black screen.

sayakpaul commented 4 years ago

Sure @margaretmz.

I will take the metadata-populated TFLite model and host it under "Releases" of this repository. For visibility, I will add the model links to the main README, but feel free to change that.

sayakpaul commented 4 years ago

@margaretmz I have included the model links here (hosted under "Releases" of this repo).

Regarding the preprocessing steps for the distilled model, there are some differences in the input dimensions. Those preprocessing steps can be seen in the load_img_dis() function here (notebook).
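For readers who don't want to open the notebook right away, here's a hypothetical version of what such a loader might do. The function name, target size, and crop strategy below are assumptions for illustration only; check load_img_dis() in the notebook for the actual logic.

```python
import numpy as np

def load_img_dis_sketch(image: np.ndarray, target_hw=(50, 50)) -> np.ndarray:
    """Hypothetical preprocessing for the distilled model: crop to the
    assumed spatial size, cast to float32, and add a batch axis.
    The real load_img_dis() may resize or pad differently."""
    h, w = target_hw
    cropped = image[:h, :w, :]  # naive crop, purely for illustration
    return np.expand_dims(cropped.astype(np.float32), axis=0)

batch = load_img_dis_sketch(np.zeros((128, 128, 3), dtype=np.uint8))
print(batch.shape)  # (1, 50, 50, 3)
```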

Let me know if anything is unclear.

thusinh1969 commented 3 years ago

Tried it in Python :) with TF2. It took forever to invoke :) Dropped it!

Steve