sayakpaul opened this issue 4 years ago
Thanks! I will take a look at the distilled version.
We might want to host the models under GitHub releases as it makes it easier to consume them. WDYT?
I don't quite understand your comment about "under GitHub releases". Could you rephrase, please? Are you referring to our project repo?
@margaretmz see if this screencast helps: https://www.loom.com/share/74905ebc1cc94e75a0270d6ab26b9e66.
@sayakpaul thanks for the screencast with details. :) My question was more about where to host the models with GitHub releases. Anyway, I don't think we need to be so formal about versioning for tutorial project models, although if you are hosting them on TF Hub, more formal versioning makes sense.
What is helpful is a file naming convention, and also indicating the version (I include a date stamp) in the TFLite metadata itself.
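As a minimal sketch of the date-stamp naming convention mentioned above (the function and file names are purely illustrative, not from the project):

```python
from datetime import date
from typing import Optional

def versioned_model_name(base_name: str, stamp: Optional[date] = None) -> str:
    """Build a date-stamped TFLite filename, e.g. esrgan_distilled_20210115.tflite."""
    stamp = stamp or date.today()
    return f"{base_name}_{stamp.strftime('%Y%m%d')}.tflite"

print(versioned_model_name("esrgan_distilled", date(2021, 1, 15)))
# prints: esrgan_distilled_20210115.tflite
```

The same `YYYYMMDD` string could also be written into the model's metadata `version` field so the file name and embedded metadata stay in sync.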
By the way I tried the compressed version on Android and only got a black screen.
@margaretmz the distilled model will be hosted by Adrish (who generated the model) himself in a few days. We can host the other one on TF Hub (which also happens to be populated with metadata, correct?).
If we can host the model under the "Releases" of this project repo, we could reuse it for TF Hub publication purposes as well. Let me know your thoughts.
> By the way I tried the compressed version on Android and only got a black screen.
This is very strange. I suspect there might be some differences in the preprocessing and postprocessing steps between what's followed in this notebook and the Android code. I often run into this kind of issue when I miss the pixel scaling step.
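For reference, here is a sketch of the kind of pixel scaling that, when skipped, typically produces black or saturated output. This assumes a model trained on float inputs in [0, 1]; the actual range the notebook uses may differ:

```python
import numpy as np

def preprocess(img_uint8: np.ndarray) -> np.ndarray:
    # Scale pixels from [0, 255] to [0, 1]; skipping this scaling is a
    # common cause of an all-black (or saturated) model output.
    x = img_uint8.astype(np.float32) / 255.0
    return np.expand_dims(x, axis=0)  # add a batch dimension

def postprocess(y: np.ndarray) -> np.ndarray:
    # Map the float model output back to displayable uint8 pixels.
    return np.clip(np.round(y[0] * 255.0), 0, 255).astype(np.uint8)
```

If the Android code feeds raw `uint8` pixels to a float model (or forgets to rescale the output), the mismatch would explain the black screen.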
@sayakpaul sounds good to host the tflite models under "releases" of this project repo. Let's also include links to the various models on a README.md under /ml so that it's easier to find them.
I fixed the metadata issue with the TFLite model here that was resulting in a reddish image.
I will look into the issue that the distilled model results in a black screen.
Sure @margaretmz.
I will take the metadata-populated TFLite model and host it under "Releases" of this repository. For visibility, I am going to add the model links to the main README. But feel free to change that.
@margaretmz I have included the model links here (hosted under "Releases" of this repo).
Regarding the preprocessing steps for the distilled model, there are some differences in the dimensions. Those preprocessing steps can be seen in the load_img_dis() function in this notebook.
Let me know if anything is unclear.
Tried it in Python :) with TF2. Took forever to invoke :)
Steve
Hi @margaretmz. Thanks for setting this repo up!
Is it possible to use the distilled version of the ESRGAN model, which is way lighter (only 33 KB)? If you look at the notebook, it's demonstrated under the "Inference with the distilled version of the model (33 KB)" section.
A quick note regarding hosting the models:
We might want to host the models under GitHub releases as it makes it easier to consume them. WDYT?