googlecreativelab / teachablemachine-community

Example code snippets and machine learning code for Teachable Machine
https://g.co/teachablemachine
Apache License 2.0

[BUG]: Something went wrong while exporting quantized TFLite model #225

Closed CodeBlack-121855 closed 3 years ago

CodeBlack-121855 commented 3 years ago

Describe the bug: I'm working on an Android app, and when I export the TFLite model in the quantized version it shows the error "Something went wrong while converting". Note: I've created 11 classes with approx. 200 images in each class.
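For reference, the quantized TFLite export is a zip that typically contains the .tflite model plus a labels file. The following is a minimal sketch (not part of the original report) of one way to confirm that the model loads and is actually quantized once a download does succeed; the file name `model_quantized.tflite` is an assumption, so adjust it to whatever the exported archive actually contains.

```python
# Sketch: load the exported quantized model and inspect its tensors.
# "model_quantized.tflite" is an assumed file name from the export zip.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_quantized.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A quantized Teachable Machine image model typically expects uint8 input
# (e.g. 224x224x3), whereas the float export expects float32.
print("input shape:", input_details[0]["shape"])
print("input dtype:", input_details[0]["dtype"])
print("output dtype:", output_details[0]["dtype"])
```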

Screenshots: Screenshot 2021-06-16 083144


Arinjay35 commented 3 years ago

I am also getting the same issue. Can we know how to fix it? I am doing this as part of my school project. Please help.

leoncoolmoon commented 3 years ago

I also experience this. The downloaded file is a broken zip. The broken file is actually a "404" response saved as tflite.zip and a "502" for tinyml.zip.
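A quick way to confirm this locally, as a minimal sketch (the archive name below is an assumption), is to check whether the download is a real zip or a saved HTTP error body:

```python
# Sketch: distinguish a valid zip from an HTTP error page saved to disk.
# "converted_tflite_quantized.zip" is an assumed download name.
import zipfile

path = "converted_tflite_quantized.zip"

if zipfile.is_zipfile(path):
    with zipfile.ZipFile(path) as zf:
        print("Valid zip, contents:", zf.namelist())
else:
    # An HTML error page usually starts with "<html>" or "<!DOCTYPE html>"
    # instead of the "PK" magic bytes of a zip archive.
    with open(path, "rb") as f:
        print("Not a zip. First bytes:", f.read(64))
```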

Soulkrown commented 3 years ago

I also get the same issue. Can you help me with this problem, please?

wenxiangjiang commented 3 years ago

I also get the same issue, this situation has been going on for a long time. Can someone fix it please?

HalfdanJ commented 3 years ago

Apologies that it took so long to get this resolved. It turned out to be a much bigger issue than we had anticipated. We have now rolled out a new version that should hopefully solve this. Please re-open if it's still giving you issues.

wenxiangjiang commented 3 years ago

@HalfdanJ Thanks for your attention. But it looks like the issue still exists.

HalfdanJ commented 3 years ago

@wenxiangjiang which model conversion is giving you issues, and is it only a specific one or all conversions that fail?

wenxiangjiang commented 3 years ago

@HalfdanJ I want to convert to TF Lite Quantized.

All conversions fail, but occasionally the Keras export succeeds.
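Since the Keras export sometimes works, one possible local workaround (a sketch only, not from this thread) is to convert that export to TFLite yourself. This assumes the Keras archive contains a `keras_model.h5`; note that `Optimize.DEFAULT` alone applies dynamic-range quantization, which is not identical to Teachable Machine's fully integer-quantized export (that would additionally need a representative dataset).

```python
# Sketch: convert a downloaded Keras export to a quantized TFLite model.
# "keras_model.h5" and "model_quantized.tflite" are assumed file names.
import tensorflow as tf

model = tf.keras.models.load_model("keras_model.h5")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Default optimizations apply dynamic-range quantization of the weights.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```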

(Screenshot: 20210810)

HalfdanJ commented 3 years ago

Interesting. I've been trying to reproduce this without any luck, and what surprises me is that I don't see any errors in the quantize converter logs.

Two follow-up questions:

wenxiangjiang commented 3 years ago

@HalfdanJ

Thanks for your response.

I just tried again. Although I still get an error, a file does download successfully, but it cannot be decompressed.

(Screenshot: 20210811-085754)

Here is the image I used.

tensorflow.zip

HalfdanJ commented 3 years ago

Thanks @wenxiangjiang, I'm now able to identify the issue here as well. I don't have a solution yet, but will keep you posted.

HalfdanJ commented 3 years ago

@wenxiangjiang can you try again now? 🤞 It might be solved now (some fun proxy changes).

wenxiangjiang commented 3 years ago

@HalfdanJ Thank you for your continued responses. I just tested this, and the issue seems to have been resolved.

SPEAR22 commented 1 year ago

@HalfdanJ Hey, I am getting the same problem. What proxy changes should be done so that I can download the quantized model?

AbhishekMahadik commented 1 year ago

> @HalfdanJ Hey, I am getting the same problem. What proxy changes should be done so that I can download the quantized model?

I am also facing the same issue.