I am also getting the same issue. Can we know how to fix it? I am doing this as part of my school project. Please help.
I also experience this. The downloaded file is a broken zip: it is actually a "404" page for tflite.zip and a "502" page for tinyml.zip.
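In case it helps others confirm the same symptom, here is a minimal sketch for checking whether the downloaded file is a real zip archive or just an HTML error page. The filename "converted_tflite.zip" is an assumption; substitute whatever your download is actually called.

```python
import zipfile

path = "converted_tflite.zip"  # assumed name of the downloaded export

if zipfile.is_zipfile(path):
    print("Looks like a valid zip archive.")
else:
    # The broken downloads reported here are small HTML/text error pages,
    # so printing the first bytes usually reveals a "404" or "502" response.
    with open(path, "rb") as f:
        print("Not a zip file. First bytes:", f.read(200))
```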
I also get the same issue. Can you help me with this problem, please?
I also get the same issue; it has been going on for a long time. Can someone fix it, please?
Apologies it took so long to get this resolved. It turned out to be a much bigger issue than we had anticipated. We have now rolled out a new version that should hopefully solve it. Please re-open if it's still giving you issues.
@HalfdanJ Thanks for your attention. But it looks like the issue still exists.
@wenxiangjiang which model conversion is giving you issues, and is it only a specific one or all conversions that fail?
@HalfdanJ I want to convert to TF Lite Quantized. All conversions fail, but occasionally the Keras conversion succeeds.
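For anyone blocked by this, one possible local workaround is to take the Keras export (which sometimes succeeds) and quantize it yourself with the standard TensorFlow Lite converter. This is only a sketch, not what the hosted converter itself does: the filename "keras_model.h5" and the dynamic-range quantization settings are assumptions.

```python
import tensorflow as tf

# Load the Keras export; compile=False avoids needing the original training setup.
model = tf.keras.models.load_model("keras_model.h5", compile=False)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Post-training dynamic-range quantization; full integer quantization would
# additionally require a representative_dataset.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```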
Interesting. I've been trying to reproduce this without any luck, and what surprises me is that I don't see any errors in the quantize converter logs.
Two follow-up questions:
@HalfdanJ
Thanks for your response.
I just tried again. I still get an error, but a file does download successfully; however, it cannot be decompressed.
Here is the image I used.
Thanks @wenxiangjiang, I'm now able to identify the issue here as well. I don't have a solution for it yet, but will keep you posted.
@wenxiangjiang can you try again now? 🤞 it might be solved now (some fun proxy changes)
@HalfdanJ Thank you for your continued responses. I just tested this, and the issue seems to be resolved.
@HalfdanJ Hey, I am getting the same problem. What proxy changes should be made so that I can download the quantized model?
I am also facing the same issue.
Describe the bug
I'm working on an Android app, and when I export the TFLite model in the quantized version it shows the error "Something went wrong while converting". Note: I've created 11 classes with approx. 200 images in each class.
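If a quantized .tflite file does eventually come through (or is produced locally as sketched above), it may be worth sanity-checking it before wiring it into the Android app. A small sketch, assuming the file is named "model_quantized.tflite":

```python
import numpy as np
import tensorflow as tf

# Load the quantized model and inspect its input/output signatures.
interpreter = tf.lite.Interpreter(model_path="model_quantized.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print("Input:", input_details[0]["shape"], input_details[0]["dtype"])
print("Output:", output_details[0]["shape"], output_details[0]["dtype"])

# Run one dummy inference to confirm the model actually executes.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print("Dummy output shape:", interpreter.get_tensor(output_details[0]["index"]).shape)
```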