Closed ghafran closed 3 years ago
@pyu10055 Not sure if this falls in your court :)
Just tried this on models that were working previously. Looks like something changed during the model export or creation process on AutoML that is causing this issue.
This seems to be fixed today. I retrained the models this morning and exported. Worked as expected. I love voodoo magic self healing :)
This issue popped up again :(
cc @pyu10055 @tafsiri
@ghafran can you share your model file? thanks
Attached. Just exported it from AutoML Multi-Classification using Edge, faster performance. Getting the same error. @pyu10055
@pyu10055 thanks for posting the fix. You're a lifesaver!
@pyu10055 I see the fix has been completed. Can I regenerate the model now or do we need to wait for a production push? Thanks.
@pyu10055 This issue still seems to be happening. Do we know when the fix will be deployed?
I am also facing the same issue :( The model I exported about three weeks ago works perfectly fine, but the model I exported yesterday is not working and shows the error "Cannot read property 'reduce' of undefined" :(
Our customers are getting agitated. Do you have an estimate of when this will be pushed?
@ghafran The fix will be available in the next release, but meanwhile you can lock your tfjs version to 2.6.0; that should temporarily solve your problem.
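For anyone loading tfjs via CDN script tags, pinning the version would look something like this (the exact jsDelivr paths and the tfjs-automl version shown are assumptions, so double-check against the packages you use):

```html
<!-- Pin tfjs to 2.6.0 instead of pulling the latest build -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@2.6.0/dist/tf.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-automl@1.0.0/dist/tf-automl.min.js"></script>
```

If you install through npm instead, `npm install @tensorflow/tfjs@2.6.0` achieves the same pin.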
I think I found a solution
In your model.json file, just remove this line:

"modelInitializer": {"versions": {}},
@ghafran so here is the modified model.json
Replace your model.json with the above file
So, this should work even with the latest tfjs version
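If you have many exported models, the workaround above can be scripted. A minimal Node.js sketch (the function name is hypothetical, and it assumes the problem is solely the `modelInitializer` key as described above):

```javascript
// Hypothetical helper: remove the "modelInitializer" field that newer
// AutoML exports include and that current tfjs versions fail to parse.
// Takes the parsed model.json object and returns a cleaned copy.
function stripModelInitializer(modelJson) {
  const cleaned = { ...modelJson };
  delete cleaned.modelInitializer;
  return cleaned;
}

// Example usage with Node.js, rewriting model.json in place:
// const fs = require('fs');
// const model = JSON.parse(fs.readFileSync('model.json', 'utf8'));
// fs.writeFileSync('model.json', JSON.stringify(stripModelInitializer(model)));
```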
@Vivek1907 that workaround fixes it! Thank you!
@Vivek1907 Worked for me too. Thanks!
Just trained, exported, and ran the models with latest versions. All is good now.
Does anyone here have any insight into the problem described at the bottom of this Stack Overflow post: https://stackoverflow.com/questions/65402617/tensorflow-automl-model-in-react
When creating a model using AutoML Multi-Classification, I get this error from TensorFlow.js when running:
const model = await tf.automl.loadImageClassification('/model/model.json');
html code:
Model Multi-Classification, Edge, Exported for Tensorflowjs
Nothing crazy here, just a simple export and run from the browser. The model works fine when deployed in GCP with "Test & Use".
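For context, a minimal page along these lines would reproduce the setup (script URLs, file paths, and the element id here are illustrative assumptions, not the original code):

```html
<!-- Sketch of a browser page loading an AutoML Edge image-classification export -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-automl"></script>
<img id="img" src="test.jpg" />
<script>
  async function run() {
    // Load the exported model, then classify the image element
    const model = await tf.automl.loadImageClassification('/model/model.json');
    const predictions = await model.classify(document.getElementById('img'));
    console.log(predictions);
  }
  run();
</script>
```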