nickls opened this issue 1 month ago
cc: @kevinwoolfolk97
Our loading and warming code:

```js
this.model = await loadGraphModel(this.customModelPath);
...
tf.tidy(() => {
  const results = this.model.predict(
    tf
      .zeros([Detector.IMG_SIZE, Detector.IMG_SIZE, 3], "float32")
      .expandDims(0)
  );
  results.data().then(() => {
    this.setModelState("warmed");
  });
});
```
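One thing that may be worth ruling out in the snippet above: `results.data()` resolves asynchronously, but `tf.tidy` disposes every tensor created inside its callback as soon as the callback returns, so the read-back can race against disposal. Below is a minimal sketch of a warm-up that awaits the read-back outside the `tidy` scope; the `warmUp` name and the injected `tf`/`model` parameters are ours, for illustration only:

```javascript
// Sketch: warm a model without letting tf.tidy dispose the output
// before the async read-back finishes. tf.tidy keeps the tensor its
// callback returns, so we return the prediction and dispose it manually.
async function warmUp(tf, model, imgSize) {
  const result = tf.tidy(() =>
    model.predict(tf.zeros([1, imgSize, imgSize, 3], "float32"))
  );
  await result.data(); // read back outside the tidy scope
  result.dispose();    // now it is safe to free the output
}
```

This keeps the GPU warm-up behavior identical while removing the disposal race as a variable.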
@shmishra99 -- Any ideas on this issue, or anything we can do to help debug it?
System information

- Hardware / OS: MacBook Pro (2020, Intel, 16 GB) running OS X 10.15.7
- TensorFlow.js installed from: NPM
- TensorFlow.js version: 3.9.0 and 4.21.0
- Browser: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36
Describe the current behavior
We're running a TF.js model in production that is a fine-tuned MobileNetV1. The model works perfectly for all of our users except one; we are unable to reproduce the issue locally or to detect it before it occurs (so that we could switch to the CPU backend). The issue started about a month ago, during which time we had not updated any of our TF code or components.
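Since the failure cannot be detected ahead of time, one mitigation is a load-time watchdog: race the load promise against a timer and fall back to the CPU backend on timeout. Caveat: this only helps if the hang leaves the event loop running (an async stall rather than a synchronous busy loop). A sketch, with names (`withTimeout`, the 15 s threshold) that are ours rather than from the report:

```javascript
// Sketch: reject if a promise has not settled within `ms` milliseconds.
// Only useful when the hang is an async stall; a synchronous busy loop
// blocks the event loop and the timer will never fire.
function withTimeout(promise, ms, label = "operation") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms} ms`)),
      ms
    );
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch:
// try {
//   this.model = await withTimeout(loadGraphModel(path), 15000, "loadGraphModel");
// } catch (err) {
//   await tf.setBackend("cpu"); // fall back, then retry the load
//   this.model = await loadGraphModel(path);
// }
```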
Problem: `loadGraphModel`. You can see the stack trace from when the system goes into a loop (also attached).
Describe the expected behavior
`loadGraphModel` completes normally. Here is what the stack trace looks like when the model successfully loads and warms.
Standalone code to reproduce the issue
We cannot reproduce this on our local systems, but we are open to any ideas on how to reproduce the problem.
Other info / logs
tf.ENV.features:
We wrote a TF testing page to help isolate the issue; screenshots are below. These tests all pass for our dev and QA teams, but running the model fails for our user.
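Since only one machine fails, the difference is often in the WebGL stack (GPU driver, blacklist, or a software rasterizer). Here is a sketch of a probe such a testing page could include; `getGpuInfo` and the injectable `doc` parameter are our own naming, not part of the report:

```javascript
// Sketch: report the (unmasked, where available) WebGL vendor/renderer,
// which frequently differs on the one machine that misbehaves.
// `doc` is injectable so the function can be exercised with a stub.
function getGpuInfo(doc = document) {
  const canvas = doc.createElement("canvas");
  const gl = canvas.getContext("webgl2") || canvas.getContext("webgl");
  if (!gl) return { supported: false };
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  return {
    supported: true,
    vendor: gl.getParameter(ext ? ext.UNMASKED_VENDOR_WEBGL : gl.VENDOR),
    renderer: gl.getParameter(ext ? ext.UNMASKED_RENDERER_WEBGL : gl.RENDERER),
  };
}
```

Note that `WEBGL_debug_renderer_info` is restricted in some browsers, but Chrome (the reported browser) still exposes it.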
Trace-20240930T110703.json.zip