lobe / web-bootstrap

Bootstrap your Lobe machine learning model with our web starter project.
https://lobe.ai
MIT License

lobe web-bootstrap never loads. #24

Closed rcolepeterson closed 3 years ago

rcolepeterson commented 3 years ago

Cloned and deployed the web-bootstrap app on Netlify. It works.

I then updated the model, tested it locally, exported it, and re-deployed, but now it does not work. It just loads forever.

The UI appears with no errors, but there is just loading text in the bottom left.

In the console there is this message: imageClassificationModel.ts:112 Model not loaded, please await this.load() first.

URL - https://naughty-bose-d279c2.netlify.app/

Any help would be great. If I can't get this to work online, I can't use it.

mbeissinger commented 3 years ago

Can you run the updated (latest export) model locally? Is it in the public/model folder, and is the app looking in that folder? Double-check by printing these values from https://github.com/lobe/web-bootstrap/blob/master/src/components/App.tsx#L14:

const signatureFile = process.env.PUBLIC_URL + "/model/signature.json";
const modelFile = process.env.PUBLIC_URL + "/model/model.json";
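
If it helps, a quick way to verify the paths is to log them and fetch both files directly from the deployed site. This is a minimal debugging sketch, not part of the starter; the fetch checks are just illustrative:

```ts
// Hypothetical debugging snippet for src/components/App.tsx:
// log the resolved paths and confirm both files are actually served.
const signatureFile = process.env.PUBLIC_URL + "/model/signature.json";
const modelFile = process.env.PUBLIC_URL + "/model/model.json";

console.log("signatureFile:", signatureFile);
console.log("modelFile:", modelFile);

fetch(signatureFile)
  .then((res) => console.log("signature.json status:", res.status))
  .catch((err) => console.error("signature.json fetch failed:", err));
fetch(modelFile)
  .then((res) => console.log("model.json status:", res.status))
  .catch((err) => console.error("model.json fetch failed:", err));
```

A 404 on either URL would point to a pathing problem; a 200 on both suggests the files are deployed correctly.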

rcolepeterson commented 3 years ago

I don't think it is a pathing issue. I think it is a model-size issue. (Again, I am just doing a simple test app: hat / no hat.)

It loads on the web on "fast" computers but does not load on "slow" computers.

Here is the test url ... https://naughty-bose-d279c2.netlify.app/

https://naughty-bose-d279c2.netlify.app/model/signature.json https://naughty-bose-d279c2.netlify.app/model/model.json

From what I can tell in your code, you check for the model twice in the predict method and then give up if it is not there. Does that sound right?

On faster connections the model is there in time, but on slower connections it is not. The app calls the predict method twice and then gives up.

That is why there are two console statements that say "Model not loaded, please await this.load() first."
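
For reference, a guard of roughly this shape would produce that warning. This is a hypothetical sketch of the pattern described above, not the actual imageClassificationModel.ts code: predict() bails out immediately instead of waiting for the model to finish loading.

```ts
// Hypothetical sketch (not the actual imageClassificationModel.ts code) of a
// predict() guard that produces the "Model not loaded" warning on slow connections.
import * as tf from "@tensorflow/tfjs";

class ImageClassificationModel {
  private model?: tf.LayersModel;

  async load(modelFile: string) {
    this.model = await tf.loadLayersModel(modelFile);
  }

  predict(image: tf.Tensor) {
    if (!this.model) {
      // On a slow connection load() has not resolved yet, so early predict()
      // calls just log this message and return nothing.
      console.error("Model not loaded, please await this.load() first.");
      return undefined;
    }
    return this.model.predict(image);
  }
}
```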

rcolepeterson commented 3 years ago

@mbeissinger do you have an example of the web version working online with something more than the hello world / thumbs up, thumbs down example? Or does it just not work, and should I move on to a different web solution? Thanks for your time.

Westy-Dev commented 3 years ago

@rcolepeterson I have had a similar issue with the model taking a while to load. In some cases it never loads and I just refresh the page. Eventually, after refreshing and waiting, it does load. I have tested this on the Brave and Chrome browsers, and work colleagues have also gotten it to load. Unfortunately it does not always load the first time, no matter how long you leave it in the "loading..." state. I am also using Netlify to host.

shreesha345 commented 3 years ago

I am also having the same issue, please can someone help me 😭

salmanfarisvp commented 3 years ago

@rcolepeterson I'm also facing the same issue. Webapp: https://naughty-bose-d279c2.netlify.app/ , source: https://gitlab.com/salmanfarisvp/microsoftlobe-mask-detection

rcolepeterson commented 3 years ago

@salmanfarisvp I don't think this repo is being maintained anymore. I have moved on. Good luck. I would go with Google's Teachable Machine.

mbeissinger commented 3 years ago

Hi all! This should be fixed with #25; I tested using Netlify as well. The fix loads the model on the predict() call if it hasn't finished loading yet.
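
For anyone stuck on an older export, the lazy-load pattern looks roughly like this (a sketch under my own assumptions, not the exact code merged in #25):

```ts
// Sketch of the lazy-load approach: if the model has not finished loading when
// predict() is called, await load() here instead of giving up with a warning.
import * as tf from "@tensorflow/tfjs";

class LazyImageClassificationModel {
  private model?: tf.LayersModel;

  constructor(private modelFile: string) {}

  async load() {
    if (!this.model) {
      this.model = await tf.loadLayersModel(this.modelFile);
    }
  }

  async predict(image: tf.Tensor) {
    // Waits for the model on slow connections rather than bailing out.
    await this.load();
    return this.model!.predict(image);
  }
}
```

This trades a slower first prediction for never hitting the "Model not loaded" dead end.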

mbeissinger commented 3 years ago

To help with performance, I would recommend using the Speed model for your project because it is much smaller. (Menu > Project Settings > Optimize for Speed).