HighCWu / waifu2x-tfjs

Image super-resolution using TensorFlow.js.
MIT License

Cannot read property 'nInputPlane' of undefined #5

Closed gqgs closed 3 years ago

gqgs commented 3 years ago

The error happens on the line below: https://github.com/HighCWu/waifu2x-tfjs/blob/2a31b376d5d2f43c9890d335251cce9259bd6828/src/lib/predictor.ts#L116

Investigating, it appears the issue is caused by some unexpected behavior in the fetch-progress library combined with the condition below:

https://github.com/HighCWu/waifu2x-tfjs/blob/2a31b376d5d2f43c9890d335251cce9259bd6828/src/lib/predictor.ts#L151

Adding a log to the onProgress callback, I can see that the model is still being downloaded even after this._modelFetchProgress > 1.

onProgress: (progress) => {
    this._modelFetchProgress = Math.max(0, progress.transferred / progress.total - 0.001);
    console.log("onProgress", this._modelFetchProgress, progress.transferred, progress.total, progress.remaining, progress.percentage)
    this._modelFetchCallback(this._modelFetchProgress);
},
onProgress 0.8614218191299012 2097152 2431701 334549 86
onProgress 0.9012865064413758 2194091 2431701 237610 90
onProgress 1.0770272739123767 2621440 2431701 -189739 108
onProgress 1.579496944320046 3843296 2431701 -1411595 158
onProgress 1.9803402223381905 4818027 2431701 -2386326 198

So basically, the error above occurs because predict is called once this._modelFetchProgress >= 0.999999, even though the model is still being fetched.
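The race can be boiled down to the readiness check itself. The sketch below is illustrative (the function name and threshold handling mirror the logged values above, not the actual predictor.ts code): a progress-based guard passes as soon as the reported ratio crosses 1.0, which the logs show happening while bytes are still arriving.

```typescript
// Illustrative sketch of the buggy guard (not the actual predictor.ts code).
// Readiness is inferred from transferred/total, which overshoots 1.0 when
// decompressed bytes are counted against a compressed Content-Length.
function isModelReady(transferred: number, total: number): boolean {
  const progress = Math.max(0, transferred / total - 0.001);
  return progress >= 0.999999;
}

// Values taken from the logged output above:
console.log(isModelReady(2097152, 2431701)); // false — 86%, still downloading
console.log(isModelReady(2621440, 2431701)); // true — yet the fetch is not done
```

With these numbers the guard reports "ready" at 108% while the stream is still open, so predict runs against a partially loaded model.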

Additional info:

HighCWu commented 3 years ago
It looks like the server returned an incorrect file size. I haven't encountered this problem yet. Do you have any ideas on this issue? My guess is that fetch did not get the correct file length.

gqgs commented 3 years ago

Since I'm using GitHub Pages, it would be a bug on GitHub's part if that were the problem, and I'm not sure that's the case. I double-checked with curl and the returned gzipped size seems to be correct.

$ curl --head -H 'Accept-Encoding: gzip' https://gqgs.github.io/3x3-generator/tfjs_models/scale2.0x_model.json  | grep content-length
content-length: 2431701

This is the correct value according to RFC 2616.

It appears to me that the ReadableStream in fetch-progress is accumulating the size of the uncompressed data while the "Content-Length" header refers to the compressed size as described above. This would explain the values being returned by the library.

https://github.com/samundrak/fetch-progress/blob/86663472ac11ead82a6b2cfd62f5465161df02aa/index.js#L23 https://github.com/samundrak/fetch-progress/blob/86663472ac11ead82a6b2cfd62f5465161df02aa/Progress.js#L26

gqgs commented 3 years ago

I created an issue upstream but I'm not sure if that library is being maintained anymore.

HighCWu commented 3 years ago

In that case, maybe I should detect the end of the fetch directly instead of inferring that the download is complete from the reported progress.

gqgs commented 3 years ago

Yeah, I think that will work. Just awaiting the _modelFetchPromise unconditionally should fix the issue.
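The suggested fix can be sketched as follows. This is a minimal mock, not the actual predictor.ts code: the class, field names, and the setTimeout stand-in for the real model fetch are all illustrative. The point is that predict awaits the fetch promise itself, so no progress arithmetic can make it run early.

```typescript
// Illustrative sketch of the fix: gate predict() on the fetch promise
// resolving, not on a progress threshold.
class Predictor {
  private _modelFetchPromise: Promise<string> | null = null;

  private fetchModel(): Promise<string> {
    if (!this._modelFetchPromise) {
      // Stand-in for the real model download; resolves only when the
      // transfer has actually finished, whatever progress reported.
      this._modelFetchPromise = new Promise((resolve) =>
        setTimeout(() => resolve("model-weights"), 10)
      );
    }
    return this._modelFetchPromise;
  }

  async predict(): Promise<string> {
    // Awaiting unconditionally means predict() can never observe a
    // half-downloaded model, regardless of the progress values.
    const model = await this.fetchModel();
    return model;
  }
}

const result = await new Predictor().predict();
console.log(result); // "model-weights"
```

Caching the promise also means concurrent predict calls share one download instead of each starting their own fetch.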

HighCWu commented 3 years ago

Thanks. I'll try to fix it later.

gqgs commented 3 years ago

cdbe616fadcc2a16c454c517d0fb961e8239bc74