tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

loadModel from url doesn't work in Node #410

Closed dsmilkov closed 5 years ago

dsmilkov commented 6 years ago

(Reported by another user, which is why I don't have the stack trace.)

loadModel with a URL path doesn't work in Node. This is most likely because fetch is missing in Node. We should detect the environment and either use Node's built-in HTTP module or conditionally import node-fetch when we are not in the browser.

cc @nsthorat, @tafsiri for ideas on node <--> browser interop.

dsmilkov commented 6 years ago

I think the best solution is to import node-fetch conditionally and add an 'ignore' flag to rollup so this import is skipped when making a browser bundle, just like we do for crypto (rollup config), which came from our seedrandom dependency (see seedrandom's conditional import here).
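
A minimal sketch of what that conditional import could look like (the systemFetch name is illustrative, not actual tfjs code):

// Browser: use the built-in fetch. Node: fall back to node-fetch.
// Rollup would be configured to ignore the require('node-fetch') call
// when building the browser bundle, just like the crypto require that
// comes from seedrandom.
let systemFetch;
if (typeof fetch !== 'undefined') {
  systemFetch = fetch;  // browser, or any environment with a global fetch
} else {
  systemFetch = require('node-fetch');  // Node.js
}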

nsthorat commented 6 years ago

That's one solution. Another is to extend the backend, or create a "platform" (node vs. browser) with a few methods that we override (fromPixels, fetch, etc.). This seems a little cleaner than sprinkling conditional imports.
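
A rough sketch of the platform idea (all names here are illustrative, not an actual tfjs API):

// Each environment supplies one object implementing the few methods that differ.
const BrowserPlatform = {
  fetch: (path, init) => window.fetch(path, init),
  // fromPixels and other DOM-dependent methods would live here too.
};

const NodePlatform = {
  fetch: (path, init) => require('node-fetch')(path, init),
  // fromPixels could throw, since plain Node has no canvas.
};

// Picked once at startup; the environment test is illustrative.
const platform = typeof window === 'undefined' ? NodePlatform : BrowserPlatform;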

caisq commented 6 years ago

@dsmilkov @nsthorat The IOHandlerRegistry is designed exactly to accommodate this kind of environment-dependent handling of URL schemes. In particular, the http:// URL scheme will be "routed" to different IOHandler implementations depending on whether the environment is the browser or Node.js. This issue can be regarded as a duplicate of https://github.com/tensorflow/tfjs/issues/343, which is underway. The status is that the file:// handler has been implemented for Node.js, and the http:// and https:// handlers will follow soon. So I will close this issue now.
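
For readers unfamiliar with the registry, a rough sketch of the scheme-based routing described above (function names are illustrative, not the actual IOHandlerRegistry internals):

// Map a URL scheme to a handler factory; each environment registers only
// the handlers it actually supports.
const schemeRegistry = new Map();

function registerScheme(scheme, handlerFactory) {
  schemeRegistry.set(scheme, handlerFactory);
}

function getHandlerForURL(url) {
  const scheme = url.slice(0, url.indexOf('://') + 3);  // e.g. 'http://'
  const factory = schemeRegistry.get(scheme);
  if (factory == null) {
    throw new Error(`no io handler registered for ${scheme}`);
  }
  return factory(url);
}

// In Node.js, only Node-appropriate handlers would be registered, e.g.:
// registerScheme('file://', url => new NodeFileSystemHandler(url));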

dsmilkov commented 6 years ago

Sounds good! It would be great if browser-specific handlers only get registered in the browser. That way, in Node it would say "no io handler registered for http" instead of using the browser-specific one.

tafsiri commented 6 years ago

I think I am in favour of providing some way to pass a function that can do things like override fetch. This issue intersects with ones like https://github.com/tensorflow/tfjs/issues/272, where on platforms like Ionic or React Native (hybrid web/native platforms) the developer may want to load from some local store that we can't know a priori how to load from (and where fetch isn't implemented). Allowing callbacks that let a user pull the necessary resources from whatever platform they run tfjs on could be quite useful.

@caisq what do you think of this case, is there another way to handle it?

caisq commented 6 years ago

Thanks, @tafsiri, for the comment. How about an API like the following:

  1. If the user doesn't need to override fetch for environments like React Native, the fetch method will be selected automatically under the hood, based on whether the environment is the browser or Node.js. In either case, the following code will work:

    const model = await tf.loadModel('http://foo/path/to/model.json');
  2. If the user needs to override fetch, the following API can be used:

    const model = await tf.loadModel(tf.io.httpRequest('http://foo/path/to-model.json', {fetch: myCustomFetch}));

tafsiri commented 6 years ago

@caisq In the example above, does getting the initial JSON file also use myCustomFetch? It looks like it will do a regular HTTP request, which may throw a user off if they know they can't do a 'local' HTTP request. And if the first param doesn't have 'http' in it, will it work?

Is this the point where a user needs to implement their own IOHandler? Is there a way to implement this minimally, in a way that controls how each type of resource is loaded while delegating as much as possible to our existing code?

Something like

const model = await tf.loadModel('path/to/manifest.json', {
    load: myCustomLoadFunc, // what is the required signature for myCustomLoadFunc?
});

I imagine myCustomLoadFunc being responsible for getting either a JSON string or a binary blob to our loadModel code.

or

const model = await tf.loadModel(tf.io.customIO('path/to/manifest.json', {load: loadFunc}));

Though IMO this is less concise.

gabrielfreire commented 6 years ago

Hi guys, I don't even remember how I got to this issue, but as a tfjs user and Node.js lover I would love to be able to do a simple

const model = await tf.loadModel('path/to/model.json');
// or
const model = await tf.loadModel('https://www.path.com/to?my=model.json');
// or
const model = await tf.loadModel('file://path/to/model.json');

and let the library do the guessing. Sorry for my ignorance, I'm not familiar with the tfjs code for this method, but couldn't you just have something like

async function loadModel(url: string): Promise<SomeModelClass> {
   if (typeof document === 'undefined') { // or something else to figure out whether there is a browser
       // Node.js land
       // maybe use some package to extract url metadata?
       if (!/\.json$/.test(url)) throw new Error('some error');  // note: match() returns null, not -1
       if (/^file:/.test(url)) console.log(`It's a file protocol request, maybe use the node fs module?`);
       else if (/^https?:/.test(url)) console.log(`It's an http request, probably use the request module or some cool library`);
       else console.log(`It's a local request, fs module?`);
       return someModelClassInstance;
   }
}

?

Sorry again

caisq commented 6 years ago

Currently (v0.11.6), only the file:// URL scheme works in tfjs-node.

Absolute path example:

const model = await tf.loadModel('file:///tmp/path/to/model.json');
// Notice the three slashes after the file:. The first two belong to the scheme. The last one belongs
// to the absolute file path.

Relative path example:

const model = await tf.loadModel('file://./path/to/model.json');

We are working on the http:// and no-scheme variants in Node.js.

hyun-yang commented 6 years ago

@caisq @tafsiri Hi guys,

Actually, I raised the issue tf.loadModel not working in ionic #272.

I tested it using tf.loadModel('file://./path/to/model.json') and tf.loadModel('file:///tmp/path/to/model.json');

Both of them got the same error message "TypeError: Failed to fetch". I already uploaded a sample ionic project: Tensorflow Pre-Trained Model Import in Ionic Demo

Essentially, hybrid app developers want to load a pretrained model from a local path.

caisq commented 6 years ago

@hyun-yang I'm pretty sure saving and loading models with file:// is working with the latest versions of @tensorflow/tfjs and @tensorflow/tfjs-node. I wrote a simple example at: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

hyun-yang commented 6 years ago

@caisq I'm not sure if we are on the same page.

As I mentioned in "tf.loadModel not working in ionic #272"

"What I really want to do is, I want to load this model from local folder( In this case from assets/model/ ) not using http server.

In ionic (I think lots of hybrid app platforms have the same build process), when a developer builds a native app for Android, iOS, or Windows, they might want to load the model from a local folder that is already packaged inside the output file (apk, ipa), not from an http server.

It'd be great if we had an API like tf.loadModelFromLocal.

Thanks."

So, I don't think I need to install @tensorflow/tfjs-node as well.

I hope this makes sense to you.

P.S. I tested your example with node main.js and it shows the result; however, that's not what I'm talking about:

    Tensor [[0.2704779, 0.2301091, 0.21263, 0.2867831], [0.2704779, 0.2301091, 0.21263, 0.2867831]]

caisq commented 6 years ago

@hyun-yang Thanks for the clarification. You are right. I overlooked the fact that you are not in Node.js but in ionic. Loading in ionic is currently not directly supported. We plan to support it through the custom fetch configuration to tf.io.httpRequest, as I wrote above. For now, you may need to write a custom IOHandler implementation. If you need an example, you can look at this code in tfjs-node: https://github.com/tensorflow/tfjs-node/blob/master/src/io/file_system.ts#L26
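
For reference, a minimal sketch of such a custom IOHandler's load() (readAssetAsText and readAssetAsBinary are hypothetical stand-ins for whatever platform API reads bundled assets):

const assetIOHandler = {
  async load() {
    // model.json from the converter holds the topology plus a weights
    // manifest describing the binary shard files.
    const modelJSON = JSON.parse(await readAssetAsText('assets/model/model.json'));
    const weightData = await readAssetAsBinary('assets/model/group1-shard1of1.bin');
    return {
      modelTopology: modelJSON.modelTopology,
      weightSpecs: modelJSON.weightsManifest[0].weights,
      weightData,  // ArrayBuffer with the concatenated weight values
    };
  },
};

// Usage (inside an async function):
// const model = await tf.loadModel(assetIOHandler);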

hyun-yang commented 6 years ago

@caisq

We plan to support it through the custom fetch configuration to tf.io.httpRequest as I wrote above

Sounds good, and I'll have a look at the tfjs-node code you mentioned.

Thanks.

limscoder commented 5 years ago

I get a fetch error even when using a local file, running tfjs 0.13.1 and node 8.11.

The model was saved from Keras with the Python package:

    tfjs.converters.save_keras_model(model, path)

Loading it in Node fails:

model = await tf.loadModel('file:///absolute/path/to/model.json');

(node:71934) UnhandledPromiseRejectionWarning: Error: browserHTTPRequest is not supported outside the web browser without a fetch polyfill.
    at new BrowserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:46:19)
    at Object.browserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:247:12)
    at Object.<anonymous> (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:98:50)
    at step (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at /Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:17:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:13:12)
    at Object.loadModelInternal (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:92:12)
    at Object.loadModel (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/exports.js:17:21)

Update -- I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

schipiga commented 5 years ago

@limscoder looks like @tensorflow/tfjs-node@0.1.17 isn't OK. For me it works with the previous version, 0.1.16.

├─┬ @tensorflow/tfjs@0.13.1
│ ├─┬ @tensorflow/tfjs-converter@0.6.1
│ ├─┬ @tensorflow/tfjs-core@0.13.2
│ └── @tensorflow/tfjs-layers@0.8.1
├─┬ @tensorflow/tfjs-node@0.1.16

Looks like the error happens because they started to use @tensorflow/tfjs as a dependency, not a devDependency: https://github.com/tensorflow/tfjs-node/commit/c3e1e0659ca8ab3eeac07ff77caa711bf3cb44a9#diff-b9cfc7f2cdf78a7f4b91a753d10865a2R40

Alternatively, use the same version of @tensorflow/tfjs as is specified in the dependencies of @tensorflow/tfjs-node@0.1.17:

➜  tfjs-node-doodle git:(master) ✗ npm list|grep tensorflow
├─┬ @tensorflow/tfjs@0.12.7
│ ├─┬ @tensorflow/tfjs-converter@0.5.9
│ ├─┬ @tensorflow/tfjs-core@0.12.17
│ └── @tensorflow/tfjs-layers@0.7.5
├─┬ @tensorflow/tfjs-node@0.1.17
│ ├── @tensorflow/tfjs@0.12.7 deduped
➜  tfjs-node-doodle git:(master) ✗ node main.js 
_____________________________________________________________
Layer (type)                 Output shape              Param #   
=================================================================
dense_Dense1 (Dense)         [null,10]                 60        
_________________________________________________________________
dense_Dense2 (Dense)         [null,4]                  44        
=================================================================
Total params: 104
Trainable params: 104
Non-trainable params: 0
_________________________________________________________________
Tensor
    [[0.3059654, 0.2283318, 0.1902294, 0.2754734],
     [0.3059654, 0.2283318, 0.1902294, 0.2754734]]
(node:30346) Warning: N-API is an experimental feature and could change at any time.
{ modelArtifactsInfo: 
   { dateSaved: 2018-10-01T13:21:02.113Z,
     modelTopologyType: 'JSON',
     modelTopologyBytes: 1006,
     weightSpecsBytes: 248,
     weightDataBytes: 416 } }
Tensor
    [[0.3059654, 0.2283318, 0.1902294, 0.2754734],
     [0.3059654, 0.2283318, 0.1902294, 0.2754734]]
➜  tfjs-node-doodle git:(master) ✗

insensitive commented 5 years ago

@caisq

We plan to support it through the custom fetch configuration to tf.io.httpRequest as I wrote above

Sounds good, and I'll have a look at the tfjs-node code you mentioned.

Thanks.

Hello @hyun-yang, I am facing a similar problem. Did you find any workaround for this? As of now I have to host the files on some server.

Thanks

hpssjellis commented 5 years ago

Just thought I should add this as an alternative in case the above solutions are not working for you. I found it on Stack Overflow; it deals with loading local files for use with Ionic and may also work for PhoneGap.

https://stackoverflow.com/questions/50224003/tensorflowjs-in-ionic/55306342#55306342

Use a polyfill (https://github.com/github/fetch) and replace the global fetch:

import {fetch as fetchPolyfill} from 'whatwg-fetch';

window.fetch = fetchPolyfill;

Now, it's possible to load local files (file:///) like:

const modelUrl = './model.json'

const model = await tf.loadGraphModel(modelUrl);

nsthorat commented 5 years ago

We're working on a change that should fix this across the board, without a fetch polyfill in Node: https://github.com/tensorflow/tfjs-core/pull/1648

caisq commented 5 years ago

FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release.

Code sample:

package.json looks like:

{
    "devDependencies": {
        "@tensorflow/tfjs-node": "^1.0.2"
    }
}

Node.js code looks like:

const tf = require('@tensorflow/tfjs-node');

(async function() {
    const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
    const model = await tf.loadLayersModel(modelURL);
    model.summary();
})();

AlberErre commented 5 years ago

FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release.

Code sample:

package.json looks like:

{
    "devDependencies": {
        "@tensorflow/tfjs-node": "^1.0.2"
    }
}

Node.js code looks like:

const tf = require('@tensorflow/tfjs-node');

(async function() {
    const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
    const model = await tf.loadLayersModel(modelURL);
    model.summary();
})();

Same behaviour here ^^

In case you are using tfjs-node, updating from ^0.1.21 to ^1.0.2 solved the issue for me.

thank you @caisq

aminBenSlimen commented 3 years ago

[Workaround, I guess] Hello everyone. I'm facing a problem when trying to load my model in Ionic 5: it works in the browser but won't on Android (I'm using ml5.js, but it's the same thing, since it's based on TensorFlow.js). My solution is simply moving all my loading code into index.html; see the sketch below. Hope someone finds this useful.
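
A hedged sketch of that workaround, assuming a converted model bundled under assets/ (the CDN URL and paths are illustrative):

<!-- index.html: load tfjs from a CDN and fetch the model in a plain script tag -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script>
  tf.loadLayersModel('assets/model/model.json')
      .then(model => { window.model = model; })  // hand the model to the app code
      .catch(err => console.error('model load failed', err));
</script>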

fernando12170209 commented 3 years ago

Just thought I should add this as an alternative in case the above solutions are not working for you. I found it on Stack Overflow; it deals with loading local files for use with Ionic and may also work for PhoneGap.

https://stackoverflow.com/questions/50224003/tensorflowjs-in-ionic/55306342#55306342

Use a polyfill (https://github.com/github/fetch) and replace the global fetch:

import {fetch as fetchPolyfill} from 'whatwg-fetch';

window.fetch = fetchPolyfill;

Now, it's possible to load local files (file:///) like:

const modelUrl = './model.json'

const model = await tf.loadGraphModel(modelUrl);

Help!!

When I run the same code, this appears in the browser: "Not allowed to load local resource" [screenshot]

This is the message in the console: [screenshot]