Hi!
tf.loadModel is not working from a local folder (i.e. assets); however, the web version works when it runs on http://localhost:8100/ionic-lab
You should check #257, I think it will help you!
@timotheebernard Thanks for your response. Btw, let me clarify this feature request.
It works when it runs on my local server http://localhost:8100/ionic-lab.
As you mentioned,
https://github.com/tensorflow/tfjs/issues/257
when running your code you should see the following error: Fetch API cannot load [...]/model.json. URL scheme must be "http" or "https" for CORS request. You need to serve your model via a http server that allows CORS request for loading it.
What I really want to do is load this model from a local folder (in this case from assets/model/) without using an HTTP server.
In Ionic (I think lots of hybrid app platforms have the same build process), when a developer builds a native app for Android, iOS, or Windows, they might want to load the model from a local folder that is already packaged inside the output file (apk, ipa), not via an HTTP server.
It'd be great if we had an API like tf.loadModelFromLocal.
Thanks.
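For context on what such an API might do internally: tf.loadLayersModel also accepts a custom tf.io handler, i.e. any object whose load() resolves to { modelTopology, weightSpecs, weightData }. Below is a minimal, untested sketch of such a handler that reads a packaged model with XMLHttpRequest instead of fetch(). The 'assets/model' path, the helper names, and the assumption that the Cordova/Ionic webview lets XHR read packaged assets are all hypothetical; this is not something tfjs provides out of the box.
import * as tf from '@tensorflow/tfjs';

// Read a URL with XMLHttpRequest; responseType is 'json' or 'arraybuffer'.
function xhrLoad(url, responseType) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.responseType = responseType;
    xhr.onload = () => resolve(xhr.response);
    xhr.onerror = () => reject(new Error('Failed to load ' + url));
    xhr.send();
  });
}

// Custom IOHandler: load() must resolve to { modelTopology, weightSpecs, weightData }.
function localAssetsHandler(baseUrl) {
  return {
    load: async () => {
      const modelJSON = await xhrLoad(baseUrl + '/model.json', 'json');
      const manifest = modelJSON.weightsManifest;
      const weightSpecs = [].concat(...manifest.map(g => g.weights));
      const paths = [].concat(...manifest.map(g => g.paths));
      // Load every shard in manifest order and concatenate the bytes.
      const buffers = await Promise.all(
          paths.map(p => xhrLoad(baseUrl + '/' + p, 'arraybuffer')));
      const total = buffers.reduce((n, b) => n + b.byteLength, 0);
      const weightData = new Uint8Array(total);
      let offset = 0;
      for (const b of buffers) {
        weightData.set(new Uint8Array(b), offset);
        offset += b.byteLength;
      }
      return {
        modelTopology: modelJSON.modelTopology,
        weightSpecs: weightSpecs,
        weightData: weightData.buffer
      };
    }
  };
}

// Hypothetical usage inside the app:
// const model = await tf.loadLayersModel(localAssetsHandler('assets/model'));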
@hyun-yang to help us understand this use case, could you describe how you would generally load a file from a local folder in ionic?
@tafsiri Thanks for your concern, I'll let you know when I've uploaded a test project for this.
@tafsiri Just uploaded a demo project: https://github.com/hyun-yang/tfjsionicdemo
Hi, any progress on this problem?
Is there any workaround?
As far as I can tell from this thread and the referenced ones, loading the model from a local folder in Ionic (as @hyun-yang mentioned) is still not supported when launching on the device, although it works when running on localhost.
Is there any progress in this issue or if it is solved, could you explain how to do it?
@gabrielglbh loadModel in the Node.js version of TF.js supports file:// URLs. Does that work for Ionic by any chance?
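For reference, a minimal sketch of the Node.js path described here, assuming a plain tfjs-node environment (not an Ionic webview); the model path is hypothetical:
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');  // per the comment above, enables file:// URLs for loadModel

async function loadFromDisk() {
  // model.json and its weight shards are resolved from the local file system.
  return tf.loadModel('file:///path/to/model/model.json');
}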
@caisq When installing tfjs-node v0.3.0 in the project I get this error: Cannot find module './tfjs_binding'. So I cannot test whether it works or not in Ionic 3 with the Node version of tfjs. Once that issue is solved, I will report back here whether it works.
Following the above comment and as an update, I've tried to import tfjs-node 0.3.0 into my Ionic 3 app without success:
import * as tf from '@tensorflow/tfjs';
require('@tensorflow/tfjs-node');
const tfModel = tf.loadModel('/assets/model.json');
The code above gives me the following error: 'The "original" argument must be of type function'. It appears to be caused by adding the require() line.
I've also tried to import only the tfjs-node version:
import * as tf from '@tensorflow/tfjs-node';
const tfModel = tf.loadModel('/assets/model.json');
But the following error appears: Cannot find module './tfjs_binding'.
So am I doing something wrong when using the tfjs-node version? Or does Ionic still not support loading models from local files on device using tfjs?
+Nick Kreeger for thoughts on the build error related to './tfjs_binding'. I think this started to appear in the latest release of tfjs-node (0.3.0).
Well, after trying for two weeks with different versions of tfjs-node and tfjs, loading a model from the local assets folder does not work with Ionic & Angular when deployed on a device.
Latest version I've tried:
The error I am getting on my device when trying to load the model is: Based on the provided shape, [3,3,32,32], the tensor should have 9216 values but has 1119. I found out that this error is due to the weightManifest files not loading correctly from the local folder, meaning the model itself loads correctly but, for reasons I am not aware of, the weights do not. Note: the shards are located in the same folder as the model.
TensorFlow.js loads models via fetch(). fetch() does not support loading local files. https://fetch.spec.whatwg.org/
To make this happen, I used this workaround in a Cordova project:
Import a polyfill (https://github.com/github/fetch) and replace the global fetch:
window.fetch = fetchPolyfill;
Now, it's possible to load local files (file:///) like:
const modelUrl = './model.json'
const model = await tf.loadGraphModel(modelUrl);
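Putting the pieces above together, a consolidated sketch of that workaround; the relative './model.json' path and the ability of the polyfill (which is built on XMLHttpRequest) to read packaged file:/// assets are assumptions carried over from the description above:
import * as tf from '@tensorflow/tfjs';
import { fetch as fetchPolyfill } from 'whatwg-fetch';

// Replace the native fetch so tfjs goes through the XHR-based polyfill.
window.fetch = fetchPolyfill;

async function loadPackagedModel() {
  // Resolved relative to the packaged index.html inside the app bundle.
  return tf.loadGraphModel('./model.json');
}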
@b-lack Thank you for the workaround. I have tried it, but I still cannot get it right. Looking at the links and documentation provided, I have done the following:
import * as tf from '@tensorflow/tfjs';
import { fetch as fetchPolyfill } from 'whatwg-fetch';

constructor() {
  window.fetch = fetchPolyfill;
  modelJSON = '/assets/model.json';
  const model = await tf.loadLayersModel(modelJSON);
}
When deploying on my Android device using Cordova, I get the same error: Based on the provided shape, [3,3,32,32], the tensor should have 9216 values but has 1119.
I have also tried loading the model with tf.loadGraphModel(), but then I get the following error: Uncaught TypeError: Cannot read property 'producer' of undefined. I looked it up in issue #1432 and it seems the correct way to load a Keras model (like mine) with tfjs is tf.loadLayersModel().
So I am wondering if the import of the polyfill is wrong or something, as the weight shards of the model still don't load correctly even with the global fetch overwritten.
@gabrielglbh I don't know if your error message is related to the import. I tried this polyfill only with GraphModels.
For LayersModels I would recommend loading the model once from a server (https://...) and saving it locally in localstorage or indexeddb: https://www.tensorflow.org/js/guide/save_load
const model = await tf.loadLayersModel('https://foo.bar/tfjs_artifacts/model.json');
const saveResult = await model.save('localstorage://my-model-1');
From then on, you can load it from localstorage/indexeddb:
const model = await tf.loadLayersModel('localstorage://my-model-1');
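The same pattern from the save/load guide linked above also works with the indexeddb:// scheme, which is generally less size-constrained than localstorage; the model name here is arbitrary:
// Save once after loading from the network ...
const model = await tf.loadLayersModel('https://foo.bar/tfjs_artifacts/model.json');
await model.save('indexeddb://my-model-1');
// ... then load it offline on later app starts.
const cachedModel = await tf.loadLayersModel('indexeddb://my-model-1');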
Any follow-up on this issue? @gabrielglbh Thanks!
TensorFlow.js loads models via fetch(). fetch() does not support loading local files. https://fetch.spec.whatwg.org/
To make this happen, I used this workaround in a Cordova project:
Import a polyfill (https://github.com/github/fetch) and replace the global fetch:
window.fetch = fetchPolyfill;
Now, it's possible to load local files (file:///) like:
const modelUrl = './model.json'
const model = await tf.loadGraphModel(modelUrl);
I consider this very important. However, I want to add some information here.
For running on devices you need the version from the release section https://github.com/github/fetch/releases
Once integrated with
<script type="text/javascript" src="cordova.js"></script>
<script type="text/javascript" src="js/fetch.umd.js"></script>
you can overwrite window.fetch directly when the app initializes with:
initialize: function() {
  window.fetch = WHATWGFetch.fetch;
  document.addEventListener('deviceready', this.onDeviceReady.bind(this), false);
}
and load the model via
async function loadModel() {
  try {
    tfModelCache = await tf.loadGraphModel('model.json');
    return tfModelCache;
  } catch (err) {
    console.log(err);
  }
}
Now I am having the problem that the path to the shards in model.json is no longer correct. However, at least the fetch method is working.
Automatically closing due to lack of recent activity. Please update the issue when new information becomes available, and we will reopen the issue. Thanks!
1. Rename the model file group1-shard1of1 to group1-shard1of1.bin.
2. Define a custom fetch function:
function customFetchFunc(url, o) {
  if (url === 'model/group1-shard1of1') {
    return self.fetch(`${url}.bin`, o);
  }
  return self.fetch(url, o);
}
3. Set fetchFunc (a variant that handles multiple shards is sketched below):
const nsfwModel = await tf.loadLayersModel('models/mobilenet_v2/model.json', { fetchFunc: customFetchFunc });
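If the model is split into several shards, the same idea can be generalized by rewriting every shard request instead of hard-coding one path; the group\d+-shard\d+of\d+ pattern below assumes the converter's default shard naming:
// Append '.bin' to any weight-shard request; pass everything else through.
function customFetchFunc(url, o) {
  if (/group\d+-shard\d+of\d+$/.test(url)) {
    return self.fetch(`${url}.bin`, o);
  }
  return self.fetch(url, o);
}

const model = await tf.loadLayersModel('models/mobilenet_v2/model.json',
    { fetchFunc: customFetchFunc });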
TensorFlow.js version
0.10.0
Browser version
cli packages: (C:\Users\Administrator\AppData\Roaming\npm\node_modules)
global packages:
local packages:
System:
Describe the problem or feature request
tf.loadModel is not working; it fails to load the model from a local folder (i.e. assets/model). However, the web version works when it runs on http://localhost:8100/ionic-lab
Code to reproduce the bug / link to feature request
here is an error message
and here is a data structure