patlevin / tfjs-to-tf

A TensorFlow.js Graph Model Converter
MIT License

Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype #23

Closed by loretoparisi 3 years ago

loretoparisi commented 3 years ago

Hello, I'm trying to convert this TFJS GraphModel:

tfconv.loadGraphModel(
      'https://tfhub.dev/tensorflow/tfjs-model/toxicity/1/default/1',
      { fromTFHub: true })

The conversion works without any issues with the command tfjs_graph_converter --output_format tf_saved_model ./ ./saved/. But when I try to load the saved model

tf.node.loadSavedModel(this.path);

I get the error

(node:39361) UnhandledPromiseRejectionWarning: Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype
loretoparisi commented 3 years ago

Okay this should be related to this PR https://github.com/tensorflow/tfjs/pull/4008

patlevin commented 3 years ago

I'm not sure I understand what you are trying to do. The model in question is a TFJS model, so there is no need to convert it into a saved model; you can just load it directly into node. The model works OOTB with nodejs and in the browser.

loretoparisi commented 3 years ago

Yes I know, but I need it to be loaded locally, and the current GraphModel API cannot load artifacts from the file protocol, only from URIs. Also, I think it's more efficient to load local files than to fetch them via HTTP...

Btw, it seems the last PR in TFJS fixes this, so we just have to wait for the next release (the current is 2.4.0, where the DT_INT64 problem is).


patlevin commented 3 years ago

Thanks for the explanation! Since I use TF with Python and C++ only, I wasn't aware of this limitation. The converter currently doesn't guarantee to output models that can be loaded by TFJS. I could add a flag to keep the data types compatible, though, if that helps.

patlevin commented 3 years ago

@loretoparisi I have bad news on this one. Unfortunately you'll have to wait for the TFJS team to release an update. Keeping the types compatible would require rewriting the entire graph, identifying and optimising away redundant nodes (e.g. type cast operations), and risking differences in behaviour (though I guess overflow/underflow issues are unlikely, but still).

The reason is that if I change the weight node types, all operations that use them as inputs will need to have their input- and output types changed as well. The latter is the actual problem since the type change now cascades to all nodes connected to that node...

I still might give it a try just for exercise, but don't count on it.
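The cascade Patrick describes can be sketched with a toy graph structure. This is a hypothetical representation (simple node records; real GraphDef inputs also carry output-slot suffixes like ":0"), not the converter's actual code, but it shows why a single dtype change cannot stay local: every transitive consumer of the changed node must be rewritten too.

```javascript
// Toy graph: a DT_INT64 weight feeding an Identity, which feeds an Add.
const graph = [
  { name: "weights", op: "Const",    dtype: "DT_INT64", inputs: [] },
  { name: "ident",   op: "Identity", dtype: "DT_INT64", inputs: ["weights"] },
  { name: "sum",     op: "Add",      dtype: "DT_INT64", inputs: ["ident", "ident"] },
];

// Change `start`'s dtype and propagate the change to all transitive consumers.
function cascadeDtype(graph, start, newDtype) {
  const pending = [start];
  while (pending.length > 0) {
    const name = pending.pop();
    const node = graph.find((n) => n.name === name);
    if (node.dtype === newDtype) continue; // already rewritten
    node.dtype = newDtype;
    // Every node that consumes this one as an input must change type as well.
    for (const n of graph) {
      if (n.inputs.includes(name)) pending.push(n.name);
    }
  }
  return graph;
}

cascadeDtype(graph, "weights", "DT_INT32");
// After the cascade, all three nodes carry DT_INT32.
```

In the real converter the rewrite is harder than this sketch suggests, because ops constrain which dtypes their inputs and outputs may take, and redundant casts have to be identified and optimised away along the way.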

loretoparisi commented 3 years ago

Thanks a lot Patrick, it makes sense. Hopefully the DT_INT64 support will be ready in a month or so!

patlevin commented 3 years ago

@loretoparisi Good news! I managed to solve the problem by converting incompatible inputs in the graph. I tested the result in tf.node and the converted model loaded just fine.

The new version will be available on PyPI in just a few moments.

loretoparisi commented 3 years ago

@patlevin wow, that's amazing!!! πŸ’― πŸ₯‡ I will test as well and back to you!

loretoparisi commented 3 years ago

@patlevin Just to be sure we are working on the same model.

mbploreto:toxicity_model loretoparisi$ tfjs_graph_converter --output_format tf_saved_model ./ ./saved/
TensorFlow.js Graph Model Converter

Graph model:    ./
Output:         ./saved/
Target format:  tf_saved_model

Converting.... Done.
Conversion took 1.775s
mbploreto:toxicity_model loretoparisi$ tfjs_graph_converter --version

tfjs_graph_converter 1.4.0

Dependency versions:
    tensorflow 2.3.1
    tensorflowjs 2.4.0

I have downloaded the model from TFHub and then run the conversion.

This is my JavaScript test:

const tfjsnode = require('@tensorflow/tfjs-node');
const tfconv = require("@tensorflow/tfjs-converter");

var loadGraphModel = function (url) {
  return new Promise(function (resolve, reject) {
    tfconv.loadGraphModel(url,
      { fromTFHub: true })
      .then(res => {
        console.log("loadGraphModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
}
var loadSavedModel = function (path) {
  return new Promise(function (resolve, reject) {
    tfjsnode.node.loadSavedModel(path)
      .then(res => {
        console.log("loadSavedModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
}
loadGraphModel('https://tfhub.dev/tensorflow/tfjs-model/toxicity/1/default/1')
  .catch(err => console.error("loadGraphModel", err));
loadSavedModel('/Users/loretoparisi/webservice/toxicity_model/saved')
  .catch(err => console.error("loadSavedModel", err));

This is the current output:

$ node load.js
Platform node has already been set. Overwriting the platform with [object Object].
Platform node has already been set. Overwriting the platform with [object Object].
node-pre-gyp info This Node instance does not support builds for N-API version 6
node-pre-gyp info This Node instance does not support builds for N-API version 6
2020-10-05 12:16:46.423579: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-10-05 12:16:46.469215: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1094005d0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-10-05 12:16:46.469265: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version

loadSavedModel Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype
    at mapTFDtypeToJSDtype (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:469:19)
    at /Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:161:57
    at step (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:48:23)
    at Object.next (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:29:53)
    at fulfilled (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:20:58)

loadGraphModel OK

So it seems the problem is still there:

loadSavedModel Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype

but maybe it's my fault at this point...

patlevin commented 3 years ago

@loretoparisi You need to use the compatibility flag:

tfjs_graph_converter --output_format tf_saved_model --compat_mode ./ ./saved/

Compatibility mode is optional; the default behaviour is to keep all types as-is.

loretoparisi commented 3 years ago

@patlevin ah right!!

tfjs_graph_converter --output_format tf_saved_model --compat_mode ./ ./saved/
TensorFlow.js Graph Model Converter

Graph model:    ./
Output:         ./saved/
Target format:  tf_saved_model

Converting.... Done.
Conversion took 1.667s

...
2020-10-05 12:28:55.915996: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /Users/loretoparisi/webservice/toxicity_model/saved
2020-10-05 12:28:55.943074: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-10-05 12:28:56.009469: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2020-10-05 12:28:56.010138: I tensorflow/cc/saved_model/loader.cc:212] The specified SavedModel has no variables; no checkpoints were restored. File does not exist: /Users/loretoparisi/webservice/toxicity_model/saved/variables/variables.index
2020-10-05 12:28:56.010284: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 94285 microseconds.
loadSavedModel OK

Perfect, it works! Thank you! πŸ₯‡