Closed: ksipma closed this issue 6 years ago
Can you attach the actual exception?
```python
model.export_savedmodel(r'EXPORTDIR', export_input_fn)
```

```csharp
var modelFile = File.ReadAllBytes(@"modelDir");
graph.Import(modelFile);
```
The SavedModel format is not just a single model file; it's a directory with a certain structure, see: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md
Try to use TFSession.FromSavedModel to load a model in SavedModel format.
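For reference, the directory layout that `export_savedmodel` produces can be sanity-checked before handing it to `FromSavedModel`. A minimal Python sketch (the helper name is mine, not a TensorFlow API; it only checks the layout described in the README linked above):

```python
import os
import tempfile

def looks_like_saved_model(export_dir):
    """Check for the directory layout a SavedModel export produces:
    a saved_model.pb (or .pbtxt) graph plus a variables/ subdirectory."""
    has_graph = any(
        os.path.isfile(os.path.join(export_dir, name))
        for name in ("saved_model.pb", "saved_model.pbtxt")
    )
    has_variables = os.path.isdir(os.path.join(export_dir, "variables"))
    return has_graph and has_variables

# Build a dummy export directory to demonstrate the check.
root = tempfile.mkdtemp()
open(os.path.join(root, "saved_model.pb"), "wb").close()
os.mkdir(os.path.join(root, "variables"))
print(looks_like_saved_model(root))  # True
```

If this check fails on your export directory, you are probably pointing at a frozen-graph `.pb` file rather than a SavedModel directory.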
Added some information to the documentation
@klaas3 @alex-zu can TFSession.FromSavedModel solve your problem? I tried to use it, but graph["inputs"] is null.
@yiylin Try to use saved_model_cli to discover your model signature:
```
$ saved_model_cli show --dir /tmp/saved_model_dir --tag_set serve --signature_def serving_default
The given SavedModel SignatureDef contains the following input(s):
  inputs['x'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: x:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['y'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: y:0
Method name is: tensorflow/serving/predict
```
Here "name: x:0" means x is the name of the input op, and "name: y:0" means y is the name of the output op.
In TFSharp, use it like:
```csharp
var input = graph["x"][0];
// ...
var runner = session.GetRunner().Fetch("y", 0);
```
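The `name: x:0` values reported by saved_model_cli follow TensorFlow's `op_name:output_index` convention, which is why `graph["x"][0]` pairs the op name with output 0. A small illustrative Python helper (hypothetical, not part of any API) that splits such a name:

```python
def split_tensor_name(name):
    """Split a TensorFlow-style tensor name like 'x:0' into
    (op_name, output_index). A bare op name defaults to index 0."""
    op, sep, idx = name.rpartition(":")
    if not sep:
        return name, 0
    return op, int(idx)

print(split_tensor_name("x:0"))                              # ('x', 0)
print(split_tensor_name("dnn/head/predictions/probabilities:0"))
```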
@alex-zu thanks. Your reply helped me.
hi @alex-zu and @yiylin. Do you have an example of how to use TFSession.FromSavedModel? I have a hard time filling in the metaGraphDef. How do I declare it correctly?
@klaas3 Hello, metaGraphDef is an output parameter and is optional in the native API, but for some reason @migueldeicaza checks it for null. Just pass a dummy TFBuffer.
```csharp
// instance fields
private TFGraph _tfGraph;
private TFSession _tfSession;
...
// load SavedModel
_tfGraph = new TFGraph();
using (var tmpSess = new TFSession(_tfGraph))
using (var tfSessionOptions = new TFSessionOptions())
using (var metaGraphUnused = new TFBuffer())
{
    // for some reason FromSavedModel is not static
    _tfSession = tmpSess.FromSavedModel(tfSessionOptions, null, exportDir, new[] { "serve" }, _tfGraph, metaGraphUnused);
}
```
When I use saved_model_cli, my outputs are y: input_example_tensor, and x has two entries: dnn/head/Tile and dnn/head/predictions/probabilities.
input_example_tensor requires a string, while the regular model should accept 3 floats.
I feel I am not far from the solution; the part above seems to work fine.
> model.export_savedmodel(r'EXPORTDIR', export_input_fn) ... var modelFile = File.ReadAllBytes(@"modelDir"); graph.Import(modelFile);
>
> The SavedModel format is not just a single model file; it's a directory with a certain structure, see: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md
>
> Try to use TFSession.FromSavedModel to load a model in SavedModel format.
So I wanted to ask about the current accuracy of the statement that a SavedModel is not just a single file. There is an example that does this in TensorFlowSharp using the MNIST dataset: Python code trains a model and saves it to a "pb" file, which is then loaded in a C# application using TensorFlowSharp:
```csharp
byte[] buffer = System.IO.File.ReadAllBytes(modelfile);
int countOfFailedClassifications = 0;
using (var graph = new TensorFlow.TFGraph())
{
    graph.Import(buffer);
    using (var session = new TensorFlow.TFSession(graph))
    {
        foreach (string file in filesTestingRandomized)
        {
            Stopwatch sw = new Stopwatch();
            sw.Start();
            var runner = session.GetRunner();
            var tensor = Utils.ImageToTensorGrayScale(file);
            runner.AddInput(graph["conv1_input"][0], tensor);
            runner.Fetch(graph["activation_4/Softmax"][0]);
            var output = runner.Run();
            var vecResults = output[0].GetValue();
            float[,] results = (float[,])vecResults;
            sw.Stop();

            // Evaluate the results
            int[] quantized = Utils.Quantized(results);

            // Use the parent folder name to determine the expected digit
            string parentFolder = System.IO.Directory.GetParent(file).Name;
            int iParentFolder = int.Parse(parentFolder);
            int[] expected = Utils.GetQuantizedExpectedVector(iParentFolder);
            bool success = quantized.SequenceEqual(expected);
            if (!success) countOfFailedClassifications++;
            string message = $"Directory={parentFolder} File={System.IO.Path.GetFileName(file)} Bit1={results[0, 0]} Bit2={results[0, 1]} Elapsed={sw.ElapsedMilliseconds} ms, Success={success}";
            Console.WriteLine(message);
        }
    }
}
```
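The Utils helpers are not shown in that snippet. Assuming Utils.Quantized takes the argmax of the softmax row and Utils.GetQuantizedExpectedVector builds a one-hot vector for the expected digit (both assumptions, not confirmed by the example), the comparison step can be sketched in Python as:

```python
def quantized(results_row):
    """One-hot vector at the argmax of a softmax row
    (assumed behavior of Utils.Quantized)."""
    best = max(range(len(results_row)), key=lambda i: results_row[i])
    return [1 if i == best else 0 for i in range(len(results_row))]

def expected_vector(digit, num_classes=10):
    """One-hot vector for the expected digit
    (assumed behavior of Utils.GetQuantizedExpectedVector)."""
    return [1 if i == digit else 0 for i in range(num_classes)]

probs = [0.01, 0.02, 0.9, 0.03, 0.01, 0.0, 0.01, 0.01, 0.005, 0.005]
print(quantized(probs) == expected_vector(2))  # True
```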
I have been searching all night for how to get a TensorFlow Python-trained model into TensorFlowSharp. I know about freezing the model and importing it. I don't know what I am doing wrong; I think it might be: 1) wrong encoding, 2) not really a frozen model, or 3) an unsupported model (I am trying a simple soccer predictor using the DNNClassifier as an example).
I hope someone can help me solve this. I attached the Python code, my C# code, the current model, and the exported model I made. I am willing to write an article/summary about this workflow, because I find TensorFlow lacking in some parts of its documentation.
python
c#
saved_model.zip