migueldeicaza / TensorFlowSharp

TensorFlow API for .NET languages
MIT License

How to use Python models #265

Closed ksipma closed 6 years ago

ksipma commented 6 years ago

I have been looking all night for a way to get a TensorFlow Python-trained model into TensorFlowSharp. I know about freezing the model and importing it, but I don't know what I am doing wrong. I think it might be: (1) wrong encoding, (2) not really a frozen model, or (3) an unsupported model (I am trying a simple soccer predictor using the DNNClassifier as an example).

I hope someone can help me solve this. I attached my Python code, my C# code, the current model, and the exported model I made. I am willing to write an article/summary about this workflow, because I find TensorFlow's documentation lacking in some areas.

python

#starting the model
model = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[10, 10, 10],
    optimizer=tf.train.FtrlOptimizer(learning_rate=0.01, l1_regularization_strength=0.001),
    config=tf.estimator.RunConfig(model_dir="MODELDIR")
)

#doing some training
for i in range(2):
    model.train(input_fn=trainSet, max_steps=500000)

    accuracy_score = model.evaluate(
        input_fn=testSet,
        steps=500000
    )["accuracy"]
    print('Accuracy: {0:f}'.format(accuracy_score))

#building the serving input function for export
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
export_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

#exporting the model
model.export_savedmodel(r'EXPORTDIR', export_input_fn)

c#

using (var graph = new TFGraph())
{
    var modelFile = File.ReadAllBytes(@"modelDir");

    // here it starts throwing errors about an invalid GraphDef
    graph.Import(modelFile);

    var session = new TFSession(graph);
    var runner = session.GetRunner();
    runner.AddInput(graph["Input/IsTraining"][0], false);
    runner.AddInput(graph["Input/Values"][0], new TensorFlow.TFTensor(TFDataType.Float, new long[] { 1, 44, 60 }, 1 * 44 * 60 * 4));
    runner.Fetch(graph["Output/Predictions"][0]);

    var output = runner.Run();

    // Fetch the results from output:
    TFTensor result = output[0];
}


saved_model.zip

truencoa commented 6 years ago

Can you attach the actual exception?

alex-zu commented 6 years ago

model.export_savedmodel(r'EXPORTDIR', export_input_fn)
...
var modelFile = File.ReadAllBytes(@"modelDir");
graph.Import(modelFile);

The SavedModel format is not just a single model file; it's a directory with a certain structure. See: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md

Try using TFSession.FromSavedModel to load a model in the SavedModel format.
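
For reference, a typical export directory looks roughly like this. Note that Estimator.export_savedmodel writes into a timestamped subdirectory under the export path, so the actual model lives one level down (the timestamp below is illustrative):

EXPORTDIR/
└── 1529000000/                          (timestamp chosen by export_savedmodel)
    ├── saved_model.pb                   (a SavedModel proto, not a plain GraphDef, so graph.Import cannot read it)
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index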

migueldeicaza commented 6 years ago

Added some information to the documentation

yiylin commented 6 years ago

@klaas3 @alex-zu can TFSession.FromSavedModel solve this problem? I tried to use it, but I cannot get it to work: graph["inputs"] is null.

alex-zu commented 6 years ago

@yiylin Try using saved_model_cli to discover your model's signature:

$ saved_model_cli show --dir /tmp/saved_model_dir --tag_set serve --signature_def serving_default
The given SavedModel SignatureDef contains the following input(s):
  inputs['x'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: x:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['y'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: y:0
Method name is: tensorflow/serving/predict

here "name: x" - x is name of input," name: y" - y is name of output.

In TFSharp, use it like:

var input = graph["x"][0];
...
var runner = session.GetRunner().Fetch("y", 0);
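
Putting those pieces together with the TFSession.FromSavedModel call shown further down in this thread, a minimal end-to-end sketch could look like the following. The names "x" and "y", the "serve" tag, the "EXPORTDIR" path, and the (-1, 1) float shape are all assumptions taken from the saved_model_cli output above; substitute whatever your own model reports:

using System;
using TensorFlow;

class SavedModelSketch
{
    static void Main()
    {
        using (var graph = new TFGraph())
        using (var options = new TFSessionOptions())
        using (var metaGraphUnused = new TFBuffer())
        using (var tmpSess = new TFSession(graph))
        // "EXPORTDIR" and the "serve" tag are placeholders; adjust to your export
        using (var session = tmpSess.FromSavedModel(options, null, "EXPORTDIR", new[] { "serve" }, graph, metaGraphUnused))
        {
            var runner = session.GetRunner();
            // shape (-1, 1): a batch of one element with a single float feature
            runner.AddInput(graph["x"][0], new TFTensor(new float[,] { { 0.5f } }));
            runner.Fetch("y", 0);

            var output = runner.Run();
            var y = (float[,])output[0].GetValue();
            Console.WriteLine(y[0, 0]);
        }
    }
}
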
yiylin commented 6 years ago

@alex-zu thanks. Your reply helped me.

ksipma commented 6 years ago

Hi @alex-zu and @yiylin. Do you have an example of how to use TFSession.FromSavedModel? I am having a hard time filling in the metaGraphDef. How do I declare it correctly?

alex-zu commented 6 years ago

@klaas3 Hello, metaGraphDef is an output parameter and is optional in the native API, but for some reason @migueldeicaza checks it for null. Just pass a dummy TFBuffer:

//instance fields
private TFGraph _tfGraph;
private TFSession _tfSession;

...

//load SavedModel
_tfGraph = new TFGraph();

using (var tmpSess = new TFSession(_tfGraph))
using (var tfSessionOptions = new TFSessionOptions())
using (var metaGraphUnused = new TFBuffer())
{
    //for some reason FromSavedModel is not static
    _tfSession = tmpSess.FromSavedModel(tfSessionOptions, null, exportDir, new[] { "serve" }, _tfGraph, metaGraphUnused);
}

ksipma commented 6 years ago

When I use saved_model_cli, my input is input_example_tensor, and there are two outputs: dnn/head/Tile and dnn/head/predictions/probabilities.

input_example_tensor requires a string, while the regular model should accept 3 floats.

I feel I am not far from the solution; the part above seems to work fine.
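
For anyone hitting the same wall: the string that input_example_tensor expects is a serialized tf.Example protocol buffer, because the model was exported with build_parsing_serving_input_receiver_fn. Below is a rough, unverified sketch of feeding it from TensorFlowSharp; the protobuf encoding step is a placeholder, and the output tensor name should be taken from your own saved_model_cli output:

using TensorFlow;

static class ParsingSignatureSketch
{
    // Rough sketch, not verified against this model: the parsing signature
    // takes a serialized tf.Example proto as DT_STRING input. Producing
    // serializedExample requires a protobuf library (e.g. Google.Protobuf
    // with the tensorflow.Example message); that step is elided here.
    public static TFTensor Run(TFSession session, TFGraph graph, byte[] serializedExample)
    {
        var runner = session.GetRunner();
        // CreateString builds a scalar DT_STRING tensor; a batched signature
        // may expect shape [batch] instead, so the shape may need adjusting.
        runner.AddInput(graph["input_example_tensor"][0], TFTensor.CreateString(serializedExample));
        runner.Fetch(graph["dnn/head/predictions/probabilities"][0]);
        return runner.Run()[0];
    }
}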

solarflarefx commented 4 years ago

model.export_savedmodel(r'EXPORTDIR', export_input_fn)
...
var modelFile = File.ReadAllBytes(@"modelDir");
graph.Import(modelFile);

The SavedModel format is not just a single model file; it's a directory with a certain structure. See: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md

Try using TFSession.FromSavedModel to load a model in the SavedModel format.

So I wanted to ask about the current accuracy of the statement that the saved model is not just a single file. There is an example in TensorFlowSharp that does this using the MNIST dataset: Python code generates a model and saves it to a "pb" file, and the "pb" file is then loaded in a C# application using TensorFlowSharp:

byte[] buffer = System.IO.File.ReadAllBytes(modelfile);
int countOfFailedClassifications = 0;
using (var graph = new TensorFlow.TFGraph())
{
    graph.Import(buffer);
    using (var session = new TensorFlow.TFSession(graph))
    {
        foreach (string file in filesTestingRandomized)
        {
            Stopwatch sw = new Stopwatch();
            sw.Start();
            var runner = session.GetRunner();
            var tensor = Utils.ImageToTensorGrayScale(file);
            runner.AddInput(graph["conv1_input"][0], tensor);
            runner.Fetch(graph["activation_4/Softmax"][0]);

            var output = runner.Run();
            var vecResults = output[0].GetValue();
            float[,] results = (float[,])vecResults;
            sw.Stop();

            // Evaluate the results
            int[] quantized = Utils.Quantized(results);

            // Use the parent folder name to determine the expected digit
            string parentFolder = System.IO.Directory.GetParent(file).Name;
            int iParentFolder = int.Parse(parentFolder);
            int[] expected = Utils.GetQuantizedExpectedVector(iParentFolder);
            bool success = quantized.SequenceEqual(expected);
            if (!success) countOfFailedClassifications++;

            string message = $"Directory={parentFolder}    File={System.IO.Path.GetFileName(file)}    Bit1={results[0, 0]} Bit2={results[0, 1]}   Elapsed={sw.ElapsedMilliseconds} ms, Success={success}";
            Console.WriteLine(message);
        }
    }
}
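
For later readers: both statements can be accurate, because two different formats are involved. The MNIST example saves a frozen GraphDef, which really is a single .pb file and can be read with graph.Import, while export_savedmodel produces a SavedModel directory, which must go through TFSession.FromSavedModel. A compact contrast as a sketch (file and directory names are placeholders):

using System.IO;
using TensorFlow;

static class LoadingContrast
{
    // 1) Frozen GraphDef: a single .pb file, loaded with TFGraph.Import.
    public static TFSession LoadFrozenGraph(string pbFile)
    {
        var graph = new TFGraph();
        graph.Import(File.ReadAllBytes(pbFile));
        return new TFSession(graph);
    }

    // 2) SavedModel: a directory, loaded with TFSession.FromSavedModel.
    public static TFSession LoadSavedModel(string exportDir)
    {
        var graph = new TFGraph();
        using (var tmpSess = new TFSession(graph))
        using (var options = new TFSessionOptions())
        using (var metaGraph = new TFBuffer())
        {
            return tmpSess.FromSavedModel(options, null, exportDir, new[] { "serve" }, graph, metaGraph);
        }
    }
}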