migueldeicaza / TensorFlowSharp

TensorFlow API for .NET languages
MIT License

How to serialize the tensor input required by DNNClassifier (serving_input_receiver) #293

Open ksipma opened 6 years ago

ksipma commented 6 years ago

I want to be able to use the DNNClassifier (estimator) on top of IIS, after training it in Python. I have got as far as generating the .pb file and knowing the correct inputs/outputs, but I am stuck in TensorFlowSharp on the string inputs.

I can create a valid .pb file of the iris dataset (attached). It uses the following feature_spec:

{'SepalLength': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None),
 'SepalWidth': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None),
 'PetalLength': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None),
 'PetalWidth': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None)}

I have created a simple C# console application to try to spin it up. The input should be an "input_example_tensor" and the output is located at "dnn/head/predictions/probabilities". I discovered this after alex_zu provided help using the saved_model_cli command here.
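For anyone reproducing this, the inspection step mentioned above can be run against the exported model like so (the model path is a placeholder; the tag set must match the tag used at export time):

```shell
# List the signatures of the exported estimator, including the
# input_example_tensor input and the probabilities output.
saved_model_cli show --dir path/to/model/pb --tag_set serve --signature_def serving_default
```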

As far as I am aware, all TensorFlow estimator APIs work like this.

Here comes the problem: the input_example_tensor should be in a string format that will be parsed internally by the ParseExample function. Now I am stuck. I have found TFTensor.CreateString, but it doesn't solve the problem.

using System;
using TensorFlow;

namespace repository
{
    class Program
    {
        static void Main(string[] args)
        {
            using (TFGraph tfGraph = new TFGraph()){
                using (var tmpSess = new TFSession(tfGraph)){
                    using (var tfSessionOptions = new TFSessionOptions()){
                        using (var metaGraphUnused = new TFBuffer()){

                            //generating a new session based on the pb folder location with the tag serve
                            TFSession tfSession = tmpSess.FromSavedModel(
                                tfSessionOptions,
                                null,
                                @"path/to/model/pb", 
                                new[] { "serve" }, 
                                tfGraph, 
                                metaGraphUnused
                            );

                            //generating a new runner, which will fetch the tensorflow results later
                            var runner = tfSession.GetRunner();

                            //this is in the actual tensorflow documentation, how to implement this???
                            string fromTensorflowPythonExample = "{'SepalLength': [5.1, 5.9, 6.9],'SepalWidth': [3.3, 3.0, 3.1],'PetalLength': [1.7, 4.2, 5.4],'PetalWidth': [0.5, 1.5, 2.1],}";

                            //this is the problem, it's not working...
                            TFTensor rawInput = new TFTensor(new float[4]{5.1f,3.3f,1.7f,0.5f});
                            byte[] serializedTensor = System.Text.Encoding.ASCII.GetBytes(rawInput.ToString());
                            TFTensor inputTensor = TensorFlow.TFTensor.CreateString (serializedTensor);

                            runner.AddInput(tfGraph["input_example_tensor"][0], inputTensor);
                            runner.Fetch("dnn/head/predictions/probabilities", 0);

                            //start the run and get the results of the iris example
                            var output = runner.Run();
                            TFTensor result = output[0];

                            //printing response to the client
                            Console.WriteLine(result.ToString());
                            Console.ReadLine();
                        } 
                    }
                }
            }
        }
    }
}

This example will give the following error:

An unhandled exception of type 'TensorFlow.TFException' occurred in TensorFlowSharp.dll: 'Expected serialized to be a vector, got shape: []
     [[Node: ParseExample/ParseExample = ParseExample[Ndense=4, Nsparse=0, Tdense=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], dense_shapes=[[1], [1], [1], [1]], sparse_types=[], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/ParseExample/dense_keys_1, ParseExample/ParseExample/dense_keys_2, ParseExample/ParseExample/dense_keys_3, ParseExample/Const, ParseExample/Const, ParseExample/Const, ParseExample/Const)]]'
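The error already points at both halves of the problem. First, TFTensor.CreateString yields a scalar (shape []) string tensor, while the graph's ParseExample op insists on a vector of strings, one element per record. Second, each string element must be a serialized tf.Example protocol buffer, not the result of rawInput.ToString(). A minimal sketch of producing one such serialized proto, assuming the Google.Protobuf NuGet package plus C# classes generated with protoc from tensorflow/core/example/example.proto (the Tensorflow namespace and the IrisExample helper are assumptions for illustration, not part of TensorFlowSharp):

```csharp
using Google.Protobuf;   // NuGet package; provides ToByteArray()
using Tensorflow;        // assumed namespace of the classes generated with
                         // protoc from tensorflow/core/example/example.proto

static class IrisExample
{
    // Build one tf.Example holding the four iris features, then serialize it.
    // This byte string is what ParseExample parses; remember that the op
    // expects a *vector* of such serialized protos, one element per record.
    public static byte[] Serialize(float sepalLength, float sepalWidth,
                                   float petalLength, float petalWidth)
    {
        Feature Scalar(float v)
        {
            var list = new FloatList();
            list.Value.Add(v);
            return new Feature { FloatList = list };
        }

        var features = new Features();
        features.Feature.Add("SepalLength", Scalar(sepalLength));
        features.Feature.Add("SepalWidth", Scalar(sepalWidth));
        features.Feature.Add("PetalLength", Scalar(petalLength));
        features.Feature.Add("PetalWidth", Scalar(petalWidth));

        return new Example { Features = features }.ToByteArray();
    }
}
```

The feature keys mirror the feature_spec above, and each FloatList carries a single value to match the (1,) shapes declared there.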

How can I serialize tensors in such a way that I can use the .pb file correctly?

Attached are the Python iris example, the .pb file, and the console application: pbfile and python.zip

I also posted this question on Stack Overflow.

In my opinion, solving this would create a neat solution for all TensorFlow users with ancient production environments (like mine).

ibrcic commented 6 years ago

Did you ever resolve this? I currently have the same problem.

ksipma commented 6 years ago

Unfortunately, I did not.

songzy12 commented 5 years ago

How about now? I currently have the same problem too...

azmathmoosa commented 5 years ago

+1 on this. Stuck here. Is there any workaround?

ksipma commented 5 years ago

Yes, there is: stop trying with this package. This project just doesn't work. I finally got around it by using Keras and KerasSharp.

azmathmoosa commented 5 years ago

Thanks @klaas3. I've trained the model in TensorFlow, though. @migueldeicaza Is there a solution to this? The examples shown are only for image classifiers and detectors. It would be good if you could show how to use DNN classifiers with your library.

HenrikasRS commented 5 years ago

Also stuck here; this is the last step to be resolved before my model can run. Any news on this issue?

ForeverPs commented 5 years ago

OMG, a whole year has passed; are there any solutions?

fabiosoto commented 4 years ago

Same problem here. Anyone?

rs22 commented 4 years ago

I posted an answer to this on Stack Overflow; maybe it still helps someone.

https://stackoverflow.com/a/63227808/5831785
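As background for readers who cannot follow the link: feeding input_example_tensor requires a rank-1 TF_STRING tensor, and in the TensorFlow 1.x C API (which TensorFlowSharp wraps) the backing buffer of such a tensor has a specific layout: an 8-byte little-endian offset per element, followed by each element encoded as a varint length plus its raw bytes (the TF_StringEncode format). A pure-.NET sketch of that encoding follows; how to wrap the resulting buffer into a TFTensor of string type and shape [N] depends on the TensorFlowSharp version, so that final step is not shown here:

```csharp
using System.Collections.Generic;
using System.IO;

static class StringTensorEncoding
{
    // Encode the backing buffer of a rank-1 TF_STRING tensor as the
    // TensorFlow 1.x C API lays it out: one uint64 offset per element,
    // then each element as a varint length prefix plus its raw bytes.
    public static byte[] Encode(IReadOnlyList<byte[]> elements)
    {
        var encoded = new List<byte[]>();
        foreach (var e in elements)
        {
            var ms = new MemoryStream();
            ulong len = (ulong)e.Length;
            while (len >= 0x80)              // varint length prefix
            {
                ms.WriteByte((byte)(len | 0x80));
                len >>= 7;
            }
            ms.WriteByte((byte)len);
            ms.Write(e, 0, e.Length);
            encoded.Add(ms.ToArray());
        }

        var buffer = new MemoryStream();
        var writer = new BinaryWriter(buffer); // BinaryWriter is little-endian
        ulong offset = 0;
        foreach (var e in encoded)           // offset table first
        {
            writer.Write(offset);            // uint64 offset of this element
            offset += (ulong)e.Length;
        }
        foreach (var e in encoded)           // then the encoded elements
            writer.Write(e);
        writer.Flush();
        return buffer.ToArray();
    }
}
```

Each element would be one serialized tf.Example, so a single-record batch is Encode(new[] { serializedExample }).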