thtrieu / darkflow

Translate darknet to tensorflow. Load trained weights, retrain/fine-tune using tensorflow, export constant graph def to mobile devices
GNU General Public License v3.0

Exporting darkflow-YOLO to Tensorflow Serving, but got empty variables #824

Open sugartom opened 6 years ago

sugartom commented 6 years ago

Hi, I was trying to export darkflow-YOLO to TensorFlow Serving, so I can use TF Serving to run those YOLO models.

But when I used the following script to export darkflow-YOLO, the exported model was almost empty. Specifically, saved_model.pb is only 315 bytes, and the variables directory is empty (normally it should contain two files: variables.data-00000-of-00001 and variables.index).

https://gist.github.com/sugartom/70b58505bf5f28d1cf5d05904f6c0af2

I have used similar code (lines 32-58 in the above script) to export other models (Inception, caffe-tensorflow's ResNet/VGG), and everything worked. So I would like to ask whether you have any idea why the same code doesn't work for darkflow.

Thanks in advance! :-)
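For context, a quick sanity check of the export directory catches the empty-variables symptom described above before the model is ever pushed to a server. This is just a minimal sketch using only the standard library; the helper name and the 1000-byte threshold are illustrative assumptions, not part of darkflow or TF Serving:

```python
import os

def looks_like_valid_savedmodel(export_dir):
    """Heuristic check that a SavedModel export is not empty.

    A healthy export contains a non-trivial saved_model.pb and a
    variables/ directory with a variables.index file and at least
    one variables.data-* shard.
    """
    pb_path = os.path.join(export_dir, "saved_model.pb")
    var_dir = os.path.join(export_dir, "variables")
    if not os.path.isfile(pb_path) or os.path.getsize(pb_path) < 1000:
        return False  # e.g. the 315-byte saved_model.pb reported above
    if not os.path.isdir(var_dir):
        return False
    names = os.listdir(var_dir)
    has_index = any(n.endswith(".index") for n in names)
    has_data = any(".data-" in n for n in names)
    return has_index and has_data
```

Running this right after the builder's save() call makes it obvious whether the session actually carried any variables into the export.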

gauravgola96 commented 5 years ago

@sugartom Any solutions? I am getting the same issue.

Can I use the "xyz.index" and "xyz.data-00000-of-00001" files created during YOLO training for TensorFlow Serving?

shivaram93 commented 5 years ago

I am also looking to export variables for serving. How can we get those, and how can we deploy the model with TF Serving?

gauravgola96 commented 5 years ago
def build_model(self):
    with self.sess.graph.as_default():
        # Look up the input and output tensors of the darkflow graph.
        x = self.sess.graph.get_operation_by_name("input").outputs[0]
        pred = self.sess.graph.get_operation_by_name("output").outputs[0]

        # Build a prediction signature mapping the tensors to named I/O.
        prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
            inputs={
                "input": tf.saved_model.utils.build_tensor_info(x)
            },
            outputs={
                "output": tf.saved_model.utils.build_tensor_info(pred)
            },
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
        )

        # Export the graph and variables in the SavedModel format.
        builder = tf.saved_model.builder.SavedModelBuilder('/home/gaurav/serving/darkflow/2')
        builder.add_meta_graph_and_variables(
            self.sess, [tf.saved_model.tag_constants.SERVING],
            signature_def_map={
                "predict": prediction_signature,
            })
        builder.save()

You can add this method to darkflow/net/build.py and it will create the TF Serving files directly.

shivaram93 commented 5 years ago

@gauravgola96 '/home/gaurav/serving/darkflow/2' -> what specific directory do we have to provide? Is it created after training ends?

Also, do we have to deploy with the same TensorFlow and Python versions used for training and export? We trained with tensorflow-gpu 1.9; how about that?

gauravgola96 commented 5 years ago

@shivaram93 '/home/gaurav/serving/darkflow/2' -> this is the path where the TF Serving files will be saved. You have to call this method after training ends to produce the TF Serving file format. I am not sure about versions, but the TF and Python versions should ideally be the same.
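Since TF Serving treats each numeric subdirectory under the model base path as a model version (and serves the highest one), a small helper can pick the next version path automatically instead of hard-coding .../darkflow/2. This is an illustrative sketch only; the helper name is an assumption, not part of darkflow:

```python
import os

def next_version_dir(base_dir):
    """Return the path for the next numeric model version under base_dir.

    TF Serving loads the highest-numbered subdirectory, e.g.
    base_dir/1, base_dir/2, ... so a fresh export should go one higher.
    """
    if os.path.isdir(base_dir):
        versions = [int(n) for n in os.listdir(base_dir) if n.isdigit()]
    else:
        versions = []
    return os.path.join(base_dir, str(max(versions, default=0) + 1))
```

The returned path can then be handed straight to SavedModelBuilder in place of the hard-coded '/home/gaurav/serving/darkflow/2'.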

shivaram93 commented 5 years ago

Okay, let me try and let you know. Can we train locally and deploy the models on Google Cloud for prediction? Does that support prediction and deployment alone?

gauravgola96 commented 5 years ago

@shivaram93 Yes, you can easily train tiny YOLOv2 on an 8 GB RAM i7 system. Once you have the TF Serving files from build_model(), you can deploy and host them with TF Serving on Google Cloud or anywhere else you want.
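For reference, once the versioned export directory exists, one common way to host it is the official TF Serving Docker image. The model name and host path below are illustrative assumptions based on the export path used earlier in this thread:

```shell
# Serve the exported SavedModel over REST on port 8501.
# TF Serving picks the highest-numbered version subdirectory
# under /models/darkflow (e.g. .../darkflow/2).
docker run -p 8501:8501 \
  --mount type=bind,source=/home/gaurav/serving/darkflow,target=/models/darkflow \
  -e MODEL_NAME=darkflow -t tensorflow/serving
```

The same image works on a Google Cloud VM, which covers the deploy-only scenario asked about above.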

anisbhsl commented 4 years ago

@gauravgola96 I am receiving an Invalid GraphDef message while trying to serve the file via TF Serving. I trained a single-class tiny-YOLOv2 model and generated the TF Serving files per your instructions above.

gauravgola96 commented 4 years ago

@anisbhsl Check the input dimensions of the graph; that might be the reason for the invalid graph, though I'm not sure. Also follow https://github.com/thtrieu/darkflow/issues/403

anisbhsl commented 4 years ago

@gauravgola96 Thanks for the reply. Here's the output from saved_model_cli:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 416, 416, 3)
        name: input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 13, 13, 30)
        name: output:0
  Method name is: tensorflow/serving/predict

I have resized the image according to the input dimensions, but this error is still generated while reading the file.
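Given the 'predict' signature shown by saved_model_cli (input shape (-1, 416, 416, 3)), the TF Serving REST request body would look roughly like the sketch below. It needs nothing beyond the standard library; the helper name is an assumption, and the zero-filled batch is only a placeholder image:

```python
import json

def build_predict_request(image_batch):
    """Build the JSON body for POST /v1/models/<model_name>:predict.

    image_batch: nested lists shaped (N, 416, 416, 3) of floats,
    matching the signature's input tensor.
    """
    return json.dumps({
        "signature_name": "predict",   # the key used in signature_def_map
        "instances": image_batch,      # one entry per image
    })

# A 1 x 416 x 416 x 3 batch of zeros as a placeholder:
body = build_predict_request([[[[0.0] * 3] * 416] * 416])
```

A shape mismatch between this payload and the signature is one of the first things worth ruling out when the server rejects a request.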

NayanDharviya commented 3 years ago

Hey @gauravgola96, when I run the script you gave in your post for exporting the YOLO model to TensorFlow Serving format, I get the following error:

AttributeError                            Traceback (most recent call last)
in ()
     10 export_path = "./export/1/"
     11
---> 12 tfnet.build_model(export_path = export_path)
     13
     14 with tfnet.sess.graph.as_default():

AttributeError: 'TFNet' object has no attribute 'build_model'

How may I solve this issue?

gauravgola96 commented 3 years ago

@NayanDharviya: I did this with TF 1.11; can you try it out with my repo?