
[Deeplab] Inconsistent performance between vis.py and frozen model inference #6593


kayvonkhosrowpour commented 5 years ago


Describe the problem

I posted this on Stack Overflow because I am not sure whether this is a bug or I am misusing the code. I have successfully trained a model with DeepLab on a custom dataset of 480x640 images with 4 classes, using an xception_65 encoder, and I get decent results on the validation set whenever I use the vis.py script.

However, I do not get the same results on those images once I freeze the model. I froze it with export_model.py, which successfully produced a frozen_model.pb file, but when I run inference with this .pb file, the output is always 0 (i.e. everything is classified as "background") on the exact same images I provided links to above. Everything is black!

I believe this is an issue with how the model is exported, not necessarily with the model itself, because the results on the same images differ between the vis.py script and my inference code. However, if I am loading the model incorrectly, please let me know. Perhaps I am not loading the graph or initializing the variables correctly, or perhaps I am not saving the weights correctly in the first place. Any help would be greatly appreciated!
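
For what it's worth, here is a quick sanity check along those lines (a minimal sketch, using the _load_graph helper from my inference code below):

# A frozen GraphDef embeds its weights as constants, so no variable
# initializer is needed after import. If both lookups succeed, the
# graph loaded correctly and the problem lies elsewhere.
graph = _load_graph('/path/to/.../frozen_4_11_19.pb')
graph.get_tensor_by_name('ImageTensor:0')          # input endpoint
graph.get_tensor_by_name('SemanticPredictions:0')  # output endpoint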

Source code

Below is my command for exporting the model, using the provided export_model.py script.

python export_model.py \
--logtostderr \
--atrous_rates=6 \
--atrous_rates=12 \
--atrous_rates=18 \
--output_stride=16 \
--checkpoint_path="/path/to/.../model.ckpt-32245" \
--export_path="/path/to/.../frozen_4_11_19.pb" \
--model_variant="xception_65" \
--num_classes=4 \
--crop_size=481 \
--crop_size=641 \
--inference_scales=1.0
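
To rule out a broken export, the .pb file can also be inspected directly (a minimal sketch using the same TF 1.x API; the path is the export path from the command above):

import tensorflow as tf

# Read the frozen GraphDef and check that the expected endpoints exist.
graph_def = tf.GraphDef()
with tf.gfile.GFile('/path/to/.../frozen_4_11_19.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
node_names = {n.name for n in graph_def.node}
print('has ImageTensor:', 'ImageTensor' in node_names)
print('has SemanticPredictions:', 'SemanticPredictions' in node_names)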

Below I provide my code for inference:

from deeplab.utils import get_dataset_colormap
from PIL import Image
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
import os
import glob

# tensorflow arguments
flags = tf.app.flags  # flag object for setup
FLAGS = flags.FLAGS   # object to access initialized flags
flags.DEFINE_string('frozen', None,
                    'The path/to/frozen.pb file.')

def _load_graph(frozen):
    """Loads a frozen GraphDef from disk into a new tf.Graph."""
    print('Loading frozen model into memory from', frozen)
    with tf.gfile.GFile(frozen, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        # name='' keeps the original node names (e.g. 'ImageTensor',
        # 'SemanticPredictions') instead of prefixing them with 'import/'.
        tf.import_graph_def(graph_def, name='')
    return graph

def _run_inferences(sess, image, title):
    # The exported graph takes a batched uint8 image under 'ImageTensor:0'
    # and returns a batch of per-pixel class labels.
    batch_seg_map = sess.run('SemanticPredictions:0',
        feed_dict={'ImageTensor:0': [np.asarray(image)]})
    # Map class labels to colors using my custom dataset colormap.
    semantic_prediction = get_dataset_colormap.label_to_color_image(batch_seg_map[0],
        dataset=get_dataset_colormap.__PRDL3_V1).astype(np.uint8)
    plt.imshow(semantic_prediction)
    plt.axis('off')
    plt.title(title)
    plt.show()

def main(argv):
    # initialize model
    frozen = os.path.normpath(FLAGS.frozen)
    assert os.path.isfile(frozen)
    graph = _load_graph(frozen)

    # open graph resource and begin inference in-loop
    with tf.Session(graph=graph) as sess:
        for img_path in glob.glob('*.png'):
            img = Image.open(img_path).convert('RGB')
            _run_inferences(sess, img, img_path)

if __name__ == '__main__':
    flags.mark_flag_as_required('frozen')
    tf.app.run()  # call the main() function
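
To narrow down whether the raw graph output or the colormapping is at fault, one more check (a sketch; example.png is a hypothetical test image) is to print the labels before converting them to colors:

# Print raw class labels before colormapping; if only 0 appears here,
# the graph output itself is all background and the colormap is not at fault.
with tf.Session(graph=graph) as sess:
    img = Image.open('example.png').convert('RGB')  # hypothetical test image
    seg_map = sess.run('SemanticPredictions:0',
                       feed_dict={'ImageTensor:0': [np.asarray(img)]})[0]
    print('labels present:', np.unique(seg_map))
    print('prediction shape:', seg_map.shape)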
frankmanbb commented 5 years ago

I have the same problem. Have you resolved your issue? Thanks

oerroerr commented 3 years ago

Hi kayvonkhosrowpour, did you solve this problem? I'm running into the same issue.