ArunMichaelDsouza / tensorflow-image-detection

A generic image detection program that uses Google's machine learning library TensorFlow and a pre-trained deep learning convolutional neural network model called Inception.
MIT License

ValueError: GraphDef cannot be larger than 2GB. #2

Closed lunasdejavu closed 6 years ago

lunasdejavu commented 7 years ago

I trained with 10509 images across 2 classes.

The accuracy was a good 81.6%, but when I tried to rewrite classify.py I encountered this error: `ValueError: GraphDef cannot be larger than 2GB.`

```python
#os.environ['TF_CPP_MIN_LOG_LEVEL']='2'
a = 0

for impath in glob.glob('/home/lunasdejavu/Downloads/image-classification-tensorflow-master/validation/*.jpg'):
    image_path = impath
```

then

```python
if predictions[0][0] > predictions[0][1]:
    a += 1
print(a / 577)
```

I already checked this Stack Overflow question https://stackoverflow.com/questions/36349049/overcome-graphdef-cannot-be-larger-than-2gb-in-tensorflow but I still don't know how to modify my code :(

ArunMichaelDsouza commented 7 years ago

@lunasdejavu what exactly did you modify?

lunasdejavu commented 7 years ago

```python
import tensorflow as tf
import os
import glob

# Disable tensorflow compilation warnings
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

a = 0

for impath in glob.glob('/home/lunasdejavu/Downloads/image-classification-tensorflow-master/validation/*.jpg'):
    image_path = impath

    # Read the image_data
    image_data = tf.gfile.FastGFile(image_path, 'rb').read()

    # Loads label file, strips off carriage return
    label_lines = [line.rstrip() for line
                   in tf.gfile.GFile("logs/trained_labels.txt")]

    # Unpersists graph from file
    with tf.gfile.FastGFile("logs/trained_graph.pb", 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    _ = tf.import_graph_def(graph_def, name='')

    with tf.Session() as sess:
        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')
        predictions = sess.run(softmax_tensor,
                               {'DecodeJpeg/contents:0': image_data})

        # Sort to show labels of first prediction in order of confidence
        top_k = predictions[0].argsort()[-len(predictions[0]):][::-1]

    if predictions[0][0] > predictions[0][1]:
        a += 1

print('GG')
print(a / 577)

# for node_id in top_k:
#     human_string = label_lines[node_id]
#     score = predictions[0][node_id]
#     print('%s (score = %.5f)' % (human_string, score))
```

That is the classify.py. The training dataset has 5677 images for class A and 5768 for class B, with 577 validation images for each class.

ghost commented 7 years ago

Loading the model every time inside the loop takes time. How do we load the model once and then test many times in classify.py?
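
For reference, a minimal sketch of that restructuring (assuming the same `logs/trained_graph.pb` and `logs/trained_labels.txt` paths and the TensorFlow 1.x API used above): import the graph and create the session once, then feed each image inside the loop. Because `tf.import_graph_def` is no longer called on every iteration, the default graph stops growing and the `GraphDef cannot be larger than 2GB` error goes away.

```python
import os
import glob
import tensorflow as tf

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

# Load the label file once
label_lines = [line.rstrip() for line in tf.gfile.GFile("logs/trained_labels.txt")]

# Import the graph once, outside the loop, so the default graph never grows past 2GB
with tf.gfile.FastGFile("logs/trained_graph.pb", 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

a = 0
image_paths = glob.glob('/home/lunasdejavu/Downloads/image-classification-tensorflow-master/validation/*.jpg')

# Reuse one session for every image
with tf.Session() as sess:
    softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')
    for image_path in image_paths:
        image_data = tf.gfile.FastGFile(image_path, 'rb').read()
        predictions = sess.run(softmax_tensor,
                               {'DecodeJpeg/contents:0': image_data})
        if predictions[0][0] > predictions[0][1]:
            a += 1

print(a / len(image_paths))
```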

shahnawaj5488 commented 7 years ago

Arun, what is the best way to deploy this retrained model? I could not find anything worthwhile on the net.

shahnawaj5488 commented 7 years ago

I want to consume the model and utilise it as a service.
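
One common way to do that (not something this repo ships) is to wrap the trained graph in a small HTTP service so the model is loaded once at startup and each request classifies one uploaded image. A rough sketch using Flask, assuming the same `logs/trained_graph.pb` and `logs/trained_labels.txt` files; the `/classify` endpoint and the `image` form field name are just illustrative choices:

```python
import flask
import tensorflow as tf

app = flask.Flask(__name__)

# Load labels and graph once at startup
label_lines = [line.rstrip() for line in tf.gfile.GFile("logs/trained_labels.txt")]

with tf.gfile.FastGFile("logs/trained_graph.pb", 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

sess = tf.Session()
softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')


@app.route('/classify', methods=['POST'])
def classify():
    # Expect a JPEG uploaded as the form field "image"
    image_data = flask.request.files['image'].read()
    predictions = sess.run(softmax_tensor,
                           {'DecodeJpeg/contents:0': image_data})
    # Return every label with its score, highest confidence first
    top_k = predictions[0].argsort()[::-1]
    return flask.jsonify({label_lines[i]: float(predictions[0][i]) for i in top_k})


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

It could then be tried with something like `curl -F image=@test.jpg http://localhost:5000/classify`.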

ArunMichaelDsouza commented 6 years ago

You want to build a service to classify a particular set of images?

ArunMichaelDsouza commented 6 years ago

Closing due to inactivity.