helloeve / universal-sentence-encoder-fine-tune


Problem with get_operation_by_name('finetune/init_all_tables') #11

Open germanebr opened 3 years ago

germanebr commented 3 years ago

I've been trying to run the code both on Google Colab and on my local computer, in both cases using TensorFlow 1.15.

When trying to plot the sentences before fine-tuning, I get the following error: `The name 'finetune/init_all_tables' refers to an Operation not in the graph.`

I downloaded the model directly from the hub and imported it in a new folder:

```python
from tensorflow.python.saved_model import tag_constants

scope = 'finetune'

graph = tf.Graph()

with tf.Session(graph=graph) as sess:
    model_path = 'D:/Users/GermanEBR/Glite/ITESM/DCI/Tesis/Databases/USE/universal-sentence-encoder-4'
    tf.saved_model.loader.load(sess, [tag_constants.SERVING], model_path)

    sess.run(tf.global_variables_initializer())
    sess.run(tf.get_default_graph().get_operation_by_name('finetune/init_all_tables'))

    in_tensor = tf.get_default_graph().get_tensor_by_name(scope + '/module/fed_input_values:0')
    ou_tensor = tf.get_default_graph().get_tensor_by_name(scope + '/module/Encoder_en/hidden_layers/l2_normalize:0')

    run_and_plot(sess, in_tensor, X, ou_tensor)
```

I would appreciate it if someone could tell me what the problem is. Thanks in advance.
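(Edit: one thing worth checking is that `tf.saved_model.loader.load` imports the ops under whatever names they have in the SavedModel, not necessarily under a `finetune/` scope, so the op may exist under a different name. A diagnostic sketch, assuming TF 1.15 and the same `model_path` as above:)

```python
import tensorflow as tf
from tensorflow.python.saved_model import tag_constants

graph = tf.Graph()
with tf.Session(graph=graph) as sess:
    # model_path is the local SavedModel directory, as in the snippet above
    tf.saved_model.loader.load(sess, [tag_constants.SERVING], model_path)

    # List every op whose name mentions table initialization. If none of
    # them starts with 'finetune/', the model was loaded without that scope,
    # and get_operation_by_name('finetune/init_all_tables') will fail.
    table_init_ops = [op.name for op in graph.get_operations()
                      if 'init_all_tables' in op.name]
    print(table_init_ops)
```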

helloeve commented 3 years ago

Hi @germanebr, as you can see from the commit history, this code was written for a much older version of TensorFlow + TensorFlow Hub. At that time the parameters inside a TF Hub module were not retrainable, so I came up with this workaround. With the latest version of TensorFlow, I believe you won't need to convert the model at all: you should be able to just load the model with the trainable parameter set to True and then follow the same fine-tuning strategy.
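(In a modern TF 2.x setup, that would look roughly like the following. This is a sketch, not code from this repo; the hub URL is the standard USE v4 address, and the classification head is a made-up placeholder for whatever task you are fine-tuning on.)

```python
import tensorflow as tf
import tensorflow_hub as hub

# Load USE v4 with trainable weights. In TF 2.x no graph surgery or
# scope re-import is needed; trainable=True makes the encoder's
# variables part of the model's trainable variables.
use_layer = hub.KerasLayer(
    "https://tfhub.dev/google/universal-sentence-encoder/4",
    trainable=True)

inputs = tf.keras.layers.Input(shape=[], dtype=tf.string)
embeddings = use_layer(inputs)          # 512-dim sentence embeddings
# Placeholder task head; replace with your own fine-tuning objective.
outputs = tf.keras.layers.Dense(2, activation="softmax")(embeddings)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(sentences, labels, ...) fine-tunes the encoder end to end.
```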