It's easy to see why the exception occurred: the dtype of `X_sent_for_word` is `float32`, while the dtype of `self.sentences` is `int32`, and `tf.concat` requires all of its inputs to have the same dtype. You can solve the problem by casting `self.sentences` to `float32` before the concatenation:

```python
sentences = tf.cast(self.sentences, tf.float32)
sentences = tf.concat([sentences, X_sent_for_word], 2)
```
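For reference, here is a minimal self-contained sketch (toy shapes, unrelated to this repo) showing that `tf.concat` only accepts tensors of a single dtype and that casting the `int32` tensor first resolves it:

```python
import numpy as np
import tensorflow as tf

# Toy shapes for illustration only: batch=2, sentence length=4, feature dim=5.
ids = tf.placeholder(tf.int32, shape=[None, 4, 5])    # int32, like self.sentences
lex = tf.placeholder(tf.float32, shape=[None, 4, 3])  # float32, like X_sent_for_word

# tf.concat([ids, lex], 2)  # would raise the same "[int32, float32] ... don't all match" error

# Cast first, then concatenate along the last axis -> shape (?, 4, 8).
merged = tf.concat([tf.cast(ids, tf.float32), lex], 2)

with tf.Session() as sess:
    out = sess.run(merged, feed_dict={ids: np.zeros((2, 4, 5), np.int32),
                                      lex: np.ones((2, 4, 3), np.float32)})
    print(out.shape)  # (2, 4, 8)
```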
PS: You can use ``` ``` or the insert-code tool in the upper right to make your problem easier to read.
Thank you for your effort. I need to add more features to the word embeddings, such as lexical features (positive, negative, and neutral), where each of these features is represented as a one-hot vector. Could you please help me concatenate such features?
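In terms of shapes, what I am aiming for is roughly the following sketch (the embedding dimension 300 is only an example; 86 is the max sentence length in my data):

```python
import tensorflow as tf

# Word embeddings: [batch, max_sentence_len, embedding_dim], e.g. (?, 86, 300).
word_emb = tf.placeholder(tf.float32, shape=[None, 86, 300])
# Lexical label per word: 0 = neutral, 1 = positive, 2 = negative.
lex_ids = tf.placeholder(tf.int32, shape=[None, 86])

lex_onehot = tf.one_hot(lex_ids, depth=3)             # (?, 86, 3), float32
enriched = tf.concat([word_emb, lex_onehot], axis=2)  # (?, 86, 303)
```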
I have already tried to do that by adding the following code to utils.py:

```python
def load_sentiment_dictionary():
    pos_list = list()
    neg_list = list()
    sent_words_dict = dict()
    fpos = open('pos.txt', 'r', encoding='utf8')
    fneg = open('neg.txt', 'r', encoding='utf8')
    # Positive words are mapped to 0, negative words to 1.
    for line in fpos:
        if not line.strip() in sent_words_dict:
            sent_words_dict[line.strip()] = 0
            pos_list.append(line.strip())
    for line in fneg:
        if not line.strip() in sent_words_dict:
            sent_words_dict[line.strip()] = 1
            neg_list.append(line.strip())
    fneg.close()
    fpos.close()
    return pos_list, neg_list, sent_words_dict

pos_list, neg_list, sent_words_dict = load_sentiment_dictionary()
sentiment_for_word = list()
with open('words.txt', 'r', encoding='utf8') as f:
    for line in f:
        # One lexical id per word in the sentence.
        sentiment_for_word_tmp = list()
        words = line.strip().split(' ')
        for word in words:
            if word in pos_list:
                sentiment_for_word_tmp.append(1)   # positive
            elif word in neg_list:
                sentiment_for_word_tmp.append(2)   # negative
            else:
                sentiment_for_word_tmp.append(0)   # neutral
        sentiment_for_word.append(sentiment_for_word_tmp)
```
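Each per-sentence list still has to be padded (or truncated) to max_sentence_len before it can be fed to the placeholder below; the sketch here is only to show what I mean (the value 86 and padding with 0/neutral are my assumptions):

```python
max_sentence_len = 86  # example value; should match the model's max_sentence_len

sentiment_for_word_padded = [
    (s + [0] * max_sentence_len)[:max_sentence_len]  # pad with 0 (neutral), then truncate
    for s in sentiment_for_word
]
```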
I also added the following code to model.py:

```python
self.tf_X_sent_for_word = tf.placeholder(tf.int32, shape=[None, self.max_sentence_len])
X_sent_for_word = tf.one_hot(self.tf_X_sent_for_word, 3, on_value=20.0, off_value=10.0, axis=-1)
sentences = tf.concat([self.sentences, X_sent_for_word], 2)
sentences = tf.cast(sentences, tf.float32)
```
but with these changes I got the following error:

```
Traceback (most recent call last):
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 455, in _apply_op_helper
    as_ref=input_arg.is_ref)
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\framework\ops.py", line 1209, in internal_convert_n_to_tensor
    ctx=ctx))
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\framework\ops.py", line 1144, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\framework\ops.py", line 981, in _TensorTensorConversionFunction
    (dtype.name, t.dtype.name, str(t)))
ValueError: Tensor conversion requested dtype int32 for Tensor with dtype float32: 'Tensor("inputs/one_hot:0", shape=(?, 86, 3), dtype=float32)'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main_with_lex.py", line 55, in <module>
    tf.app.run()
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\platform\app.py", line 125, in run
    _sys.exit(main(argv))
  File "main_with_lex.py", line 50, in main
    model.build_model()
  File "C:\Users\Saja\Desktop\RAM-master - Copy\model_with_lex.py", line 44, in build_model
    sentences = tf.concat([self.sentences, X_sent_for_word], 2)
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\ops\array_ops.py", line 1124, in concat
    return gen_array_ops.concat_v2(values=values, axis=axis, name=name)
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\ops\gen_array_ops.py", line 1201, in concat_v2
    "ConcatV2", values=values, axis=axis, name=name)
  File "M:\Anaconda\envs\py3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 483, in _apply_op_helper
    raise TypeError("%s that don't all match." % prefix)
TypeError: Tensors in list passed to 'values' of 'ConcatV2' Op have types [int32, float32] that don't all match.
```
Could you please help me implement this idea or resolve this error? Thank you.