zhaoqichang / AttentionDTA_BIBM

AttentionDTA: prediction of drug–target binding affinity using attention model. https://ieeexplore.ieee.org/abstract/document/8983125

Hi. I got a problem when I run python DTA_train.py #2

sheisxy opened this issue 3 years ago

sheisxy commented 3 years ago

InvalidArgumentError (see above for traceback): indices[56,57] = 66 is not in [0, 65) [[Node: smi_embedding/embedding_lookup = GatherV2[Taxis=DT_INT32, Tindices=DT_INT32, Tparams=DT_FLOAT, _class=["loc:@train_step/Adam/update_smi_embedding/smi_word_embedding/AssignSub"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](smi_embedding/smi_word_embedding/read, input/IteratorGetNext, smi_embedding/embedding_lookup/axis)]]
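For context, the error says that index 66 was looked up in an embedding table that only has rows 0 through 64 (the valid range [0, 65) comes from SMI_DIM). A minimal sketch that reproduces the same failure mode in TensorFlow 1.x; the values and variable names below are illustrative, not taken from DTA_model.py:

    import tensorflow as tf  # TensorFlow 1.x, as used in this repo

    SMI_DIM = 65        # table has rows 0..64, so valid indices are [0, 65)
    EMBEDDING_DIM = 128

    ids = tf.constant([[1, 5, 66]], dtype=tf.int32)  # 66 falls outside [0, 65)
    table = tf.get_variable("smi_word_embedding", [SMI_DIM, EMBEDDING_DIM])
    emb = tf.nn.embedding_lookup(table, ids)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(emb)  # raises InvalidArgumentError: indices[0,2] = 66 is not in [0, 65)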

zhaoqichang commented 3 years ago

Please show me which line in DTA_train.py causes this issue.

sheisxy commented 3 years ago

    def inference(smi_tensor, pro_tensor, regularizer=None, keep_prob=1, trainlabel=False):
        with tf.variable_scope('smi_embedding', reuse=tf.AUTO_REUSE):
            smi_wordembedding = tf.get_variable(
                "smi_word_embedding", [SMI_DIM, EMBEDDING_DIM])
            smi_embedding = tf.nn.embedding_lookup(smi_wordembedding, smi_tensor)
            pro_wordembedding = tf.get_variable(
                "pro_word_embedding", [PRO_DIM, EMBEDDING_DIM])
            pro_embedding = tf.nn.embedding_lookup(pro_wordembedding, pro_tensor)

This line: smi_embedding = tf.nn.embedding_lookup(smi_wordembedding, smi_tensor)

Do I need to change some parameter settings? Thank you so much!
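Before applying the fix below, a quick way to see how large SMI_DIM needs to be is to check the largest integer id produced by the SMILES encoding. A sketch assuming the encoded SMILES are available as a NumPy array called encoded_smiles and the character-to-id dictionary is called smi_dict (both names are hypothetical, not from the repo):

    import numpy as np

    # Largest id actually present in the encoded SMILES data
    print(int(np.max(encoded_smiles)))   # e.g. 66 here

    # Or, from the encoding dictionary itself
    print(max(smi_dict.values()))        # SMI_DIM must be at least this + 1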

zhaoqichang commented 3 years ago

Please set SMI_DIM = 67 in DTA_model.py.
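In other words, the embedding table must have at least max token id + 1 rows; with the largest SMILES token id being 66, SMI_DIM = 67 makes the valid lookup range [0, 67). A minimal sketch of the change (the dictionary-based line is an optional, hypothetical safeguard, not code from DTA_model.py):

    # DTA_model.py
    SMI_DIM = 67   # index 66 is now inside the valid range [0, 67)

    # A data-driven alternative, assuming a token dictionary `smi_dict`
    # (hypothetical name) mapping SMILES characters to integer ids:
    # SMI_DIM = max(smi_dict.values()) + 1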

sheisxy commented 3 years ago

Thank you so much!