storyandwine / LAGCN

Code and Datasets for "Predicting Drug-Disease Associations through Layer Attention Graph Convolutional Networks"

Question about the dropout_sparse function in utils.py #16

Closed · porvinci closed this issue 2 years ago

porvinci commented 2 years ago

Starting at line 18 of utils.py:

def dropout_sparse(x, keep_prob, num_nonzero_elems):
    # Dropout for sparse tensors: one mask entry per stored nonzero value.
    noise_shape = [num_nonzero_elems]
    random_tensor = keep_prob
    random_tensor += tf.random_uniform(noise_shape)
    dropout_mask = tf.cast(tf.floor(random_tensor), dtype=tf.bool)
    # Keep only the values whose mask entry is True, then rescale the
    # survivors by 1/keep_prob.
    pre_out = tf.sparse_retain(x, dropout_mask)
    return pre_out * (1. / keep_prob)

The function seems to be meant to keep each element with probability keep_prob (scaling it by 1/keep_prob) and to set it to 0 with probability 1 - keep_prob. But the code just randomly sets some values to 0; it doesn't seem to express "set to 0 with probability 1 - keep_prob" anywhere?

storyandwine commented 2 years ago

See the sparse_retain function: https://www.tensorflow.org/api_docs/python/tf/sparse/retain.
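
For reference, here is a minimal sketch of what tf.sparse_retain does, assuming TensorFlow 1.x (under TF 2.x the same ops live under tf.compat.v1):

import tensorflow as tf  # assumes TensorFlow 1.x

# A 2x3 sparse tensor with three stored nonzero values.
sp = tf.SparseTensor(indices=[[0, 0], [0, 2], [1, 1]],
                     values=[10.0, 20.0, 30.0],
                     dense_shape=[2, 3])

# Retain only the values whose mask entry is True; the dropped
# values simply disappear from the sparse tensor, i.e. become zeros.
mask = tf.constant([True, False, True])
kept = tf.sparse_retain(sp, mask)

with tf.Session() as sess:
    print(sess.run(tf.sparse_tensor_to_dense(kept)))
    # [[10.  0.  0.]
    #  [ 0. 30.  0.]]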

porvinci commented 2 years ago

tf.random_uniform generates uniformly distributed values in [0, 1), but how does that realize setting values to 0 with probability 1 - keep_prob? I do understand what sparse_retain does.

storyandwine commented 2 years ago

An entry that is missing from a sparse matrix is a zero.

porvinci commented 2 years ago

Say keep_prob = 0.4: how does the combination of tf.random_uniform and tf.floor realize the drop probability of 0.6?

storyandwine commented 2 years ago

With keep_prob = 0.4, random_tensor lies in [0.4, 1.4), so tf.floor(random_tensor) is 1 with probability 0.4 and 0 with probability 0.6. About 40% of the mask entries are therefore True and the rest False, so only about 40% of the sparse values are retained and the others are dropped. Maybe you can debug it carefully instead of simply throwing out these simple questions. Just run the code line by line and you will definitely get a deeper understanding. Go for it!
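
To see this numerically, here is a minimal sketch, again assuming TensorFlow 1.x, that measures the fraction of True entries in the mask for keep_prob = 0.4:

import tensorflow as tf  # assumes TensorFlow 1.x

keep_prob = 0.4
# keep_prob + U[0, 1) lies in [0.4, 1.4); tf.floor maps [0.4, 1.0)
# to 0 and [1.0, 1.4) to 1, i.e. a Bernoulli(keep_prob) mask.
random_tensor = keep_prob + tf.random_uniform([100000])
mask = tf.cast(tf.floor(random_tensor), dtype=tf.bool)

with tf.Session() as sess:
    frac = sess.run(tf.reduce_mean(tf.cast(mask, tf.float32)))
    print(frac)  # ~0.4 kept, so ~60% of the values get dropped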

porvinci commented 2 years ago

Thanks, now I understand. I actually did run the whole file line by line; it's just that this part prints as a placeholder, so I couldn't see the concrete values, which is why I never quite got it. Sorry for the trouble.

storyandwine commented 2 years ago

Sorry for misunderstanding you. Printing in TF1 is a little different from plain Python: print only shows the symbolic placeholder, so you need to create a tf.Session and evaluate the tensor with session.run. Best wishes!
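
For example, a minimal sketch (assuming TensorFlow 1.x, with the dropout_sparse function quoted above importable from utils.py) of evaluating the result in a session rather than printing the placeholder:

import tensorflow as tf           # assumes TensorFlow 1.x
from utils import dropout_sparse  # the function quoted above

sp = tf.SparseTensor(indices=[[0, 0], [1, 1], [2, 2]],
                     values=[1.0, 2.0, 3.0],
                     dense_shape=[3, 3])
out = dropout_sparse(sp, keep_prob=0.5, num_nonzero_elems=3)

# print(out) only shows the symbolic SparseTensor; session.run
# evaluates it so the concrete values become visible.
with tf.Session() as sess:
    print(sess.run(tf.sparse_tensor_to_dense(out)))
    # e.g. [[2. 0. 0.]
    #       [0. 0. 0.]
    #       [0. 0. 6.]]  (survivors scaled by 1/keep_prob)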