yifan-h / CS-GNN

Measuring and Improving the Use of Graph Information in Graph Neural Networks

some question about Feature smoothness #2

Closed junkangwu closed 4 years ago

junkangwu commented 4 years ago

Nice to meet you! I am interested in your ICLR 2020 paper, "MEASURING AND IMPROVING THE USE OF GRAPH INFORMATION IN GRAPH NEURAL NETWORKS". When I tried to calculate the feature smoothness myself, I got a different value, and I hope you can help me out! To calculate the feature smoothness of the Cora dataset, I use the following code:

import numpy as np

# features: sparse (N, d) feature matrix of Cora; adj: (N, N) 0/1 adjacency matrix
result = np.zeros(features.shape[1])
for i in range(features.shape[0]):
    # sum of feature differences between node i and all of its neighbors
    z = np.zeros((1, features.shape[1]))
    for j in range(features.shape[0]):
        if adj[i, j] == 1 and i != j:
            z += features[i].toarray() - features[j].toarray()
    zz = np.squeeze(z)
    # accumulate the element-wise square over all nodes
    result += zz * zz

# normalize by the feature dimension and the number of edges (5429 for Cora)
result = np.sum(result) / (features.shape[1] * 5429)
print(result)

But the result of my code is very large, so I'm a little confused! Thanks a lot!

yifan-h commented 4 years ago

Hi there, thanks for your attention. For the feature smoothness, the definition is derived from graph signal processing. You can check with PyGSP.
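
Written out, what your loop computes is essentially (with x_v the feature vector of node v, N_v its neighbors, |E| the number of edges, and d the feature dimension):

lambda_f = || \sum_{v in V} ( \sum_{v' in N_v} (x_v - x_{v'}) )^2 ||_1 / (|E| * d)

i.e. for each node you sum the feature differences to its neighbors, square element-wise, sum over all nodes and dimensions, and divide by the number of edges times the feature dimension.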

As for your code, I think it's alright. But note that the feature vectors of Cora are not normalized. In the paper I mentioned that the feature space should be [0,1]^{d_k}, which means you need to normalize the features first:

from sklearn import preprocessing

# scale every feature dimension into [0, 1]
# (if features is a SciPy sparse matrix, densify it first with features.toarray())
min_max_scaler = preprocessing.MinMaxScaler()
features = min_max_scaler.fit_transform(features)

Then I think you'll get the right answer!
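
In case it is useful, here is a quick sketch that puts the two steps together. It is just a vectorized version of your loop, not code from this repo; it assumes `features` is the sparse Cora feature matrix, `adj` is the adjacency matrix without self-loops, and the helper name `feature_smoothness` is only for illustration.

import numpy as np
from scipy import sparse
from sklearn import preprocessing

def feature_smoothness(features, adj, num_edges):
    # features: dense (N, d) array already scaled into [0, 1]
    # adj: (N, N) 0/1 adjacency matrix (dense or sparse), without self-loops
    # num_edges: |E|, e.g. 5429 for Cora
    n, d = features.shape
    deg = np.asarray(adj.sum(axis=1)).reshape(-1)          # node degrees
    # sum_{v' in N(v)} (x_v - x_{v'}) = deg(v) * x_v - (A X)_v
    diff = deg[:, None] * features - np.asarray(adj @ features)
    # element-wise square, sum over all nodes and dimensions, then normalize
    return float(np.sum(diff * diff) / (num_edges * d))

# usage: scale the raw Cora features into [0, 1], then compute smoothness
dense_features = features.toarray() if sparse.issparse(features) else np.asarray(features)
scaled = preprocessing.MinMaxScaler().fit_transform(dense_features)
print(feature_smoothness(scaled, adj, num_edges=5429))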

junkangwu commented 4 years ago

Oh, I understand it now! Thanks so much!! I had missed that before... Thanks for your patience!

yifan-h commented 4 years ago

Great! I'll close the issue.