LibCity / Bigscity-LibCity

LibCity: An Open Library for Urban Spatial-temporal Data Mining
https://libcity.ai/
Apache License 2.0

Should a self-connection matrix be added in this line? #300

Closed · SonghuaHu-UMD closed this issue 2 years ago

SonghuaHu-UMD commented 2 years ago

Thank you for providing this awesome library. It has really helped me a lot in learning about different temporal graph models.

When comparing the paper "T-GCN: A Temporal Graph Convolutional Network for Traffic Prediction" with the TGCN model implemented in this library, I have a question about the calculation of the Laplacian matrix.

The paper does not use the normalized graph Laplacian directly. Instead, it adds self-connections to the adjacency matrix and then applies the symmetric normalization (the first-order approximation in Eq. (2) of the paper), while the TGCN in this library seems to normalize the adjacency matrix directly:

https://github.com/LibCity/Bigscity-LibCity/blob/30c760662603efe8edbb498a27fed801640d09a0/libcity/model/traffic_speed_prediction/TGCN.py#L24
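
If I read Eq. (2) correctly, the paper normalizes the adjacency matrix only after adding the identity (notation follows the paper):

$\hat{A} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}, \qquad \tilde{A} = A + I_N, \qquad \tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$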

Should the code be changed to add a self-connection matrix before normalization?

See the code provided by the original authors:

import numpy as np
import scipy.sparse as sp
import tensorflow as tf

def normalized_adj(adj):
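    # Symmetric normalization: D^{-1/2} A D^{-1/2}, using row sums as degrees.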
    adj = sp.coo_matrix(adj)
    rowsum = np.array(adj.sum(1))
    d_inv_sqrt = np.power(rowsum, -0.5).flatten()
    d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.
    d_mat_inv_sqrt = sp.diags(d_inv_sqrt)
    normalized_adj = adj.dot(d_mat_inv_sqrt).transpose().dot(d_mat_inv_sqrt).tocoo()
    normalized_adj = normalized_adj.astype(np.float32)
    return normalized_adj

def sparse_to_tuple(mx):
    mx = mx.tocoo()
    coords = np.vstack((mx.row, mx.col)).transpose()
    L = tf.SparseTensor(coords, mx.data, mx.shape)
    return tf.sparse_reorder(L) 

def calculate_laplacian(adj, lambda_max=1):  
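    # Self-connections are added here (adj + I) before normalization.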
    adj = normalized_adj(adj + sp.eye(adj.shape[0]))
    adj = sp.csr_matrix(adj)
    adj = adj.astype(np.float32)
    return sparse_to_tuple(adj)

(https://github.com/lehaifeng/T-GCN/blob/master/T-GCN/T-GCN-TensorFlow/utils.py#L25)
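
For what it's worth, here is a minimal dense PyTorch sketch of what I think the normalization with self-connections would look like on the library side (the function and variable names are just illustrative, not the library's actual API):

import torch

def normalized_adj_with_self_loop(adj: torch.Tensor) -> torch.Tensor:
    # A_tilde = A + I_N: add self-connections before normalizing.
    adj = adj + torch.eye(adj.size(0), dtype=adj.dtype, device=adj.device)
    # D_tilde^{-1/2} A_tilde D_tilde^{-1/2}: symmetric normalization.
    row_sum = adj.sum(dim=1)
    d_inv_sqrt = row_sum.pow(-0.5)
    d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0
    d_mat_inv_sqrt = torch.diag(d_inv_sqrt)
    return d_mat_inv_sqrt @ adj @ d_mat_inv_sqrt

As far as I can tell, the only difference from the current behavior is the identity term added before computing the degree vector.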

Thank you!

aptx1231 commented 2 years ago

Thanks for the feedback; there is indeed a problem with the implementation of TGCN here.