Closed Haiyan-Chris-Wang closed 5 years ago
Hi,
I think we mentioned in the paper that skip connections are used. Roughly, they help combine features from different scales.
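To make the skip-connection idea concrete, here is a minimal sketch of what the `tf.concat` at `dgcnn.py#L79` is doing conceptually: the per-point feature maps produced by the earlier EdgeConv layers are concatenated along the channel axis before the final aggregation. NumPy's `concatenate` stands in for `tf.concat`; the layer names and channel sizes are illustrative, not taken from the repo.

```python
import numpy as np

num_points = 4
net1 = np.ones((num_points, 64))    # features from an early layer (illustrative)
net2 = np.ones((num_points, 64))    # features from a middle layer
net3 = np.ones((num_points, 128))   # features from a later layer

# Join the multi-scale features along the channel (last) axis,
# as tf.concat([net1, net2, net3], axis=-1) would in the model.
combined = np.concatenate([net1, net2, net3], axis=-1)
print(combined.shape)  # (4, 256)
```

The combined tensor gives the subsequent layer access to features from every scale at once, rather than only the deepest one.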
Best, Yue
@WangYueFt Thanks for your reply! Just one more question: I see that you compare the result with a baseline in the paper. As you mentioned, the baseline uses a fixed kNN graph rather than a dynamic graph. Could you help me understand the difference between a fixed kNN graph and a dynamic kNN graph?
@WangYueFt @syb7573330 From my understanding, the fixed kNN graph means that you only calculate the adjacency matrix once, while for the dynamic graph the adjacency matrix is recomputed for each layer. Is that right?
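For anyone else landing here, a small sketch of the distinction as I understand it (a hypothetical NumPy illustration, not the repo's implementation): a fixed graph computes the k nearest neighbors once from the input coordinates and reuses those neighbor indices everywhere, while a dynamic graph recomputes them after each layer's feature transform, so neighborhoods can change from layer to layer.

```python
import numpy as np

def knn_indices(features, k):
    """Indices of each point's k nearest neighbors (excluding itself),
    by Euclidean distance in the given feature space."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbor
    return np.argsort(d, axis=1)[:, :k]

rng = np.random.default_rng(0)
points = rng.normal(size=(10, 3))

# Fixed graph: kNN computed once from the input points and reused.
fixed_graph = knn_indices(points, k=3)

# Dynamic graph: recompute kNN in the current feature space after each
# (stand-in) layer transform, so the graph may differ per layer.
features = points
for _ in range(2):
    dynamic_graph = knn_indices(features, k=3)
    features = np.tanh(features @ rng.normal(size=(3, 3)))  # stand-in layer

print(fixed_graph.shape)  # (10, 3): 3 neighbor indices per point
```

The stand-in layer here is just a random projection; in DGCNN the recomputation would happen in the learned EdgeConv feature space.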
@WangYueFt @syb7573330 Could you help me understand why we need to perform tf.concat here to combine the previous net? It doesn't seem to be mentioned in the paper.
Also, does anyone understand this line? Please help.
https://github.com/WangYueFt/dgcnn/blob/29948ad95d2e8843de542fae910a7f495f549160/models/dgcnn.py#L79