open-mmlab / mmocr

OpenMMLab Text Detection, Recognition and Understanding Toolbox
https://mmocr.readthedocs.io/en/dev-1.x/
Apache License 2.0

Edge Embedding Not updated by Every GNN Layer #1122

Open amitbcp opened 2 years ago

amitbcp commented 2 years ago

https://github.com/open-mmlab/mmocr/blob/64fb6fffc06f168355c8e3d28f5b31f26536f485/mmocr/models/kie/heads/sdmgr_head.py#L82

The GNN layers return both the node embedding and the edge embedding, but the returned edge embedding (named `cat_nodes`) is discarded, so the updated edge embedding is not passed as input to the next GNN layer.

Is there a reason for not updating the edge embedding at every GNN layer?
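
The issue can be sketched with a toy update loop. This is a hypothetical, simplified stand-in (the function names and arithmetic are mine, not the actual `SDMGRHead` code); it only illustrates how dropping the returned edge embedding freezes the edges across layers:

```python
def gnn_layer(nodes, edges):
    # Stand-in for one GNN layer: returns updated node and edge embeddings.
    new_nodes = [n + sum(edges) for n in nodes]
    new_edges = [e + sum(nodes) for e in edges]
    return new_nodes, new_edges

def forward_buggy(nodes, edges, num_layers=2):
    for _ in range(num_layers):
        # Bug pattern: the returned edge embedding ("cat_nodes") is dropped,
        # so every layer keeps seeing the original `edges`.
        nodes, _ = gnn_layer(nodes, edges)
    return nodes, edges

def forward_fixed(nodes, edges, num_layers=2):
    for _ in range(num_layers):
        # Fix: feed the updated edge embedding into the next layer.
        nodes, edges = gnn_layer(nodes, edges)
    return nodes, edges
```

With the fix, both node and edge embeddings evolve layer by layer; in the buggy version the edges never change.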

@cuhk-hbsun

cuhk-hbsun commented 2 years ago

Yes, it is a bug. Would you raise a PR to help us fix it?

amitbcp commented 2 years ago

@cuhk-hbsun sure !

amitbcp commented 2 years ago

@cuhk-hbsun created PR #1134. It is a very small change; I ran training for 2 epochs and it works as expected.

Let me know if anything else is required to merge the PR.

amitbcp commented 2 years ago

@cuhk-hbsun while going over the code, I had another question about the implementation that I would like to clarify:

https://github.com/open-mmlab/mmocr/blob/64fb6fffc06f168355c8e3d28f5b31f26536f485/mmocr/models/kie/heads/sdmgr_head.py#L113

  1. As per my understanding, the residue here is a kind of attention that we are learning?
  2. We later multiply that attention with the embedding at L116?

https://github.com/open-mmlab/mmocr/blob/64fb6fffc06f168355c8e3d28f5b31f26536f485/mmocr/models/kie/heads/sdmgr_head.py#L116
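
The pattern those two points describe, a learned scalar score squashed into an attention-like coefficient and then multiplied with an embedding, might be sketched like this (pure-Python stand-in; `gated_message` and `edge_score` are my own hypothetical names, not identifiers from sdmgr_head.py):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_message(edge_score, neighbor_embedding):
    # Attention-like gating: a learned scalar score is squashed into (0, 1)
    # and multiplied elementwise with the neighbor's embedding before
    # aggregation, so the network can down-weight uninformative edges.
    attn = sigmoid(edge_score)
    return [attn * v for v in neighbor_embedding]
```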

If those two understandings are correct, then my main doubt is at this line:

https://github.com/open-mmlab/mmocr/blob/64fb6fffc06f168355c8e3d28f5b31f26536f485/mmocr/models/kie/heads/sdmgr_head.py#L121

  1. Since we are adding the original node embedding, can this be considered a skip connection for the node embedding?
  2. If it is a skip connection, is there any intuition or reason for it? And why are we not using skip connections for the edges as well?
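
If it is indeed a residual/skip connection, the pattern would be the following minimal sketch (hypothetical names, not the mmocr code): the layer's update is added to the original embedding rather than replacing it, which preserves the input signal and eases gradient flow through stacked layers.

```python
def gnn_block_with_skip(nodes, update):
    # Residual/skip connection on the node embeddings: output = input + update,
    # so identity information survives even if the update is near zero.
    return [n + u for n, u in zip(nodes, update)]
```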

amitbcp commented 2 years ago

@cuhk-hbsun this might have been a missed notification; could you please check the last question and share your views on it?