openseg-group / OCNet.pytorch

Please check out the openseg.pytorch project for the updated code that achieves SOTA on 6 benchmarks!
MIT License

CVPR #21

Closed · winwinJJiang closed this issue 5 years ago

PkuRainBow commented 6 years ago

@winwinJJiang Sorry, there seems to be no description.

winwinJJiang commented 6 years ago

Hi, I mean that I found some mistakes in your paper and just want to point them out, if you do not mind.


PkuRainBow commented 6 years ago

@winwinJJiang OK, please share the mistakes that you found. I will fix them in a later version.

Thanks for your attention to our work.

winwinJJiang commented 6 years ago

It's mainly about the introduction. Self-attention was not first proposed by "Attention Is All You Need"; it is a more fundamental idea.
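For anyone following this thread, here is a minimal sketch of the scaled dot-product self-attention operation being discussed (the function name, projection matrices, and toy shapes are illustrative only and are not taken from the OCNet code):

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x: (batch, length, dim). Queries, keys, and values are all projected
    # from x itself, which is what makes this "self"- (a.k.a. intra-) attention.
    q = x @ w_q
    k = x @ w_k
    v = x @ w_v
    scale = q.size(-1) ** 0.5
    # (batch, length, length) affinity between every pair of positions
    weights = F.softmax(q @ k.transpose(-2, -1) / scale, dim=-1)
    # each output position is a weighted sum over all input positions
    return weights @ v

# toy usage
x = torch.randn(2, 5, 16)
w_q, w_k, w_v = (torch.randn(16, 16) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # torch.Size([2, 5, 16])
```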


PkuRainBow commented 6 years ago

@winwinJJiang In fact, I find that the term "self-attention" was first used in the NeurIPS 2017 paper.

Please point me to earlier work if you know of any. Besides, I find that some similar earlier work used the term "intra-attention".

winwinJJiang commented 6 years ago

Yes, self-attention and intra-attention are the same thing. Here is a paper that proposed it before the Google paper: https://arxiv.org/pdf/1703.03130.pdf

Please check and let me know whether I am right or wrong.

Thank you!


PkuRainBow commented 6 years ago

@winwinJJiang Thanks for your advice. I will check the issue and fix it later.

winwinJJiang commented 6 years ago

Good luck with your submission!
