Closed seonwoo-min closed 2 years ago
Hi, I have a quick question about Graphormer v2.0. What are the main differences between Graphormer v1.0 and v2.0?
Thanks in advance!

@mswzeus

> Are there any model architecture or training hyperparameter updates?

Yes. There are some implementation differences between v1.0 and v2.0.

> Do you have specific reasons to use the fairseq backbone? Is it for mixed-precision training?

We switched to the fairseq backbone for efficiency. It also makes it easier to schedule distributed training jobs, and the codebase is easier to extend with new models and tasks.
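As a rough illustration of the scheduling convenience of a fairseq backbone: fairseq's CLI exposes distributed and mixed-precision training as flags, so no custom launcher code is needed. The dataset path, task, and architecture names below are placeholders, not Graphormer's actual values.

```shell
# Hypothetical invocation; data-bin/my_dataset, my_task, and my_arch
# are placeholder names. --distributed-world-size spreads training
# across 8 GPUs; --fp16 enables mixed-precision training.
fairseq-train data-bin/my_dataset \
    --task my_task \
    --arch my_arch \
    --distributed-world-size 8 \
    --fp16
```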