wudongming97 / TopoMLP

[ICLR2024] TopoMLP: A Simple yet Strong Pipeline for Driving Topology Reasoning
Apache License 2.0

Why "is_detach" in topo_ll_head and topo_lt_head is True? #21

Closed: night3759 closed this issue 2 months ago

night3759 commented 2 months ago

If "self.is_detach" is True, the MLPs of topo_ll_head and topo_lt_head may not be trained. Do I understand correctly? Why detach?

wudongming97 commented 2 months ago

The topology head is still trained, but its updates do not influence the detection branch when "self.is_detach" is enabled. From my experiments, results are similar with or without "self.is_detach".
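A minimal PyTorch sketch of this behavior (the module names here are hypothetical stand-ins, not the repository's actual classes): detaching the features before the topology head blocks the topology loss's gradients from reaching the upstream detection module, while the topology head itself still receives gradients and trains.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-ins for the detection branch and topology head.
det_layer = nn.Linear(4, 4)   # upstream "detection" module
topo_mlp = nn.Linear(4, 1)    # downstream "topology" head

is_detach = True

x = torch.randn(2, 4)
feats = det_layer(x)

# With is_detach=True, the computation graph is cut here, so the
# topology loss cannot back-propagate into the detection branch.
topo_in = feats.detach() if is_detach else feats

loss = topo_mlp(topo_in).sum()
loss.backward()

# The detection module receives no gradient from the topology loss...
print(det_layer.weight.grad is None)        # True when is_detach=True
# ...but the topology head itself is still trained.
print(topo_mlp.weight.grad is not None)     # True
```

With `is_detach = False`, both `grad` attributes would be populated, and the topology loss would also pull on the detection features.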

Wolfybox commented 2 months ago

> The topology head is still trained, but its updates do not influence the detection branch when "self.is_detach" is enabled. From my experiments, results are similar with or without "self.is_detach".

Does the "self.is_detach" option prevent gradients from back-propagating to the upstream module?

wudongming97 commented 2 months ago

yep.

Wolfybox commented 2 months ago

> yep.

Thanks.