snap-stanford / GEARS

GEARS is a geometric deep learning model that predicts outcomes of novel multi-gene perturbations
MIT License

MLP module last layer activation is always set to ReLU #42

Open dparkSonata opened 8 months ago

dparkSonata commented 8 months ago

The MLP class takes an argument last_layer_act and assigns it to self.activation in the __init__() method, but self.activation is never referenced after initialization — neither when building the self.network PyTorch Module nor in self.forward().

https://github.com/snap-stanford/GEARS/blob/df09d7ae34e90f5ef25afa389daf7c5c589e710d/gears/model.py#L29
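To make the report concrete, here is a minimal, hypothetical sketch of the pattern being described (simplified names and layer sizes; this is not the actual GEARS source). The string passed as last_layer_act is stored but never consulted, so the trailing ReLU is always applied:

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Hypothetical sketch of the reported pattern, not the GEARS code."""

    def __init__(self, sizes, last_layer_act="linear"):
        super().__init__()
        layers = []
        for i in range(len(sizes) - 1):
            layers.append(nn.Linear(sizes[i], sizes[i + 1]))
            layers.append(nn.ReLU())  # ReLU after every layer, including the last
        self.network = nn.Sequential(*layers)
        self.activation = last_layer_act  # stored, but never referenced again

    def forward(self, x):
        # self.activation is ignored, so the output always passes through ReLU
        return self.network(x)
```

Calling the model with last_layer_act="linear" still yields a non-negative output, which is the symptom described in the title.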

yhr91 commented 8 months ago

Hi, thanks for pointing this out. Yes, you are right that this is an error. Will fix in the next version after running some tests.

For now, it should only affect two layers, so I don't think the change in results will be significant even after the fix.

https://github.com/snap-stanford/GEARS/blob/df09d7ae34e90f5ef25afa389daf7c5c589e710d/gears/model.py#L91

https://github.com/snap-stanford/GEARS/blob/df09d7ae34e90f5ef25afa389daf7c5c589e710d/gears/model.py#L118
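One way the fix could look (a hypothetical sketch, not the actual patch): map the last_layer_act string to a module and append it when building the network, so hidden layers keep ReLU but the output layer honors the requested activation:

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Hypothetical sketch of a possible fix, not the actual GEARS patch."""

    # map the last_layer_act string to a module (assumed option names)
    ACTIVATIONS = {"linear": nn.Identity, "ReLU": nn.ReLU, "sigmoid": nn.Sigmoid}

    def __init__(self, sizes, last_layer_act="linear"):
        super().__init__()
        layers = []
        for i in range(len(sizes) - 1):
            layers.append(nn.Linear(sizes[i], sizes[i + 1]))
            if i < len(sizes) - 2:
                layers.append(nn.ReLU())  # hidden layers keep ReLU
        # honor the requested last-layer activation instead of ignoring it
        layers.append(self.ACTIVATIONS[last_layer_act]())
        self.network = nn.Sequential(*layers)

    def forward(self, x):
        return self.network(x)
```

With this structure, last_layer_act="linear" ends the network in nn.Identity and "ReLU" ends it in nn.ReLU, so the stored argument actually changes the output.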