LeoGrin / tabular-benchmark


Bug in MLP #14

Closed · leor-c closed this issue 1 year ago

leor-c commented 1 year ago

Hi! First, I want to thank you for the great paper and repo; your work is highly appreciated! I think I've found a bug in the implementation of the MLP in src/tabular/bin/mlp.py:

In line 46: d_layers.append(d_out)
In line 55: self.head = nn.Linear(d_layers[-1] if d_layers else d_in, d_out)

The outcome is that self.head is always a linear map from dim d_out to dim d_out, regardless of the configured hidden layer sizes. Removing line 46 should fix it.
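To make the effect concrete, here is a minimal dimension-bookkeeping sketch of the reported behavior (my own simplified illustration, not the actual code from src/tabular/bin/mlp.py; the function name mlp_layer_dims is hypothetical):

```python
def mlp_layer_dims(d_in, d_layers, d_out):
    """Trace the (in, out) dimensions the buggy constructor would produce."""
    d_layers = list(d_layers)
    d_layers.append(d_out)  # corresponds to line 46: output dim appended to hidden sizes

    hidden = []
    prev = d_in
    for d in d_layers:
        hidden.append((prev, d))
        prev = d

    # Corresponds to line 55: d_layers[-1] is now always d_out,
    # so the head maps d_out -> d_out no matter what hidden sizes were given.
    head = (d_layers[-1] if d_layers else d_in, d_out)
    return hidden, head

hidden, head = mlp_layer_dims(d_in=10, d_layers=[64, 64], d_out=2)
print(hidden)  # [(10, 64), (64, 64), (64, 2)]
print(head)    # (2, 2)  -- the degenerate d_out -> d_out head
```

With the append removed, the head would instead be (64, 2) here, i.e. a map from the last hidden dimension to the output dimension, which is the intended behavior.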

I hope this helps!

LeoGrin commented 1 year ago

Hi! Thank you very much for pointing it out! I'm late to reply, but I fixed it for the final version of the paper.