automl / PFNs

Our maintained PFN repository. Come here to train SOTA PFNs.
Apache License 2.0

Bug in Embedding Encoder? #8

Open tomviering opened 4 months ago

tomviering commented 4 months ago

I think there is a bug in the Embedding Encoder, see here: https://github.com/automl/PFNs/blob/fd212b187a22d5a07959484d03fcd82e060e9b21/pfns/encoders.py#L68

Currently the line reads

`(x - self.min_max[0] // split_size).int().clamp(0, self.num_embs - 1)`

but since `//` binds more tightly than `-`, this computes `x - (self.min_max[0] // split_size)`, i.e. it only shifts `x` by a constant rather than mapping it to a bucket index. I believe it should be

`((x - self.min_max[0]) // split_size).int().clamp(0, self.num_embs - 1)`
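To illustrate the difference, here is a minimal sketch. The names `min_max`, `num_embs`, and `split_size` mirror the encoder; the concrete values are made up for the example and chosen so that `split_size != 1` (with `split_size == 1` the two expressions happen to coincide):

```python
import torch

# Hypothetical setup mirroring the encoder: num_embs buckets over [min, max],
# with split_size = (max - min) / num_embs.
min_max = (-1.0, 1.0)
num_embs = 4
split_size = (min_max[1] - min_max[0]) / num_embs  # 0.5

x = torch.tensor([-1.0, -0.6, 0.0, 0.4, 0.9])

# Current code: `//` binds tighter than `-`, so min_max[0] // split_size
# is evaluated first and x is merely shifted by a constant (-1.0 // 0.5 == -2.0).
buggy = (x - min_max[0] // split_size).int().clamp(0, num_embs - 1)

# Proposed fix: shift x into [0, max - min] first, then floor-divide into buckets.
fixed = ((x - min_max[0]) // split_size).int().clamp(0, num_embs - 1)

print(buggy)  # tensor([1, 1, 2, 2, 2]) -- only two buckets ever used, wrong scale
print(fixed)  # tensor([0, 0, 2, 2, 3]) -- correct bucket indices
```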