Open tomviering opened 4 months ago
I think there is a bug in the Embedding Encoder, see here: https://github.com/automl/PFNs/blob/fd212b187a22d5a07959484d03fcd82e060e9b21/pfns/encoders.py#L68

Currently it reads:

```python
(x - self.min_max[0] // split_size).int().clamp(0, self.num_embs - 1)
```

but I believe it should be:

```python
((x - self.min_max[0]) // split_size).int().clamp(0, self.num_embs - 1)
```
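For illustration, here is a minimal sketch of why the parentheses matter (the values for `min_max`, `split_size`, and `num_embs` below are made up for the example, not taken from the encoder): Python's `//` binds tighter than `-`, so the current expression computes `x - (min // split_size)`, which only shifts `x` by a constant instead of mapping it to a bin index.

```python
import torch

# Illustrative stand-ins for the encoder's attributes (assumed values).
min_val, max_val = -1.0, 1.0
num_embs = 4
split_size = (max_val - min_val) / num_embs  # bin width of 0.5

x = torch.tensor([-1.0, -0.4, 0.2, 0.9])

# Current expression: // is evaluated before -, so this just adds a constant
# offset (min_val // split_size == -2.0) to x before truncating.
current = (x - min_val // split_size).int().clamp(0, num_embs - 1)

# Proposed fix: subtract the minimum first, then floor-divide by the bin width.
fixed = ((x - min_val) // split_size).int().clamp(0, num_embs - 1)

print(current)  # -> [1, 1, 2, 2]  (indices shifted / bins collapsed)
print(fixed)    # -> [0, 1, 2, 3]  (one index per bin, as intended)
```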