worldbank / REaLTabFormer

A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and relational synthetic data generation.
https://worldbank.github.io/REaLTabFormer/
MIT License

Maximum number of columns limitation in tabular GPT-2 model? #62

Open efstathios-chatzikyriakidis opened 6 months ago

efstathios-chatzikyriakidis commented 6 months ago

Hi @avsolatorio,

I would like to ask whether there is a limit on the number of columns that can be passed to a tabular model. Is there an upper bound, and will training fail if the dataset has many columns?

To be clear, I am talking about the case of using the classic early-stopping mechanism rather than the critic one, because in the past we have seen that the critic metric with high-dimensional data can lead to large memory consumption and frequent errors.

So, my use case is fitting a tabular model with simple early stopping (no critic/sensitivity metric). Will GPT-2 fail with many columns at its input during training or generation? My dataset has mixed types (text, float, datetime, int, etc.); the text columns are not lengthy, they are just categorical values.
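For context on why column count matters at all: the pre-trained GPT-2 backbone has a fixed position-embedding table of 1024 tokens, so a single encoded row must fit in that window. Below is a rough back-of-envelope sketch (not REaLTabFormer's actual encoder); `tokens_per_column` is an assumed average, since the real token count per value depends on how the library tokenizes each cell.

```python
# GPT-2's position embeddings cap the input sequence at 1024 tokens.
GPT2_MAX_POSITIONS = 1024

def fits_in_context(n_columns, tokens_per_column=3, special_tokens=2):
    """Crude estimate of whether one encoded row fits in GPT-2's window.

    tokens_per_column is a hypothetical average; the actual count
    depends on the model's value encoding and is not computed here.
    """
    estimate = n_columns * tokens_per_column + special_tokens
    return estimate, estimate <= GPT2_MAX_POSITIONS

print(fits_in_context(50))   # a 50-column table fits comfortably
print(fits_in_context(500))  # ~500 columns would overflow the window
```

Under these assumptions, the limit is on total tokens per row rather than on columns directly, which is why wide-but-categorical columns (few tokens each) are much less of a problem than long free-text fields.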

Lastly, is it possible that the tabular model has such a limitation while the relational one does not? If so, maybe I could fit a relational model instead, as described in https://github.com/worldbank/REaLTabFormer/issues/22#issuecomment-1598082977, by providing no parent. Also, in the past (https://github.com/worldbank/REaLTabFormer/issues/11) you mentioned that the relational model has no limitation like the pre-trained GPT-2, but perhaps that applied only to generation?

Thanks!