guotong1988 closed this issue 4 years ago
We do not need to execute the SQL during inference.
Instead, we could cache the table content in memory first.
Also, we could cache only the more important table content, not all of the content data.
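A minimal sketch of the caching idea, assuming a SQLite database; the `db_path` argument, the `important_columns` mapping, and the table/column names are all hypothetical, not from this repo:

```python
import sqlite3

def build_content_cache(db_path, important_columns):
    """Load selected table content into memory once, before inference.

    important_columns: {table_name: [column, ...]} -- only these
    columns are cached, not all of the table data, so inference
    reads from the in-memory cache instead of executing SQL.
    (Hypothetical helper, for illustration only.)
    """
    cache = {}
    conn = sqlite3.connect(db_path)
    for table, cols in important_columns.items():
        rows = conn.execute(
            f"SELECT {', '.join(cols)} FROM {table}"
        ).fetchall()
        cache[table] = rows
    conn.close()
    return cache
```

At inference time the model would then look up `cache[table]` rather than touching the database, which is exactly why caching still counts as using content during inference.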
If you use table content during training, please add a ^ to the first column.
Caching content during inference is the same as using content during inference.
OK. I will.
From my understanding of this paper, the two extra feature vectors it adds need access to the table content, both during training and at inference. Is that right, @guotong1988? If so, this may not be the right leaderboard for the results, since the conditions say:
> Moreover, you acknowledge that your models only use the table schema and question during inference. That is they do not use the table content.
I'm not affiliated with this repo, btw, so I could be all wrong. I'm personally interested in methods that use table content at inference time, and would be interested in a leaderboard for such methods.