OSU-NLP-Group / TableLlama

[NAACL'24] Dataset, code and models for "TableLlama: Towards Open Large Generalist Models for Tables".
https://osu-nlp-group.github.io/TableLlama/
MIT License

Inquiry on Inference Times and Hardware Specifications for Evaluations #4

Closed hhlgithub closed 6 months ago

hhlgithub commented 7 months ago

Could you share the observed inference times for various table-related tasks from your evaluations, and detail the hardware specifications used?

zhangtianshu commented 7 months ago

Our inference script takes from about 1 hour to over 10 hours depending on the task. The inference time depends on the size of the evaluation set and on the input and output lengths. Currently we don't use batch inference, but you can try vLLM to speed it up.