OSU-NLP-Group / TableLlama

[NAACL'24] Dataset, code and models for "TableLlama: Towards Open Large Generalist Models for Tables".
https://osu-nlp-group.github.io/TableLlama/
MIT License

fine-tuning time #3

Closed Haruka1307 closed 5 months ago

Haruka1307 commented 7 months ago

May I ask how long fine-tuning took you? I am using only the rel_extraction data and it has already taken over 3 hours... I am also using LoRA.

Haruka1307 commented 7 months ago

I am running it on 8×A800-SXM4-80GB GPUs.

zhangtianshu commented 7 months ago

Relation extraction is a relatively large dataset in our corpus. It takes around 9 hours without LoRA on 8×A100 (80GB) GPUs.
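For context on why the LoRA run above can be much cheaper than full fine-tuning: LoRA trains only small low-rank adapter matrices instead of all model weights. A minimal back-of-envelope sketch, assuming illustrative 7B-class LLaMA-like shapes (hidden size 4096, 32 layers, rank 8, adapters on q_proj/v_proj only) rather than the actual TableLlama configuration:

```python
# Back-of-envelope comparison of trainable parameters with and without LoRA.
# All shapes below are illustrative assumptions (7B-class, LLaMA-like model),
# not measurements from this repository.

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable params LoRA adds to one d_in x d_out weight: A (d_in x r) + B (r x d_out)."""
    return d_in * rank + rank * d_out

hidden = 4096          # assumed hidden size
n_layers = 32          # assumed number of transformer layers
rank = 8               # a commonly used LoRA rank

# Suppose adapters are applied to q_proj and v_proj in each layer
# (a typical default), each of shape hidden x hidden.
trainable = 2 * n_layers * lora_params(hidden, hidden, rank)
full = 7_000_000_000   # rough full fine-tuning parameter count

print(f"LoRA trainable params: {trainable:,}")          # 4,194,304
print(f"Fraction of full model: {trainable / full:.4%}")
```

Under these assumptions, LoRA trains well under 0.1% of the parameters that full fine-tuning updates, which is consistent with LoRA runs finishing faster than the ~9-hour full fine-tuning figure quoted above.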