KimMeen / Time-LLM

[ICLR 2024] Official implementation of " 🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
https://arxiv.org/abs/2310.01728
Apache License 2.0

Question about batch size #97

Closed liuwj2000 closed 3 weeks ago

liuwj2000 commented 1 month ago

Hi, may I ask whether the batch_size you set was chosen to fully utilize a single A100? I only have a 32 GB GPU, so should my batch_size be much smaller than yours?

kwuking commented 1 month ago

If your card has 32 GB of memory, I'd recommend setting a smaller batch size, or switching the base model to GPT-2 or BERT. You can try both options and see which one works best for you.
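
For reference, a minimal sketch of what these two adjustments might look like on the command line, assuming the flag names used in the repo's example scripts (`--batch_size`, `--llm_model`, `--llm_dim`); other required arguments (data paths, sequence lengths, etc.) are omitted and the exact values are illustrative only:

```sh
# Option 1: keep the default base model but lower the batch size to fit 32 GB.
python run_main.py --model TimeLLM --data ETTh1 --batch_size 8

# Option 2: switch the base model to GPT-2 (or BERT), whose hidden size is 768
# rather than LLaMA-7B's 4096, which further reduces memory usage.
python run_main.py --model TimeLLM --data ETTh1 --batch_size 16 \
  --llm_model GPT2 --llm_dim 768
```

Note that when changing the base model, `--llm_dim` should match the hidden size of the chosen backbone (768 for GPT-2 and BERT-base, 4096 for LLaMA-7B).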