liuxu77 / UniTime

UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting (WWW 2024)
Apache License 2.0

About The Memory #3

Closed: HankLiu10 closed this issue 2 weeks ago

HankLiu10 commented 3 months ago

Hi! Thanks for your excellent work!

I noticed that UniTime was run on an NVIDIA A100. I tested it on a GeForce RTX 3090 and an A800 and found that UniTime needs almost 80 GB of GPU memory. For a model based on GPT-2, I am confused why it has such a high computational demand.
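For context, this is roughly how I measured the peak usage (a minimal sketch assuming a standard PyTorch forward/backward pass; `model` and `batch` here are placeholders, not UniTime's actual interface):

```python
import torch

def peak_gpu_memory_gb(model, batch, device="cuda"):
    """Run one forward/backward pass and report peak allocated GPU memory in GB."""
    model = model.to(device)
    torch.cuda.reset_peak_memory_stats(device)
    out = model(batch.to(device))   # forward pass
    out.mean().backward()           # backward pass allocates gradients/activations
    return torch.cuda.max_memory_allocated(device) / 1024**3
```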

Your response would be greatly appreciated.

liuxu77 commented 2 weeks ago

Hi, the high GPU memory cost mainly comes from the Electricity dataset, which has several hundred variates. You may consider removing this dataset to reduce the cost.
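To give a sense of the scaling: if variates are processed channel-independently (each variate flattened into its own sequence for the backbone, as is common in patch-based forecasters), the effective batch size grows linearly with the variate count. A rough back-of-the-envelope sketch, with illustrative numbers rather than UniTime's exact configuration:

```python
# Illustrative only: how variate count multiplies the effective batch size
# under channel-independent processing (an assumption, not a measurement).
batch_size = 32
electricity_variates = 321  # the Electricity (ECL) benchmark is commonly
                            # reported with 321 series
ett_variates = 7            # e.g., an ETT-style dataset

def effective_sequences(batch, variates):
    # Each variate becomes its own sequence through the backbone, so
    # activation memory scales roughly with batch * variates.
    return batch * variates

print(effective_sequences(batch_size, electricity_variates))  # 10272
print(effective_sequences(batch_size, ett_variates))          # 224
```

Dropping Electricity from the dataset mix therefore removes the dominant term in that product.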