Hi! Thanks for your excellent work!
I noticed that UniTime is reported to run on an NVIDIA A100. I tested it on a GeForce RTX 3090 and an A800 and found that UniTime needs almost 80 GB of GPU memory. Since the model is based on GPT-2, I am confused about why it has such a high computational demand.
Your response would be greatly appreciated.
Hi, the high GPU memory cost mainly comes from the Electricity dataset, which has several hundred variates. You may consider removing this dataset to reduce the memory cost.
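To give a sense of the scale involved: the Electricity dataset has roughly 321 variates, compared with 7 for each ETT dataset, so it accounts for most of the per-variate memory in a multi-dataset training mix. Below is a purely illustrative Python sketch; the dataset list and approximate variate counts are standard benchmark figures, not UniTime's actual configuration, and the filtering step only shows the idea of dropping Electricity from the training mix.

```python
# Purely illustrative sketch -- not UniTime's actual configuration.
# Approximate variate counts of common long-horizon forecasting benchmarks.
VARIATES = {
    "ETTh1": 7, "ETTh2": 7, "ETTm1": 7, "ETTm2": 7,
    "Weather": 21, "Exchange": 8, "ILI": 7, "Electricity": 321,
}

total = sum(VARIATES.values())
print(f"Electricity share of all variates: {VARIATES['Electricity'] / total:.0%}")

# Dropping Electricity from the (hypothetical) training mix removes most
# of the per-variate memory pressure.
datasets = [name for name in VARIATES if name != "Electricity"]
print("Remaining datasets:", datasets)
```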