koa-fin / sep

Code release for "Learning to Generate Explainable Stock Predictions using Self-Reflective Large Language Models" https://arxiv.org/abs/2402.03659

cuda out of memory #3

Open lgtkuxuan opened 5 months ago

lgtkuxuan commented 5 months ago

I used a GPU with 48GB of VRAM to run the sample code, but I still hit a 'CUDA out of memory' error near the end. Could anyone tell me the specific GPU requirements for this project?
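(Not an official answer from the maintainers, just a workaround sketch: one common way to fit a large base LLM into less VRAM is 4-bit quantization via bitsandbytes when loading the model with Hugging Face `transformers`. The model name below is a placeholder, not the repo's actual checkpoint, and this assumes the sample code loads its model through `from_pretrained`.)

```python
# Hedged sketch: load the base LLM in 4-bit to reduce VRAM usage.
# MODEL_NAME is a placeholder -- substitute whatever model the sample code uses.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_NAME = "your-base-model-here"  # hypothetical placeholder

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # store weights in 4-bit
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for stability
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    quantization_config=bnb_config,
    device_map="auto",  # let accelerate spread layers across available devices
)
```

Lowering the generation batch size (or `max_new_tokens`) in the sample script, if it exposes one, is another thing worth trying before changing hardware.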