jdjin3000 / PRG-MoE


Inquiry about GPU Configuration for RTX 3090 Compatibility #6

Open LilinNK opened 7 months ago

LilinNK commented 7 months ago

I'm currently attempting to run your deep learning project on a system equipped with an RTX 3090 GPU. However, I consistently encounter out-of-memory (OOM) errors, even with batch_size = 1. Could you please share the GPU configuration the project was developed and tested on? Any recommendations or adjustments for running the project smoothly on an RTX 3090 would also be greatly appreciated.
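In case it helps while waiting for the authors' configuration details, one common way to reduce GPU memory pressure in a PyTorch training loop is automatic mixed precision. This is only a minimal sketch under the assumption that the project trains with a standard PyTorch loop; the model, optimizer, and data below are toy placeholders, not code from this repository:

```python
import torch
import torch.nn as nn

# Toy model and data stand in for the actual PRG-MoE model and dataloader.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Linear(768, 2).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device.type == "cuda"))

for step in range(10):
    inputs = torch.randn(8, 768, device=device)
    targets = torch.randint(0, 2, (8,), device=device)
    optimizer.zero_grad()
    # Run the forward pass in float16 where it is numerically safe.
    with torch.cuda.amp.autocast(enabled=(device.type == "cuda")):
        loss = nn.functional.cross_entropy(model(inputs), targets)
    scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
    scaler.step(optimizer)
    scaler.update()
```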

xy-xiaotudou commented 2 months ago

I'm seeing the same issue, and I've also noticed that GPU memory usage gradually increases over the course of training.
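For what it's worth, steadily growing GPU memory in a PyTorch training loop often comes from accumulating loss tensors that still hold their computation graphs. I'm only guessing that this is the cause here; the sketch below (with placeholder names, not code from this repository) shows the general pattern and the usual fix of detaching with `.item()`:

```python
import torch
import torch.nn as nn

# Placeholder model and data, standing in for the actual training code.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Linear(768, 2).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

running_loss = 0.0
for step in range(10):
    inputs = torch.randn(8, 768, device=device)
    targets = torch.randint(0, 2, (8,), device=device)
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    optimizer.step()

    # Leak pattern: `running_loss += loss` would keep every step's computation
    # graph alive, so GPU memory grows each iteration.
    # Fix: convert to a Python float so the graph can be freed.
    running_loss += loss.item()
```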