rstudio / tensorflow.rstudio.com

https://tensorflow.rstudio.com

GPU memory is not enough #73

Open xiaofanfan1991 opened 5 months ago

xiaofanfan1991 commented 5 months ago

I'm trying TensorFlow in R. My GPU has 16 GB of memory and my machine has 32 GB of RAM. My program crashes because the GPU runs out of memory. I've looked for a function that releases GPU memory, but none of them solve the problem. Does anyone know a way to fix this?

t-kalinowski commented 5 months ago

Can you please provide a reproducible example?

Most likely, you're creating a model that's too large to fit on the GPU. Take note of the "Total params: " entry that's shown when printing a model.
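For example, a minimal sketch (the model here is hypothetical, just to show where the entry appears):

```r
library(keras)

# A small example model; the layer sizes are arbitrary.
model <- keras_model_sequential() %>%
  layer_dense(units = 512, activation = "relu", input_shape = 1000) %>%
  layer_dense(units = 10, activation = "softmax")

# summary() reports a "Total params:" line.
# Here: 1000*512 + 512 + 512*10 + 10 = 517,642 parameters.
summary(model)
```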

If you're using the default float32 dtype, each parameter takes 4 bytes, so with a ceiling of 16 GB the most you can hold on the GPU is roughly 4 billion parameters. Training also needs room for the optimizer state and the gradients, so practically speaking the largest model you'll be able to train on a single 16 GB GPU is around 2 billion parameters.
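The arithmetic behind those figures can be checked directly (a rough upper bound; it ignores activations, workspace buffers, and framework overhead):

```r
gpu_bytes       <- 16 * 1024^3   # 16 GB of GPU memory
bytes_per_param <- 4             # float32

# Upper bound on parameters that fit at all (weights only):
max_params_hold <- gpu_bytes / bytes_per_param   # ~4.3 billion

# Training also stores gradients (and optimizer state), so halve it
# as a rough rule of thumb:
max_params_train <- max_params_hold / 2          # ~2.1 billion

cat(max_params_hold, max_params_train, "\n")
```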