Problem: gpu_options doesn't work. I used the gpu_options settings in AdaNet like this:
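The actual snippet appears to have been lost from the post. As an assumption (not the author's exact code), a typical TF 1.x configuration for limiting GPU memory growth in an Estimator-based framework such as AdaNet would look roughly like this; `session_config`, `run_config`, and the `adanet.Estimator` call shown are illustrative names:

```python
import tensorflow as tf

# Assumption: the lost snippet set allow_growth so TensorFlow grabs
# GPU memory on demand instead of pre-allocating all of it.
session_config = tf.ConfigProto(
    gpu_options=tf.GPUOptions(allow_growth=True)
)

# For Estimator-based code such as AdaNet, the session config must be
# passed through a RunConfig, not just a raw tf.Session:
run_config = tf.estimator.RunConfig(session_config=session_config)

# estimator = adanet.Estimator(..., config=run_config)  # hypothetical call
```

If the config is only passed to a manually created `tf.Session` but AdaNet builds its own sessions internally via the Estimator, the setting would have no effect, which is one plausible reason the option seems not to work.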
But it still occupies the whole GPU memory when I run AdaNet. For example:
(1) If only 4G of memory is left, it occupies all of the remaining memory.
(2) If no other process is using the GPU (i.e., all 10G of memory is free), it occupies the whole 10G when I run AdaNet.
So I don't think it actually needs 10G to run, but it takes that much anyway.

I expect it to take only the memory it needs instead of all of it, so that the 10G of GPU memory can be used more efficiently. Right now I have no idea how to fix this issue. Could anybody please give me some suggestions? Thanks a lot.
Other system information: