Open GalAvineri opened 5 years ago
@GalAvineri I checked the total number of parameters of my model; it has the same number of parameters as yours. Memory cost differs between deep learning frameworks. You can try to reduce the batch size further (e.g. just try 1), or reduce the input image size as in https://github.com/zouchuhang/LayoutNet/issues/5#issuecomment-393949323
Hello! :) I'm trying to implement your network with Keras, and it seems that the network I built has many more parameters than the amount you declared in your paper. You've mentioned you have been able to train the entire network with a batch size of 20 using 12GB. (I've even seen in #5 that you've mentioned you use 10.969GB.) My GPU has 10.57GiB available, but when I try to use a batch size of 15, which by my calculation should fit, the GPU cannot fit the model into its memory. I've even removed the 3D regressor part and it still fails.
So I wanted to ask if you could help me see if I've made any implementation error :) Could you for example provide the total number of parameters of your model? And perhaps even better, the number of parameters per layer? :)
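For reference, in Keras the per-layer and total parameter counts can be read off with `layer.count_params()` and `model.count_params()` (or just `model.summary()`). Here is a minimal sketch using a tiny stand-in model (not LayoutNet, just to show the counting pattern), assuming the `tf.keras` API:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Tiny stand-in model — NOT LayoutNet, only illustrating how counts are reported.
model = keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 3)),
    layers.Conv2D(64, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(10),
])

# Parameters per layer (weights + biases)
for layer in model.layers:
    print(f"{layer.name}: {layer.count_params()}")

# Total parameter count across the whole model
print("total:", model.count_params())
```

Comparing such a per-layer dump against the paper's architecture table is usually the quickest way to spot where an implementation diverges (e.g. a wrong filter count or an extra dense layer).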
Here is the description of my implementation :) I've defined the network as follows:
And the number of parameters per layer is shown here: