zhoubenjia / GFSLT-VLP

MIT License

Dataset Problem. #9

Closed mohiburnabil closed 10 months ago

mohiburnabil commented 10 months ago

I have a 6 GB RTX 3060 (laptop). I have reduced the batch size to 1, but it still gives CUDA out of memory. What is the minimum requirement for the training process?

mohiburnabil commented 10 months ago

(Screenshot attached: training log from 2023-12-07.) The loss is showing 0. What can be the issue? The masked loss is decreasing, though.

zhoubenjia commented 10 months ago

> I have a 6gb rtx 3060(laptop). I have reduced the batch size to 1. Still, it gives Cuda out of memory. what is the minimum req for the training process?

Hi, thank you for your interest in our work. We recommend a batch size of 4 or more: the model is optimized with a contrastive learning loss similar to CLIP's, which relies on the other samples in the batch as negatives, so a batch size of 1 is not suitable. If computing resources are limited, you can instead reduce the spatial resolution by setting the crop size to 128 (e.g., --input_size 128). We hope this helps.

If you have any further questions, feel free to ask!
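(For anyone hitting the same zero-loss symptom: the batch-size requirement follows directly from how a CLIP-style symmetric contrastive loss works. Below is a minimal NumPy sketch, not the repo's actual implementation, of an InfoNCE loss with in-batch negatives. With batch size 1 there are no negatives, the softmax is over a single pair, and the loss is exactly 0, which matches the screenshot above.)

```python
import numpy as np

def clip_contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """CLIP-style symmetric InfoNCE over a batch of paired embeddings."""
    # L2-normalize both sets of embeddings.
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    # (B, B) similarity matrix; matched pairs sit on the diagonal.
    logits = img @ txt.T / temperature

    def xent_diag(l):
        # Cross-entropy with the target class on the diagonal.
        l = l - l.max(axis=1, keepdims=True)           # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_probs))

    # Average the image->text and text->image directions.
    return (xent_diag(logits) + xent_diag(logits.T)) / 2

rng = np.random.default_rng(0)
# Batch size 1: the softmax is over a single (positive) pair -> loss is 0.
loss_b1 = clip_contrastive_loss(rng.normal(size=(1, 16)),
                                rng.normal(size=(1, 16)))
# Batch size 4: the other 3 samples act as negatives -> loss is informative.
loss_b4 = clip_contrastive_loss(rng.normal(size=(4, 16)),
                                rng.normal(size=(4, 16)))
print(loss_b1)  # 0.0 exactly: nothing to contrast against
print(loss_b4)  # positive
```

This is why reducing `--input_size` is the right lever for limited VRAM: it cuts memory per sample without shrinking the pool of in-batch negatives that the loss depends on.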

mohiburnabil commented 10 months ago

Thank you. The problem was solved after setting the batch size to 4.