tianrun-chen / SAM-Adapter-PyTorch

Adapting Meta AI's Segment Anything to Downstream Tasks with Adapters and Prompts
MIT License

How much GPU memory is typically consumed when evaluating the model with batch size 1? #52

Closed jithf closed 9 months ago

jithf commented 9 months ago

My GPU has 24 GB of memory, and "CUDA out of memory" was raised when I ran `python test.py --config config.yaml --model mode_epoch_bets_path`. The model is the author's pretrained model and the batch size was set to 1.

tianrun-chen commented 9 months ago

Greetings! Since the current application uses over 30 GB of GPU memory even at batchsize=1, we suggest considering a graphics card with greater memory capacity.
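For readers hitting the same OOM: one generic PyTorch mitigation (not specific to this repo, and no substitute for a larger card) is to make sure evaluation runs under `torch.inference_mode()`, so activations are not retained for backpropagation. A minimal sketch with a hypothetical tiny stand-in model (the real SAM-Adapter model is far larger):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the segmentation model; names and sizes are
# illustrative only, not taken from the SAM-Adapter codebase.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 1, kernel_size=3, padding=1),
)
model.eval()  # disable dropout/batchnorm training behavior

x = torch.randn(1, 3, 64, 64)  # batch size 1, as in the question

# inference_mode() skips autograd bookkeeping entirely, so intermediate
# activations are freed immediately instead of being kept for backprop --
# often the largest source of evaluation-time memory use.
with torch.inference_mode():
    out = model(x)

print(tuple(out.shape))  # (1, 1, 64, 64)
```

Further generic options include casting the model to half precision (`model.half()`) or evaluating on smaller input crops, though either may affect segmentation quality.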