muzairkhattak / multimodal-prompt-learning

[CVPR 2023] Official repository of paper titled "MaPLe: Multi-modal Prompt Learning".
https://muzairkhattak.github.io/multimodal-prompt-learning/
MIT License

Minimum GPU Memory Requirement for Running the Experiment #30

Closed byerose closed 1 year ago

byerose commented 1 year ago

Hello,

I recently came across your research paper and noticed that you have conducted your experiments using an NVIDIA A100 GPU. I am very interested in following your work and replicating the experiments. However, my current GPU has less than 80GB of memory, and I am concerned about the minimum GPU memory requirements for running your experiment.

Could you please provide information on the minimum GPU memory requirement to successfully execute the experiments in your repository? It would be greatly appreciated if you could share any possible workarounds or suggestions for running the experiments on GPUs with lower memory capacities.

Thank you in advance for your time and assistance. I am looking forward to your response.

Best regards

muzairkhattak commented 1 year ago

Hi @byerose,

Thank you for showing interest in our work.

We used an NVIDIA A100 (40 GB) GPU for our experiments, but the GPU memory consumption for training our models is much lower than that.

The maximum GPU memory consumption across our experiments is about 13-14 GB (when training the model on ImageNet, the largest dataset in our benchmark). Therefore, a GPU with at least 16 GB of memory should be sufficient to train and evaluate the models.
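
For readers checking their own hardware before a run, here is a minimal PyTorch sketch (not part of the MaPLe repository; `step_fn` is a hypothetical placeholder for a training or evaluation call) for querying available VRAM and tracking peak allocation:

```python
import torch

def report_total_vram(device_index: int = 0) -> None:
    """Print the name and total memory of the given CUDA device."""
    if not torch.cuda.is_available():
        print("No CUDA device found.")
        return
    props = torch.cuda.get_device_properties(device_index)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB total")

def run_with_peak_tracking(step_fn) -> float:
    """Run `step_fn` (hypothetical training/eval callable) and return peak GiB allocated."""
    torch.cuda.reset_peak_memory_stats()
    step_fn()
    peak_gib = torch.cuda.max_memory_allocated() / 1024**3
    print(f"Peak GPU memory allocated: {peak_gib:.1f} GiB")
    return peak_gib
```

Comparing the reported peak against the 13-14 GB figure above gives a quick check of whether a given card has enough headroom.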

I hope that resolves your query!

Thank you and kind regards.

byerose commented 1 year ago

Thank you. I have an RTX 5000 with 16 GB of VRAM, so I will go ahead and replicate your experiments. Once again, thank you for your excellent work.