
GPT4RoI: Instruction Tuning Large Language Model on Region-of-Interest

GPU memory #13

Closed. Leonhard-Euler-ai closed this issue 1 year ago.

Leonhard-Euler-ai commented 1 year ago

How much GPU memory is required for inference?

jshilong commented 1 year ago

I can run it on a 4090, so it should need around 20 GB.
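For reference, here is a minimal sketch of how peak GPU memory could be checked during a run, assuming a PyTorch-based setup; `model` and `inputs` are hypothetical placeholders for the loaded GPT4RoI model and a prepared batch from the project's own pipeline, not names from this repository:

```python
import torch

def measure_peak_gpu_memory(model, inputs):
    """Run one forward pass and report peak GPU memory in GiB.

    `model` and `inputs` are placeholders: substitute the loaded
    model and a batch built by the project's own inference code.
    """
    # Reset the peak-memory counter so the measurement covers only this pass
    torch.cuda.reset_peak_memory_stats()

    with torch.inference_mode():
        model(**inputs)

    # Peak memory allocated by tensors on the current CUDA device, in GiB
    peak_gib = torch.cuda.max_memory_allocated() / 1024 ** 3
    print(f"Peak GPU memory: {peak_gib:.1f} GiB")
    return peak_gib
```

Note that `max_memory_allocated` reports tensor allocations only; the total footprint shown by `nvidia-smi` will be somewhat higher due to the CUDA context and allocator caching.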