Closed: tengyu-liu closed this issue 1 year ago
It only requires one A100 for inference, since the largest model is only ViT-Large. For retrieval, if your dataset is very large, you may need more GPUs for fast inference.
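For context, the retrieval step is typically a nearest-neighbor search over precomputed embeddings, and splitting the database across devices is what drives the extra-GPU requirement. Below is a minimal sketch of sharded top-k cosine retrieval; it uses plain NumPy arrays as stand-ins for per-GPU shards, and all names and sizes are illustrative, not from this repo:

```python
import numpy as np

def topk_per_shard(query, shard, k):
    # Cosine similarity: normalize both sides, then take dot products.
    q = query / np.linalg.norm(query)
    s = shard / np.linalg.norm(shard, axis=1, keepdims=True)
    sims = s @ q
    idx = np.argsort(-sims)[:k]          # local top-k within this shard
    return idx, sims[idx]

def sharded_retrieval(query, shards, k=5):
    # In a multi-GPU setup each shard would live on its own device and
    # the per-shard searches would run in parallel; here the shards are
    # plain arrays to illustrate the local-search + merge pattern.
    candidates = []
    offset = 0
    for shard in shards:
        idx, sims = topk_per_shard(query, shard, k)
        candidates.extend(zip(sims, offset + idx))  # map to global indices
        offset += len(shard)
    candidates.sort(reverse=True)        # merge: keep the global top-k
    return [int(i) for _, i in candidates[:k]]

# Illustrative usage: querying with a database row should return that row first.
rng = np.random.default_rng(0)
db = rng.normal(size=(300, 8))
shards = np.split(db, 3)                 # e.g. 3 "GPUs", 100 vectors each
top = sharded_retrieval(db[42], shards, k=5)
```

The merge step is cheap (k candidates per shard), so the per-shard similarity computation dominates, which is why adding GPUs speeds up retrieval roughly linearly.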
Thank you very much. I'm closing this issue.
Great work! The performance on public sources is very impressive. Congratulations!
The paper did not mention the cost of inference. How many A100s are needed for inference?