ChaoningZhang / MobileSAM

This is the official code for the MobileSAM project, which makes SAM lightweight for mobile applications and beyond!

The image embedding inference time is long #72

Open zhongwenkun886 opened 1 year ago

zhongwenkun886 commented 1 year ago

Below is my test code. I measure the running time of `predictor.set_image(image)`; on my laptop (RTX 3060) it takes about 50 ms on the GPU.

sam_checkpoint = "./weights/mobile_sam.pt" model_type = "vit_t" device = "cuda" if torch.cuda.is_available() else "cpu" sam = sam_model_registrymodel_type sam.to(device=device) sam.eval() predictor = SamPredictor(sam)

for i in range(10): torch.cuda.synchronize() start = time.time()

predictor.set_image(image)

torch.cuda.synchronize()
print('time cost = ', (time.time() - start) * 1000, "ms")
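Note that the first `set_image` call also pays one-time CUDA initialization and kernel-compilation costs, so it is worth excluding from the measurement. A minimal sketch of a warmed-up benchmark using `torch.cuda.Event` for timing, assuming the same `predictor` and `image` as above:

```python
import torch

# Warm-up: the first call absorbs one-time CUDA setup overhead.
predictor.set_image(image)
torch.cuda.synchronize()

start_evt = torch.cuda.Event(enable_timing=True)
end_evt = torch.cuda.Event(enable_timing=True)

times = []
for _ in range(10):
    start_evt.record()
    predictor.set_image(image)
    end_evt.record()
    torch.cuda.synchronize()  # wait until both events have completed
    times.append(start_evt.elapsed_time(end_evt))  # milliseconds

print(f"mean over {len(times)} runs: {sum(times) / len(times):.1f} ms")
```

Since `set_image` also does CPU-side preprocessing (resizing and normalizing the image) before the encoder runs, the interval between the two events covers that host work as well, not just the GPU kernels.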
noyami2033 commented 7 months ago

I also faced the same problem.