openmedlab / MedLSAM

MedLSAM: Localize and Segment Anything Model for 3D Medical Images
Apache License 2.0

Running inference with MedLAM's test config takes more than 40 minutes on Kaggle. What's wrong? #6

Open jumbojing opened 1 year ago

jumbojing commented 1 year ago

As the title says.

[image]

Are there any optimization methods?

......

LWHYC commented 1 year ago

> As the title says.
>
> [image]
>
> Are there any optimization methods?
>
> ......

Hi, sorry, I don't quite understand what "run on Kaggle" means or which dataset you are using here. The inference time depends on the number of target classes and the size of the query dataset, and most of the time is spent in SAM. We will try to optimize the inference time by increasing SAM's inference batch size.
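
For reference, the kind of batching described above is possible with the upstream `segment_anything` package, which accepts a batch of box prompts in a single decoder pass via `SamPredictor.predict_torch`. The sketch below is only an illustration of that idea, not MedLSAM's actual code; the checkpoint path, dummy image, and box coordinates are all hypothetical stand-ins:

```python
import numpy as np
import torch
from segment_anything import sam_model_registry, SamPredictor

device = "cuda" if torch.cuda.is_available() else "cpu"
# Hypothetical checkpoint path; substitute your own SAM/MedSAM weights.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth").to(device)
predictor = SamPredictor(sam)

# Stand-in for one 2D slice (RGB, HWC, uint8).
image = np.zeros((512, 512, 3), dtype=np.uint8)
predictor.set_image(image)  # the heavy image encoder runs once per slice

# Several box prompts (e.g. one per target class), segmented together in a
# single mask-decoder pass instead of one predict() call per box.
boxes = torch.tensor(
    [[100, 100, 200, 200],
     [250, 250, 400, 400]],
    dtype=torch.float, device=device,
)
transformed = predictor.transform.apply_boxes_torch(boxes, image.shape[:2])
masks, scores, _ = predictor.predict_torch(
    point_coords=None,
    point_labels=None,
    boxes=transformed,
    multimask_output=False,
)
print(masks.shape)  # (num_boxes, 1, H, W)
```

Since `set_image` amortizes the image-encoder cost and the prompt decoder is lightweight, running all boxes for a slice in one batched call should cut most of the per-prompt overhead, assuming GPU memory allows it.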