amazon-science / patchcore-inspection

Why is the mean inference speed faster than in the paper? #29

Open Classmate-Huang opened 2 years ago

Classmate-Huang commented 2 years ago

I have a question about inference time: the inference speed listed in your paper is 0.22 s per image (~5 FPS). But when I tested the inference speed with this open-source code, it was much faster, about 20 FPS. The GPU I use is an RTX 2080 Ti. Why is that?

Confusezius commented 2 years ago

That may very well come down to the cluster our models were trained and evaluated on, in particular its use of a much slower (but larger-memory) GPU. The most important aspect is the relative inference speeds in the table, and the provided value should thus give something of a lower(-ish) bound on the inference speed. Given the use of an RTX 2080 Ti, higher inference speeds should indeed be expected!
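
For anyone wanting to reproduce such numbers on their own hardware, a minimal timing sketch is below. It assumes a PyTorch model and input tensors already on the GPU; the `model` call here is a generic placeholder, not the repository's actual benchmarking code. The explicit `torch.cuda.synchronize()` calls matter because CUDA execution is asynchronous, and a warm-up pass avoids counting one-off setup costs.

```python
import time
import torch

@torch.no_grad()
def measure_fps(model, images, warmup=10):
    """Rough per-image GPU latency / FPS measurement.

    `model` is a placeholder for any PyTorch module that maps a
    (1, C, H, W) batch to scores; NOT the repository's own benchmark code.
    `images` is a list of CUDA tensors of shape (C, H, W).
    """
    model.eval()

    # Warm-up so one-off costs (CUDA context, cuDNN autotuning) are excluded.
    for img in images[:warmup]:
        _ = model(img.unsqueeze(0))
    torch.cuda.synchronize()  # CUDA launches are async; flush queued work first

    start = time.perf_counter()
    for img in images:
        _ = model(img.unsqueeze(0))
    torch.cuda.synchronize()  # ensure all GPU work has actually finished
    elapsed = time.perf_counter() - start

    per_image = elapsed / len(images)
    return per_image, 1.0 / per_image  # seconds per image, frames per second
```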

Classmate-Huang commented 2 years ago

Got it, thank you for your reply!

tommiekerssies commented 1 year ago

Where can I find the code for measuring inference time?