The input image:
One more question: what do NMS Threshold and Sample Points mean? I may need to adjust them myself.
On HuggingFace we currently use a preprocessing step that is not yet part of the model. Outside of HuggingFace you'd need to apply this manually to get the same results. You can find the function here: https://huggingface.co/spaces/ericup/celldetection/blob/d551587eddc1c8536336bbce073c32613b4f113a/prep.py#L43
Simply use the following line before running your model:
```python
img = multi_norm(img, 'cstm-mix')
```
We are planning to handle this automatically, but that is not available yet.
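For reference, here is a minimal sketch of how that step could be wired in outside of HuggingFace. It assumes you copy `prep.py` from the Space (linked above) next to your script; the image loader and file name are placeholders and not part of the official demo:

```python
# Sketch only: apply the Space's normalization before inference.
# Assumptions: prep.py (linked above) has been copied into the working directory,
# and 'example.png' stands in for your own input image.
from imageio.v2 import imread  # any image loader you already use works
from prep import multi_norm

img = imread('example.png')        # RGB image, shape (H, W, 3)
img = multi_norm(img, 'cstm-mix')  # same preprocessing the HuggingFace demo applies
# ...then feed `img` into the model exactly as in the provided demo code
```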
Regarding your other question:
- **NMS Threshold:** This is the Intersection over Union (IoU) threshold used in the Non-Maximum Suppression (NMS) algorithm, which suppresses overlapping detections whose IoU exceeds `nms_threshold`. More details can be found in the PyTorch NMS documentation.
- **Sample Points:** This parameter specifies the number of sample points that define each contour. For example, setting it to 128 means each contour is described by exactly 128 points.
To choose an NMS threshold, select a lower value (e.g., 0.3) for relatively strict suppression of overlapping detections, or a higher value (e.g., 0.5 or above) if overlap is acceptable. For sample points, use more points (e.g., 128 or more) for detailed contours in high-resolution images, or fewer points for simpler shapes or when computational efficiency is a priority.
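To make the threshold's effect concrete, here is a tiny illustration using `torchvision.ops.nms` (the PyTorch function referenced above) rather than the celldetection interface itself; the boxes and scores are made up:

```python
import torch
from torchvision.ops import nms

# Three detections: boxes 0 and 1 overlap heavily (IoU ~ 0.68), box 2 is far away.
boxes = torch.tensor([[0., 0., 10., 10.],
                      [1., 1., 11., 11.],
                      [20., 20., 30., 30.]])
scores = torch.tensor([0.9, 0.8, 0.7])

print(nms(boxes, scores, iou_threshold=0.3))  # tensor([0, 2]): the overlapping box is suppressed
print(nms(boxes, scores, iou_threshold=0.7))  # tensor([0, 1, 2]): IoU 0.68 < 0.7, so it is kept
```

With the stricter threshold (0.3) the heavily overlapping box is dropped; with 0.7 it survives.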
Thank you very much! Very fast response.
I downloaded the model and ran your provided demo on my server, and I noticed that the results differ when running inference on the same input image, compared to HuggingFace with your default parameters.

HuggingFace result:

Server result (using the provided demo code):
I want to set the parameters to match the HuggingFace demo.