ChaoningZhang / MobileSAM

This is the official code for MobileSAM project that makes SAM lightweight for mobile applications and beyond!

More than one bounding box #152

Open sulaimanvesal opened 5 months ago

sulaimanvesal commented 5 months ago

Hi all,

When I use more than one bounding box, I get the error below. I have checked everything; the shape of the bounding boxes is BxNx4.

Draw bounding boxes on the image and press 'q' when done.
Bounding box added: [522, 622, 662, 752]
Bounding box added: [742, 713, 871, 832]
User-Drawn Bounding Boxes: [[522 622 662 752]
 [742 713 871 832]] (2, 4)
Traceback (most recent call last):
  File "D:\Hanwha_Projects\Project2\DPM_MobileSAm\run_demo_ultralytics.py", line 70, in <module>
    masks_object, _, _ = model.predict(
  File "d:\hanwha_projects\project2\dpm_mobilesam\mobilesam\mobile_sam\predictor.py", line 155, in predict
    masks, iou_predictions, low_res_masks = self.predict_torch(
  File "C:\ProgramData\anaconda3\envs\pytorch_env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "d:\hanwha_projects\project2\dpm_mobilesam\mobilesam\mobile_sam\predictor.py", line 223, in predict_torch
    sparse_embeddings, dense_embeddings = self.model.prompt_encoder(
  File "C:\ProgramData\anaconda3\envs\pytorch_env\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "d:\hanwha_projects\project2\dpm_mobilesam\mobilesam\mobile_sam\modeling\prompt_encoder.py", line 159, in forward
    sparse_embeddings = torch.cat([sparse_embeddings, box_embeddings], dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 1 but got size 2 for tensor number 1 in the list.
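
For reference, a minimal sketch of how the upstream SAM predictor handles a batch of boxes: `predict()` expects a single box, while multiple boxes of shape (B, 4) go through `predict_torch()` after being mapped into the model's input frame with `predictor.transform.apply_boxes_torch()`. The checkpoint path and model type below follow the MobileSAM README and are assumptions about the local setup; the box coordinates are taken from the log above.

```python
import cv2
import torch
from mobile_sam import sam_model_registry, SamPredictor

# Load MobileSAM (checkpoint path is an assumption; adjust to your setup).
sam = sam_model_registry["vit_t"](checkpoint="./weights/mobile_sam.pt")
sam.to(device="cuda" if torch.cuda.is_available() else "cpu")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("image.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Boxes in XYXY pixel coordinates, shape (B, 4).
boxes = torch.tensor(
    [[522, 622, 662, 752],
     [742, 713, 871, 832]],
    device=predictor.device,
)

# Transform the boxes to the resized input frame, then use the batched path.
transformed_boxes = predictor.transform.apply_boxes_torch(boxes, image.shape[:2])
masks, scores, _ = predictor.predict_torch(
    point_coords=None,
    point_labels=None,
    boxes=transformed_boxes,
    multimask_output=False,
)
print(masks.shape)  # (B, 1, H, W): one mask per box
```

Passing the stacked (2, 4) array directly to `predict(box=...)` hits the single-prompt path, where the sparse embeddings keep a batch size of 1, which would explain the size mismatch in `prompt_encoder`.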