microsoft / GLIP

Grounded Language-Image Pre-training

bugs in GLIPDemo #102

Open pioneer-innovation opened 1 year ago

pioneer-innovation commented 1 year ago

Hi! I tried the Colab demo on my own server and found some bugs. I'm not sure whether these issues come from my deployment or from the code itself. Anyway, I've listed all the bugs I'm hitting, hoping it helps.

1. GLIPDemo.color

Traceback (most recent call last):
  File "/data/zqf/experiment/GLIP/demo.py", line 50, in <module>
    result, _ = glip_demo.run_on_web_image(image, caption, 0.5)
  File "/data/zqf/experiment/GLIP/maskrcnn_benchmark/engine/predictor_glip.py", line 147, in run_on_web_image
    result = self.overlay_entity_names(result, top_predictions)
  File "/data/zqf/experiment/GLIP/maskrcnn_benchmark/engine/predictor_glip.py", line 347, in overlay_entity_names
    image, s, (int(x), int(y)-text_offset_original), cv2.FONT_HERSHEY_SIMPLEX, text_size, (self.color, self.color, self.color), text_pixel, cv2.LINE_AA
AttributeError: 'GLIPDemo' object has no attribute 'color'

I guess this is because self.color is not defined in GLIPDemo.__init__(). The only place self.color is assigned is on line 163 of maskrcnn_benchmark/engine/predictor_glip.py, specifically inside GLIPDemo.visualize_with_predictions().
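
A minimal workaround sketch (not the upstream fix): since the traceback shows self.color used only as the grayscale tuple (self.color, self.color, self.color) in cv2.putText, setting a default on the demo object before inference avoids the crash. The value 255 below is an assumption (white text):

# Hypothetical workaround: give the already-constructed glip_demo a default
# text color so overlay_entity_names() does not fail before
# visualize_with_predictions() has ever been called.
if not hasattr(glip_demo, "color"):
    glip_demo.color = 255  # used as (B, G, R) = (color, color, color) in cv2.putText
result, _ = glip_demo.run_on_web_image(image, caption, 0.5)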

2. imshow() missing plt.show()

The imshow helper is missing a final plt.show() call, so the image is not displayed in PyCharm's SciView. I think the correct function should be:

import matplotlib.pyplot as plt

def imshow(img, caption):
    # convert BGR (OpenCV order) to RGB before displaying
    plt.imshow(img[:, :, [2, 1, 0]])
    plt.axis("off")
    plt.figtext(0.5, 0.09, caption, wrap=True, horizontalalignment='center', fontsize=20)
    plt.show()  # without this, nothing is rendered in SciView
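
For context, the helper is then called right after inference (illustrative call, mirroring the demo script in the traceback):

result, _ = glip_demo.run_on_web_image(image, caption, 0.5)
imshow(result, caption)
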
lclszsdnr commented 11 months ago

Hi, do you encounter this error: "ImportError: cannot import name '_C' from 'maskrcnn_benchmark' (/content/GLIP/maskrcnn_benchmark/__init__.py)"?

weinman commented 11 months ago

@lclszsdnr Perhaps see #89.
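
In case it helps, here is a quick check (a sketch, assuming the usual `python setup.py build develop` install step for this repo) to confirm whether the compiled extension was built at all:

# Minimal check: see whether the compiled extension maskrcnn_benchmark._C
# exists in this environment.
import importlib

try:
    importlib.import_module("maskrcnn_benchmark._C")
    print("maskrcnn_benchmark._C found - the C++/CUDA extension was built")
except ImportError as err:
    # Typically fixed by re-running the build step inside the GLIP repo,
    # e.g. `python setup.py build develop`, with a compatible torch installed.
    print("extension not built:", err)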

BrianG13 commented 11 months ago

@lclszsdnr Did you manage to fix that issue?

sylchw commented 11 months ago

Hi, to add on to this thread, I realized that the original GLIP Colab no longer runs out of the box.

To get it running again, I added extra steps such as installing Python 3.8 and pinning packages to versions from around the time GLIP was first released, which solved most of the compatibility issues. However, I am now stuck on the actual model inference itself. May I ask if anyone has run into this too?

"RuntimeError: Not compiled with GPU support"

Here is the link to the notebook https://colab.research.google.com/drive/1svOjYeltl-v6pqvAf9ir0YcMhe3Nd0DL?usp=sharing
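
A diagnostic sketch that may help narrow this down: the error usually means the maskrcnn_benchmark extension was compiled on a runtime where no GPU was visible, so the CUDA kernels were skipped. The FORCE_CUDA=1 hint below is an assumption carried over from the upstream maskrcnn_benchmark setup.py, not something confirmed in this thread:

# Diagnostic sketch: verify the runtime actually sees a GPU and that the
# installed torch build has CUDA support before rebuilding the extension.
import torch

print("torch version    :", torch.__version__)
print("torch CUDA build :", torch.version.cuda)        # None on CPU-only wheels
print("GPU visible now  :", torch.cuda.is_available())

# If a GPU is visible here but the demo still raises
# "RuntimeError: Not compiled with GPU support", the extension was most
# likely built earlier on a CPU-only runtime; rebuilding it on a GPU runtime
# (optionally exporting FORCE_CUDA=1 first) is the usual remedy.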

puppytag commented 6 months ago

Hello @sylchw,

First of all, I want to express my sincere gratitude for sharing your modified GLIP Colab notebook. Your efforts to update and maintain compatibility have been incredibly helpful.

I noticed your recent update about encountering the 'RuntimeError: Not compiled with GPU support' issue. This seems like a tricky problem, and I was wondering if you have managed to resolve it since your last update?

If you've found a solution, would it be possible for you to share an updated demo or any tips on how to overcome this challenge? Your insights would be invaluable to those of us who are following your work and trying to implement the GLIP model.

Thank you once again for your valuable contribution to the community.

Best regards.

andy-zt commented 2 months ago

Hi, I updated the GLIP demo Colab notebook so it runs with torch 2.2.1+cu121.

Here is the link to the notebook: https://colab.research.google.com/drive/1TJgEdZEblICDMFtztdURVVT5R_AFjrXU?usp=sharing