nomadoor closed this 9 months ago
I'm planning to add it. I'm currently working on the outputs of the VLM and other nodes, so it may take some time.
Thank you for the wonderful job! I’m looking forward to it.
I added this model to the repo 👍
It’s working perfectly. Thank you so much for the wonderful job😍 However, there is one thing that concerns me.
Some custom nodes like comfyui_controlnet_aux have started to fail on import.
It shows ImportError: Matplotlib requires dateutil>=2.7; you have 2.2, even though python-dateutil 2.8.2 is already installed. I ran pip install python-dateutil==2.7 and it started working normally, but could this be related to VLM_nodes?
Yes, it's related to that. I fixed it as you suggested, thank you 👍
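For anyone hitting the same error: pip reporting 2.8.2 while Matplotlib sees 2.2 usually means an older copy of dateutil shadows the newer one on sys.path, so reinstalling (as above) replaces the stale copy. As a rough illustration of how such minimum-version checks behave (a stdlib-only sketch, not Matplotlib's actual code):

```python
def version_tuple(v):
    # Naive parse: "2.8.2" -> (2, 8, 2); enough for plain X.Y.Z strings
    return tuple(int(part) for part in v.split("."))

# Illustrative minimum-version check in the style of the error message:
# "Matplotlib requires dateutil>=2.7; you have 2.2"
seen_by_matplotlib = "2.2"   # the stale copy found first on sys.path
required = "2.7"

print(version_tuple(seen_by_matplotlib) >= version_tuple(required))  # False
print(version_tuple("2.8.2") >= version_tuple(required))             # True
```

You can also confirm which copy Python actually imports by printing `dateutil.__version__` and `dateutil.__file__` in the same environment ComfyUI runs in.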
Thank you!!
I also added kosmos-2 and UForm-Gen2Qwen (this model is blazing fast like joytag but extremely good, almost llava 1.5 level).
That’s great news, I tried it out right away! UForm-Gen2Qwen isn’t bad, but considering the VRAM usage, I prefer llava-v1.6-mistral-7b, which uses about the same amount. Also, joytag, which barely uses any VRAM, is still tier 1 even now.
Definitely, llava 1.6 mistral is the number one choice by far.
The 4-bit quantized version of InternLM-XComposer2-VL, InternLM-XComposer2-VL-7b-4bit, has been released and shows excellent performance.
I would be very pleased if this could be supported. However, it might be better not to cram too much into a single repository.