Open faev999 opened 3 days ago
I modified the chat function in chat_ui.py like this:
```python
def chat(message, history, temperature, max_tokens):
    chat = []
    if len(message["files"]) >= 1:
        chat.append(message["text"])
    else:
        raise gr.Error("Please upload an image. Text only chat is not supported.")

    files = message["files"][-1]
    if model.config.model_type != "paligemma":
        messages = apply_chat_template(processor, config, message["text"], num_images=1)
    else:
        messages = message["text"]

    response = ""
    for chunk in stream_generate(
        model, processor, files, messages, image_processor, max_tokens, temp=temperature
    ):
        response += chunk
        yield response
```
Seems to work.
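For anyone unfamiliar with why the patched function yields inside the loop: Gradio's `ChatInterface` streams output by consuming a generator that yields progressively longer prefixes of the reply. A minimal, self-contained sketch of that pattern (the `stream_generate` here is a hypothetical stand-in for mlx_vlm's generator, not its real API):

```python
def stream_generate(chunks):
    # Hypothetical stand-in: a real backend would yield model tokens one at a time.
    for chunk in chunks:
        yield chunk

def chat(chunks):
    # Accumulate chunks and yield the running total after each one;
    # Gradio re-renders the chat bubble with every yielded prefix.
    response = ""
    for chunk in stream_generate(chunks):
        response += chunk
        yield response

print(list(chat(["Hel", "lo", "!"])))  # → ['Hel', 'Hello', 'Hello!']
```

Each yield hands Gradio a longer prefix of the final text, which is what produces the token-by-token typing effect in the UI.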
hi all, I had the following exception when trying to run the gradio example with:

```
python -m mlx_vlm.chat_ui --model mlx-community/Qwen2-VL-72B-Instruct-4bit
```

When using the CLI example with:

```
python -m mlx_vlm.generate --model mlx-community/Qwen2-VL-72B-Instruct-4bit --max-tokens 100 --temp 0.0 --image http://images.cocodataset.org/val2017/000000039769.jpg
```

there is no exception. I installed the package with:

```
pip install mlx-vlm
```

and have tried Python 3.12 and Python 3.10, getting the same result both times.