Blaizzy / mlx-vlm

MLX-VLM is a package for running Vision LLMs locally on your Mac using MLX.

Add chat UI with gradio #17

Closed · Blaizzy closed this 1 month ago

Blaizzy commented 1 month ago

This PR adds support for Gradio as the default chat UI.

You can start it from the CLI with:

```shell
python -m mlx_vlm.chat_ui --model "<HF_REPO_OR_LOCAL_PATH>"
```
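
For example, pointing `--model` at a quantized VLM on the Hugging Face Hub (the repo name below is illustrative, not prescribed by this PR):

```shell
python -m mlx_vlm.chat_ui --model mlx-community/llava-1.5-7b-4bit
```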

You can control the `temperature` and `max_tokens` parameters from the UI.
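
The `chat_ui` module itself isn't shown in this thread. As a rough, hypothetical sketch of how such a UI could be wired up, the following combines the `load`/`generate` API from the mlx-vlm README with Gradio sliders for the two exposed parameters; the model path and the `temp` keyword are assumptions and may not match the actual implementation:

```python
# Hypothetical sketch, not this PR's actual chat_ui implementation.
# Assumes the `load`/`generate` API shown in the mlx-vlm README;
# keyword names (e.g. `temp` vs. `temperature`) vary across versions.
import gradio as gr
from mlx_vlm import load, generate

# Illustrative model path; any MLX-converted VLM should work here.
model, processor = load("mlx-community/llava-1.5-7b-4bit")

def chat(prompt, image, temperature, max_tokens):
    # A real implementation would apply the model's chat template to the
    # prompt; this passes it through raw for brevity.
    return generate(
        model,
        processor,
        prompt=prompt,
        image=image,
        temp=temperature,
        max_tokens=max_tokens,
    )

with gr.Blocks() as demo:
    image = gr.Image(type="filepath", label="Image")
    prompt = gr.Textbox(label="Prompt")
    # The two sampling parameters exposed in the UI, per this PR.
    temperature = gr.Slider(0.0, 1.0, value=0.7, label="Temperature")
    max_tokens = gr.Slider(64, 1024, value=256, step=1, label="Max tokens")
    output = gr.Textbox(label="Response")
    prompt.submit(chat, [prompt, image, temperature, max_tokens], output)

demo.launch()
```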

Limitations:

Example:

[Screenshot of the Gradio chat UI, 2024-05-05]