ml-explore / mlx-examples

Examples in the MLX framework
MIT License

Add support for llava-hf/llava-v1.6-mistral-7b-hf #638

Closed. katopz closed this issue 7 months ago

katopz commented 7 months ago

Currently we do not support llava-hf/llava-v1.6-mistral-7b-hf:

ValueError: Model type mistral not supported. Currently only 'llama' is supported

It would be nice if we supported it.
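
For context, the error comes from a check on the language-model type parsed from the checkpoint's config: the llava example only wires up a Llama text backbone, while llava-v1.6-mistral-7b-hf declares a Mistral one. Below is a minimal, hypothetical sketch of that kind of dispatch (not the actual mlx-examples source; all names are illustrative):

```python
from dataclasses import dataclass


@dataclass
class TextConfig:
    model_type: str  # e.g. "llama" or "mistral", as read from the HF config.json


class LlamaModel:
    """Placeholder for the Llama text backbone the llava example supports."""

    def __init__(self, config: TextConfig):
        self.config = config


# Hypothetical registry: only a Llama backbone is registered, so any other
# model_type falls through to the error seen above.
LANGUAGE_MODELS = {"llama": LlamaModel}


def build_language_model(config: TextConfig):
    if config.model_type not in LANGUAGE_MODELS:
        raise ValueError(
            f"Model type {config.model_type} not supported. "
            "Currently only 'llama' is supported"
        )
    return LANGUAGE_MODELS[config.model_type](config)


if __name__ == "__main__":
    # llava-v1.6-mistral-7b-hf declares a "mistral" text config, so this raises.
    build_language_model(TextConfig(model_type="mistral"))
```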

jrp2014 commented 7 months ago

See #605

awni commented 7 months ago

Let's close this in favor of #605.

okpatil4u commented 6 months ago
python generate.py \
  --model llava-hf/llava-v1.6-mistral-7b-hf \
  --image "/Users/omkarpatil/Downloads/maxresdefault.jpg " \
  --prompt "USER: <image>\nWhat is this image about ?\nASSISTANT:" \
  --max-tokens 128 \
  --temp 0

ValueError: Model type mistral not supported. Currently only 'llama' is supported

I don't think the issue is resolved. I am using the May 8 commit (fad959837232b0b4deeaa29b147dc69cbc9c5d19).

awni commented 6 months ago

I don't think we ever added it. But you should check out MLX VLM by @Blaizzy: https://github.com/Blaizzy/mlx-vlm
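
For reference, a hedged example of running the same kind of request through the mlx-vlm CLI. The module path and flag names are taken from the mlx-vlm README and may have changed between versions, and a LLaVA 1.5 checkpoint is used here because LLaVA-NeXT (which llava-v1.6-mistral-7b-hf is based on) was still being added at the time:

```bash
pip install mlx-vlm

# Flag names are assumptions based on the mlx-vlm README; check the installed
# version's README if they differ.
python -m mlx_vlm.generate \
  --model llava-hf/llava-1.5-7b-hf \
  --image /Users/omkarpatil/Downloads/maxresdefault.jpg \
  --prompt "What is this image about?" \
  --max-tokens 128 \
  --temp 0
```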

Blaizzy commented 6 months ago

Thanks for the shoutout @awni!

@katopz we are extending support to all VLMs.

Currently we support LLaVA, Idefics 2, and nanoLLaVA, with many more coming, such as Qwen-VL, Bunny, and InternVL, including LLaVA-NeXT, which this model is based on.

If you are interested, you can send us a PR; I will be there to help if needed :)
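
If it helps, here is a rough sketch of the Python-API route described in the mlx-vlm README; the `load`/`generate` names, the argument order, and the example checkpoint are assumptions that may differ between releases:

```python
from mlx_vlm import load, generate

# Example LLaVA 1.5 checkpoint from one of the supported families listed above;
# the exact repo name here is an assumption.
model, processor = load("llava-hf/llava-1.5-7b-hf")

prompt = "USER: <image>\nWhat is this image about?\nASSISTANT:"

# The argument order (image vs. prompt) and keyword names have varied between
# mlx-vlm releases; consult the README of the installed version.
output = generate(
    model,
    processor,
    "/Users/omkarpatil/Downloads/maxresdefault.jpg",
    prompt,
    max_tokens=128,
    temp=0.0,
)
print(output)
```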