lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Failed to load model: Model type mllama not supported #208

Open chronick opened 3 days ago

chronick commented 3 days ago

Attempting to load the MLX vision model https://huggingface.co/mlx-community/Llama-3.2-11B-Vision-Instruct-8bit (mllama architecture).

Is this meant to be supported on my machine/version? I can't seem to find any docs about it.

🥲 Failed to load the model

Failed to load model

Error when loading model: ValueError: Model type mllama not supported.

macOS 15.1, M4 Pro; LM Studio 0.3.5; Runtime: LM Studio MLX 0.0.14

I also have the Metal llama.cpp runtime 1.2.0 installed, but I don't believe it is being used here.
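
For reference, here is a minimal sketch of how I'd expect to reproduce this outside LM Studio, assuming the bundled MLX runtime wraps mlx-lm's loader (an assumption on my part; I haven't verified what the engine actually calls):

```python
# Sketch: try loading the same repo with mlx-lm directly.
# Requires: pip install mlx-lm
from mlx_lm import load

# On mlx-lm versions that don't implement the mllama (Llama 3.2 Vision)
# architecture, this raises:
#   ValueError: Model type mllama not supported.
model, tokenizer = load("mlx-community/Llama-3.2-11B-Vision-Instruct-8bit")
```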

YorkieDev commented 8 hours ago

@chronick this model is not yet supported in the current stable version of LM Studio. See: https://github.com/lmstudio-ai/mlx-engine/issues/5#issuecomment-24480881

NB: it's also not supported in llama.cpp, so GGUFs of it won't load either.
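
If it helps, here's a quick way to check which architecture a Hugging Face repo declares before trying to load it. This is just a convenience sketch using huggingface_hub, not something LM Studio itself exposes:

```python
# Inspect the model_type declared in a repo's config.json.
# Requires: pip install huggingface_hub
import json
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(
    repo_id="mlx-community/Llama-3.2-11B-Vision-Instruct-8bit",
    filename="config.json",
)
with open(config_path) as f:
    config = json.load(f)

# "mllama" is the type the loader reports as unsupported in the error above.
print(config.get("model_type"))
print(config.get("architectures"))  # e.g. a Mllama* class name
```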