Open chronick opened 3 days ago
@chronick this model is not yet supported in the current stable version of LM Studio. See: https://github.com/lmstudio-ai/mlx-engine/issues/5#issuecomment-24480881
NB: it's also not supported in llama.cpp, so GGUFs of it won't load.
Attempting to load vision model with `mllama` MLX architecture

https://huggingface.co/mlx-community/Llama-3.2-11B-Vision-Instruct-8bit

Is this meant to be supported on my machine/version? I can't seem to find any docs about it.
macOS 15.1, M4 Pro
LM Studio 0.3.5
Runtime: LM Studio MLX 0.0.14
I also have Metal Llama.cpp 1.2.0 but I don't believe it is being used.
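For context on why the loader rejects the model: support is determined by the architecture the repo declares in its `config.json`. Below is a minimal Python sketch (using `huggingface_hub`, not an LM Studio or mlx-engine API) for checking what a given repo declares; the assumption is that this repo reports `mllama` in the usual `model_type`/`architectures` fields, which neither the MLX engine nor llama.cpp recognized at the time of this issue.

```python
# Minimal sketch: inspect the architecture a Hugging Face repo declares.
# Requires `pip install huggingface_hub`. Downloads only config.json, not weights.
import json

from huggingface_hub import hf_hub_download

REPO_ID = "mlx-community/Llama-3.2-11B-Vision-Instruct-8bit"

# Fetch just the model's config.json (a few KB).
config_path = hf_hub_download(repo_id=REPO_ID, filename="config.json")

with open(config_path) as f:
    config = json.load(f)

# Expected (assumption) for this repo: model_type "mllama", which is the
# architecture the engine needs to support in order to load the model.
print("model_type:", config.get("model_type"))
print("architectures:", config.get("architectures"))
```

If the printed architecture isn't in the engine's supported list, the load will fail regardless of machine or OS version, so this is a quick way to tell "unsupported model" apart from a local setup problem.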