qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.
https://twitter.com/stablequan
MIT License

Cannot run local MLX model not in mlx-community? #187

Closed 100ZZ closed 1 month ago

100ZZ commented 2 months ago
  1. Download a Hugging Face LLM (Qwen2-7B-Instruct).
  2. Convert it to an MLX model (Qwen2-7B-Instruct → Qwen2-7B-Instruct-MLX).
  3. Copy the MLX model to **/chat_with_mlx/models/download/Qwen2-7B-Instruct-MLX.
  4. Add a config file under /chat_with_mlx/models/configs/.yaml
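For step 2, the conversion is typically done with the `mlx_lm.convert` utility from the mlx-lm package. A sketch of the command (flags assume a recent mlx-lm release; the output path is chosen to match step 3):

```shell
# Convert the Hugging Face checkpoint to MLX format.
# --hf-path:  source repo id (or a local directory already downloaded)
# --mlx-path: output directory for the converted weights
# -q:         optional, quantize the weights during conversion
python -m mlx_lm.convert \
    --hf-path Qwen/Qwen2-7B-Instruct \
    --mlx-path ./Qwen2-7B-Instruct-MLX \
    -q
```

The resulting `Qwen2-7B-Instruct-MLX` directory is what gets copied into `chat_with_mlx/models/download/` in step 3.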

If a model (like Qwen2-7B-Instruct-MLX) does not exist in mlx-community, is there no way to run it, even after modifying the yaml config?
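For reference, a minimal config entry for step 4 might look like the sketch below. The field names here are assumptions modeled on the configs bundled in `chat_with_mlx/models/configs/` and should be checked against one of those files; the core question is whether the repo field can point at something outside mlx-community:

```yaml
# Hypothetical config sketch -- field names are guesses; verify against
# an existing file in chat_with_mlx/models/configs/ before use.
original_repo: Qwen/Qwen2-7B-Instruct   # upstream Hugging Face repo
mlx-repo: Qwen2-7B-Instruct-MLX         # normally an mlx-community repo id;
                                        # here the locally converted model
quantize: 4bit                          # match how the model was converted
default_language: multi
```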