johnmai-dev / ChatMLX

🤖✨ChatMLX is a modern, open-source, high-performance chat application for macOS based on large language models.
Apache License 2.0

Feature Request: Local server and system instruction presets etc. #26

Open · czkoko opened this issue 1 week ago

czkoko commented 1 week ago

Thanks for your work; the elegant interface design and lightweight client are great.

I have a few new feature suggestions:

  1. Start an OpenAI-API-compatible server locally, so that AI coding plugins for VS Code can use ChatMLX as a backend.
  2. Allow customizing a set of system-instruction presets, so that the role the model should play can be switched easily (see the sketch after this list).
  3. Allow customizing the model save directory; the models are large and well suited to storage on an external SSD.
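
For suggestion 2, here is a minimal sketch of what a system-instruction preset could look like in Swift. The type names, file location, and JSON layout are assumptions for illustration only, not ChatMLX's actual implementation:

```swift
import Foundation

// Hypothetical shape of a system-instruction preset (names are illustrative).
struct SystemPromptPreset: Codable, Identifiable {
    let id: UUID
    var name: String          // e.g. "Swift Code Reviewer"
    var systemPrompt: String  // text injected as the system message
}

// Load and save presets as a JSON file under Application Support.
enum PresetStore {
    static let fileURL = FileManager.default
        .urls(for: .applicationSupportDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("ChatMLX/presets.json")

    static func load() -> [SystemPromptPreset] {
        guard let data = try? Data(contentsOf: fileURL) else { return [] }
        return (try? JSONDecoder().decode([SystemPromptPreset].self, from: data)) ?? []
    }

    static func save(_ presets: [SystemPromptPreset]) throws {
        try FileManager.default.createDirectory(
            at: fileURL.deletingLastPathComponent(),
            withIntermediateDirectories: true)
        let data = try JSONEncoder().encode(presets)
        try data.write(to: fileURL, options: .atomic)
    }
}
```

Switching roles would then just mean picking a preset and using its `systemPrompt` as the conversation's system message.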
johnmai-dev commented 1 week ago

Thank you for your suggestions. Suggestion 3 is currently in development; suggestions 1 and 2 are also being planned.
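
As a side note on suggestion 3: a sandboxed macOS app usually needs a security-scoped bookmark to keep access to a user-chosen folder (for example on an external SSD) across launches. A rough sketch of that pattern, with illustrative names that are not ChatMLX's actual code:

```swift
import Foundation

// Hypothetical helper for remembering a user-selected model directory.
enum ModelDirectory {
    static let bookmarkKey = "modelDirectoryBookmark"

    // Persist a security-scoped bookmark for a folder picked in an NSOpenPanel.
    static func remember(_ url: URL) throws {
        let bookmark = try url.bookmarkData(
            options: .withSecurityScope,
            includingResourceValuesForKeys: nil,
            relativeTo: nil)
        UserDefaults.standard.set(bookmark, forKey: bookmarkKey)
    }

    // Resolve the bookmark and start accessing the folder; the caller should
    // later call stopAccessingSecurityScopedResource() on the returned URL.
    static func resolve() -> URL? {
        guard let data = UserDefaults.standard.data(forKey: bookmarkKey) else { return nil }
        var isStale = false
        guard let url = try? URL(
            resolvingBookmarkData: data,
            options: .withSecurityScope,
            relativeTo: nil,
            bookmarkDataIsStale: &isStale) else { return nil }
        _ = url.startAccessingSecurityScopedResource()
        return url
    }
}
```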

czkoko commented 6 days ago

In addition, I'd like to report a few issues:

  1. If you delete a message while the model is still replying to it, the app will crash.
  2. When the conversation list in the left column has been cleared, sending a message to the model will crash the app.
  3. mlx-community/gemma-2-9b-it-4bit outputs an extra `<end_of_turn>` string at the end of each reply (a possible client-side workaround is sketched below).
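
Until the chat-template handling is fixed, a small post-processing step could trim the stray marker from the decoded text. A sketch, where the function name and the default stop-string list are assumptions rather than ChatMLX's actual code:

```swift
import Foundation

// Strip a trailing end-of-turn marker (e.g. Gemma's <end_of_turn>) that some
// chat templates can leave at the end of the decoded reply.
func stripTrailingStopTokens(_ text: String,
                             stopTokens: [String] = ["<end_of_turn>"]) -> String {
    var output = text.trimmingCharacters(in: .whitespacesAndNewlines)
    for token in stopTokens {
        while output.hasSuffix(token) {
            output = String(output.dropLast(token.count))
            output = output.trimmingCharacters(in: .whitespacesAndNewlines)
        }
    }
    return output
}

// Example: stripTrailingStopTokens("Hello!<end_of_turn>") == "Hello!"
```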