alexrozanski / LlamaChat

Chat with your favourite LLaMA models in a native macOS app
https://llamachat.app
MIT License
1.43k stars · 53 forks

Chinese LLaMA/Alpaca support #16

Closed · ymcui closed 1 year ago

ymcui commented 1 year ago

Dear LlamaChat Maintainer,

Greetings from Yiming Cui, maintainer of Chinese-LLaMA/Alpaca. Thank you for your efforts in making LLaMA-like models more accessible to the community. I just ran a quick test, loading our Chinese-Alpaca-7B/13B models (ggml format), and they worked without any errors. The outputs closely resemble those generated by llama.cpp, so I believe there are no major issues with support for Chinese models. I plan to include a description of LlamaChat on our project page and am looking forward to future updates. Thank you.

Best regards, Yiming

P.S. As a quick suggestion, some users might be interested in adjusting advanced hyperparameter settings (such as temperature, top-k, top-p, and thread count) to generate more diverse outputs. (Forgive me if this feature is already implemented in LlamaChat.)
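To make the suggestion concrete, here is a minimal sketch (in Swift, since LlamaChat is a native macOS app) of the kind of sampling settings such a UI might expose. The `SamplingParameters` type and its property names are hypothetical, not LlamaChat's actual API; the default values only mirror common llama.cpp-style settings.

```swift
import Foundation

// Hypothetical sketch of user-adjustable sampling settings; not LlamaChat's real API.
struct SamplingParameters {
    var temperature: Double = 0.8   // higher values produce more diverse output
    var topK: Int = 40              // sample only from the K most likely tokens
    var topP: Double = 0.95         // nucleus sampling: keep tokens within this probability mass
    var threadCount: Int = ProcessInfo.processInfo.activeProcessorCount
}

// Example: a preset a user might choose for more creative generations.
let creative = SamplingParameters(temperature: 1.1, topK: 100, topP: 0.98)
print("temperature=\(creative.temperature), topK=\(creative.topK), " +
      "topP=\(creative.topP), threads=\(creative.threadCount)")
```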

Screenshot: WX20230417-083210@2x

alexrozanski commented 1 year ago

Hey @ymcui, thanks so much for testing, and for your contributions! As you discovered, LlamaChat technically works with any compatible .ggml or .pth file, but I will be adding dedicated UI for Chinese-LLaMA/Alpaca soon!

And yes, would love to have a link to LlamaChat from the project page 🙏🏼

ymcui commented 1 year ago

Hello,

I have included links to LlamaChat.

Cheers 🎉

alexrozanski commented 1 year ago

thanks @ymcui!