Closed — ymcui closed this issue 1 year ago
Hey @ymcui, thanks so much for testing, and for your contributions! As you discovered, LlamaChat technically works with any compatible `.ggml` or `.pth` file, but I will be adding dedicated UI for Chinese-LLaMA/Alpaca soon!
And yes, would love to have a link to LlamaChat from the project page 🙏🏼
Hello,
I have included links for LlamaChat (the changes will be merged into the `main` branch tomorrow). Cheers 🎉
thanks @ymcui!
Dear LlamaChat Maintainer,
Greetings from Yiming Cui, the Chinese-LLaMA/Alpaca maintainer. I would like to thank you for your efforts in making LLaMA-like models more accessible to the community. I just conducted a quick test, loading our Chinese-Alpaca-7B/13B model (ggml format), and it functioned without any errors. The system outputs closely resemble those generated by llama.cpp, and I believe there are no major issues concerning the support of Chinese models. I plan to include a description of LlamaChat on our project page and I am looking forward to future updates of LlamaChat. Thank you.
Best regards, Yiming
P.S. As a quick suggestion, some users might be interested in using advanced hyper-parameter settings (such as temperature, top-k, top-p, threads, etc.) to generate more diverse outputs. (forgive me if this feature is already implemented in LlamaChat)
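For readers unfamiliar with those knobs: LlamaChat's actual sampling code isn't shown in this thread, but as a rough sketch of what temperature, top-k, and top-p do during generation (assuming plain per-token logits as input; all names here are illustrative, not LlamaChat's API):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_k=40, top_p=0.95):
    """Illustrative sketch: pick a token id from raw logits using
    temperature scaling, top-k, and top-p (nucleus) filtering."""
    # Temperature: scale logits before softmax; lower values sharpen
    # the distribution (more deterministic), higher values flatten it.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax over the scaled logits.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # Top-k: keep only the k most probable tokens.
    probs.sort(key=lambda pair: pair[1], reverse=True)
    probs = probs[:top_k]
    # Top-p: keep the smallest prefix whose cumulative mass >= top_p.
    kept, cum = [], 0.0
    for tok, p in probs:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    # Renormalise over the surviving tokens and draw one at random.
    norm = sum(p for _, p in kept)
    r = random.random() * norm
    for tok, p in kept:
        r -= p
        if r <= 0:
            return tok
    return kept[-1][0]
```

With `top_k=1` (or a very low temperature) this degenerates to greedy decoding; raising temperature and top-p widens the pool of candidate tokens, which is what produces the "more diverse outputs" mentioned above. A threads setting, by contrast, only affects inference speed, not the sampled text.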
Screenshot: ![WX20230417-083210@2x](https://user-images.githubusercontent.com/16095339/232353156-b635b311-7645-4657-a31c-57b96b701353.png)