I have been using this repo and have found it quite beneficial for my projects. I'm reaching out to suggest, and inquire about, a potential integration with FastChat. Integrating with FastChat could expand the repo's utility and make it more accessible for real-time chat applications.
Integration with FastChat:
FastChat is becoming increasingly popular for chat-based applications, and a seamless integration between PMC-LLaMA and FastChat could be immensely beneficial. It would let users quickly deploy the chat models in real-time environments, making the process more streamlined and user-friendly.
System Prompts and Prompt Templates:
Additionally, I wanted to ask whether there are any system prompts or prompt templates that are recommended or required for chat interactions with the models. A standardized set of prompts or templates would ensure consistency in chat interactions and improve the overall user experience, because currently the model just autocompletes the input prompt, even for axiong/PMC_LLaMA_13B.
You could add your prompt template here: https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py
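To illustrate what I mean by a template, here is a minimal sketch of how a chat prompt could be assembled before being sent to the model. The system prompt, role markers (`### User:` / `### Assistant:`), and separator are all assumptions on my part; the maintainers would know the actual format used during instruction tuning, and that format is what would go into FastChat's `conversation.py` as a registered template.

```python
# Hypothetical prompt template for PMC_LLaMA_13B — a sketch, not the
# official format. The system prompt and "### Role:" markers below are
# assumptions; replace them with whatever the model was tuned on.

def build_prompt(system_message: str, turns: list[tuple[str, str]]) -> str:
    """Join (role, message) turns into one prompt string, ending with the
    assistant role marker so the model completes a reply rather than
    autocompleting the user's text."""
    parts = [system_message]
    for role, message in turns:
        parts.append(f"### {role}: {message}")
    parts.append("### Assistant:")  # open marker for the model to fill
    return "\n".join(parts)

prompt = build_prompt(
    "You are a helpful medical assistant.",  # assumed system prompt
    [("User", "What are common symptoms of anemia?")],
)
```

With the real template registered in FastChat, this kind of formatting would happen automatically for every conversation turn instead of being left to each downstream user.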
I believe these suggestions, if implemented, could greatly enhance the utility and adoption of your repo. I'm looking forward to your thoughts on this.