chaoyi-wu / PMC-LLaMA

The official codes for "PMC-LLaMA: Towards Building Open-source Language Models for Medicine"

Integration with FastChat and Inquiry About System Prompts #14

Closed Akshay1-6180 closed 11 months ago

Akshay1-6180 commented 1 year ago

I have been using this repo and have found it to be quite beneficial for my projects. I'm reaching out to suggest and inquire about a potential integration with FastChat. Integrating with FastChat could expand the repo's utility and make the models more accessible for real-time chat applications.

Integration with FastChat: FastChat is becoming increasingly popular for various chat-based applications, and a seamless integration between PMC-LLaMA and FastChat could be immensely beneficial. It would enable users to quickly deploy chat models in real-time environments, making the process more streamlined and user-friendly.

System Prompts and Prompt Templates: Additionally, I wanted to ask whether there are any system prompts or prompt templates that are recommended or required for chat interactions with the models. A standardized set of prompts or templates would ensure consistency in chat interactions and improve the overall user experience; currently the model just autocompletes the input prompt, even for axiong/PMC_LLaMA_13B. You could add your prompt template here: https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py
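For illustration, a conversation template along these lines might work. This is only a sketch: it assumes a recent FastChat version whose `Conversation` dataclass takes `name` and `system_message` (older releases use a `system` field plus positional `messages` and `offset`), an Alpaca-style instruction format (adjust to whatever prompt the instruction-tuned checkpoint was actually trained with), and the hypothetical template name `pmc_llama`.

```python
# Sketch: registering a PMC-LLaMA conversation template in FastChat.
# Assumptions: recent FastChat (Conversation takes `name`/`system_message`),
# an Alpaca-style prompt, and the hypothetical template name "pmc_llama".
from fastchat.conversation import (
    Conversation,
    SeparatorStyle,
    register_conv_template,
)

register_conv_template(
    Conversation(
        name="pmc_llama",  # hypothetical template name
        system_message=(
            "Below is an instruction that describes a task. "
            "Write a response that appropriately completes the request."
        ),
        roles=("### Instruction", "### Response"),
        sep_style=SeparatorStyle.ADD_COLON_TWO,
        sep="\n\n",
        sep2="</s>",
    )
)
```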

I believe these suggestions, if implemented, could greatly enhance the utility and adoption of your repo. I'm looking forward to your thoughts on this.

WeixiongLin commented 11 months ago

Thanks! I agree that inappropriate prompts have been an obstacle. We are working on adapting to the FastChat prompt template now.
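For reference, once such a template is registered (the `pmc_llama` name above is hypothetical), FastChat would build the final prompt string roughly like this:

```python
# Sketch: how FastChat turns a registered template into the prompt string
# that is sent to the model ("pmc_llama" is the hypothetical name above).
from fastchat.conversation import get_conv_template

conv = get_conv_template("pmc_llama")
conv.append_message(conv.roles[0], "What conditions is amoxicillin used to treat?")
conv.append_message(conv.roles[1], None)  # empty slot for the model's reply
prompt = conv.get_prompt()  # pass this to the model instead of the raw question
```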