deepseek-ai / DeepSeek-Coder

DeepSeek Coder: Let the Code Write Itself
https://coder.deepseek.com/
MIT License

HF chat-ui Prompt Template (DeepSeek Coder 6.7B) #104

Open GANJAC opened 8 months ago

GANJAC commented 8 months ago

Hi everyone, for my project I want to use HF chat-ui (https://github.com/huggingface/chat-ui) with this great model served by the llama.cpp server:

https://huggingface.co/TheBloke/deepseek-coder-6.7B-instruct-GGUF

Please, can someone give me the correct "chatPromptTemplate" to use in the .env file?

Any suggestions are super appreciated!

Thank you all!

SyedTahirHussan commented 1 month ago

To integrate HF chat-ui with the deepseek-coder-6.7B-instruct-GGUF model served by llama.cpp, you'll need to define the correct chatPromptTemplate in the .env file. This template controls how the conversation history is serialized into the single prompt string sent to the model.

Here's a basic structure that you can use:

chatPromptTemplate="{{#each messages}}{{#ifUser}}### Instruction:\n{{content}}\n{{/ifUser}}{{#ifAssistant}}### Response:\n{{content}}\n<|EOT|>\n{{/ifAssistant}}{{/each}}### Response:\n"

Breakdown:

{{#each messages}} iterates over the conversation history.
{{#ifUser}} / {{#ifAssistant}} wrap each turn in the "### Instruction:" / "### Response:" markers that deepseek-coder-6.7B-instruct was trained on (this is the format shown on the TheBloke model card).
<|EOT|> is the model's end-of-turn token, and the trailing "### Response:" cues the model to generate its reply.

Note: chat-ui templates use Handlebars-style helpers, and the exact helper names can change between chat-ui versions, so double-check this against the chat-ui README for the version you're running.
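To sanity-check what that template expands to, here is a small Python sketch. The render_prompt helper is hypothetical (it is not part of chat-ui or llama.cpp); it just renders a chat-ui-style message list into the DeepSeek Coder instruct format the same way the template above would:

```python
# Hypothetical helper: expand a message list into the DeepSeek Coder
# instruct prompt, mirroring the chat-ui chatPromptTemplate above.
# Message dicts use {"from": "user"|"assistant", "content": "..."}.

def render_prompt(messages, preprompt=None):
    parts = []
    if preprompt:
        # Optional system preprompt, emitted once at the top.
        parts.append(preprompt + "\n")
    for msg in messages:
        if msg["from"] == "user":
            parts.append("### Instruction:\n" + msg["content"] + "\n")
        else:
            # Completed assistant turns end with the model's end-of-turn token.
            parts.append("### Response:\n" + msg["content"] + "\n<|EOT|>\n")
    # Trailing "### Response:" cues the model to generate the next reply.
    parts.append("### Response:\n")
    return "".join(parts)

if __name__ == "__main__":
    print(render_prompt(
        [{"from": "user", "content": "Write a function that reverses a string."}]
    ))
```

If the prompt your server receives doesn't look like this (one "### Instruction:" block per user turn, ending in a bare "### Response:"), the template isn't being applied correctly.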

Additional suggestions:

- Stop sequences: llama.cpp handles tokenization server-side, so you don't need to tokenize anything yourself; just make sure <|EOT|> (and optionally "### Instruction:") is configured as a stop string so generation halts cleanly at the end of each response.
- Environment configuration: ensure your .env file points the model entry at your llama.cpp server's endpoint, and that any ports or API keys needed by chat-ui are set.
- Prompt tweaking: depending on how the model behaves, you may need to adjust the template (e.g. add a system preprompt) or the server's sampling settings to improve response quality.
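For reference, here is a sketch of what the MODELS entry in .env might look like. Treat the endpoint type and field names as assumptions to verify against the chat-ui README for your version; the URL and parameter values below are placeholders:

```env
MODELS=`[
  {
    "name": "deepseek-coder-6.7b-instruct",
    "chatPromptTemplate": "{{#each messages}}{{#ifUser}}### Instruction:\n{{content}}\n{{/ifUser}}{{#ifAssistant}}### Response:\n{{content}}\n<|EOT|>\n{{/ifAssistant}}{{/each}}### Response:\n",
    "parameters": {
      "temperature": 0.2,
      "max_new_tokens": 1024,
      "stop": ["<|EOT|>"]
    },
    "endpoints": [
      {
        "type": "llamacpp",
        "url": "http://localhost:8080"
      }
    ]
  }
]`
```

The "stop" parameter here is what keeps the model from continuing past its own turn into a new "### Instruction:" block.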