deepseek-ai / DeepSeek-Coder

DeepSeek Coder: Let the Code Write Itself
https://coder.deepseek.com/
MIT License

Prompt format of chat model #30

Closed · anxietymonger closed this issue 1 year ago

anxietymonger commented 1 year ago

For the chat model, users need to know the prompt format in order to deploy the model correctly. Are there any plans to add instructions about the prompt format?

Like this: https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0#prompt-format

DejianYang commented 1 year ago

You can use tokenizer.apply_chat_template(messages, return_tensors="pt") (requires transformers >= 4.35), or follow the template below to build the instruction prompt yourself (a short usage sketch follows the template):

You are an AI programming assistant, utilizing the DeepSeek Coder model, developed by DeepSeek Company, and you only answer questions related to computer science. For politically sensitive questions, security and privacy issues, and other non-computer science questions, you will refuse to answer.
### Instruction:
['content']
### Response:
['content']
<|EOT|>
### Instruction:
['content']
### Response:
anxietymonger commented 1 year ago

Thank you very much for your quick response; it perfectly solved my problem! There are many deployment tools other than transformers out there, and knowing the prompt format is extremely helpful when using them. Other users would also appreciate it if the format were documented in your model card on Hugging Face. Closing the issue.
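
For readers deploying with tools that do not apply the chat template automatically, here is a minimal sketch of rendering the template above by hand. The helper name is hypothetical, and the exact whitespace may differ slightly from the official chat template, so prefer tokenizer.apply_chat_template where it is available.

```python
# Hypothetical helper that renders messages into the template shown in the reply above.
# Exact whitespace may differ from the official chat template; prefer
# tokenizer.apply_chat_template where available.
SYSTEM_PROMPT = (
    "You are an AI programming assistant, utilizing the DeepSeek Coder model, "
    "developed by DeepSeek Company, and you only answer questions related to "
    "computer science. For politically sensitive questions, security and privacy "
    "issues, and other non-computer science questions, you will refuse to answer."
)

def build_deepseek_coder_prompt(messages, system_prompt=SYSTEM_PROMPT):
    """Render a list of {"role": ..., "content": ...} dicts into a prompt string."""
    parts = [system_prompt]
    for message in messages:
        if message["role"] == "user":
            parts.append(f"### Instruction:\n{message['content']}")
        elif message["role"] == "assistant":
            parts.append(f"### Response:\n{message['content']}\n<|EOT|>")
    parts.append("### Response:")  # leave the final turn open for the model to complete
    return "\n".join(parts) + "\n"

prompt = build_deepseek_coder_prompt(
    [{"role": "user", "content": "Write a quicksort function in Python."}]
)
print(prompt)
```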