continuedev / contribution-ideas

A Repo to which to Attach Contribution Ideas

Support a new chat template #17

Open sestinj opened 10 months ago

sestinj commented 10 months ago

The llama2 and codellama families of models use a chat template that looks like this:

```
[INST] <<SYS>>
{system_message}
<</SYS>>

{user_input} [/INST] {response}
```
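
For orientation, a chat template here is essentially a function that serializes an array of role/content messages into a single prompt string. Below is a minimal sketch of the llama2/codellama format above, assuming a simple `ChatMessage` shape; the exact type and function names in chat.ts may differ:

```typescript
// Sketch only: the real message type and signature in chat.ts may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function templateLlama2Messages(msgs: ChatMessage[]): string {
  let systemBlock = "";
  if (msgs[0]?.role === "system") {
    // The system message is wrapped in <<SYS>> tags inside the first [INST].
    systemBlock = `<<SYS>>\n${msgs[0].content}\n<</SYS>>\n\n`;
    msgs = msgs.slice(1);
  }
  let prompt = "";
  for (let i = 0; i < msgs.length; i++) {
    const msg = msgs[i];
    if (msg.role === "user") {
      const sys = i === 0 ? systemBlock : "";
      prompt += `[INST] ${sys}${msg.content} [/INST] `;
    } else {
      // Assistant turns are appended as plain text between [INST] blocks.
      prompt += `${msg.content} `;
    }
  }
  return prompt;
}
```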

But other models use different templates. For example, the Alpaca series of models uses a pattern like this:

```
### Instruction: {system_message}

### Input: {user_input}

### Response: {response}
```
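
The Alpaca format fits the same function shape; only the string layout changes. A sketch under the same assumptions as above:

```typescript
// Sketch only: same assumed message shape; only the serialization differs.
function templateAlpacaMessages(
  msgs: { role: "system" | "user" | "assistant"; content: string }[]
): string {
  let prompt = "";
  if (msgs[0]?.role === "system") {
    prompt += `### Instruction: ${msgs[0].content}\n\n`;
    msgs = msgs.slice(1);
  }
  for (const msg of msgs) {
    if (msg.role === "user") {
      prompt += `### Input: ${msg.content}\n\n`;
    } else {
      prompt += `### Response: ${msg.content}\n\n`;
    }
  }
  // End with an open "### Response:" so the model's completion is the reply.
  prompt += "### Response: ";
  return prompt;
}
```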

To add a new chat template, you should:

  1. Add the chat template to chat.ts.
  2. Add a template for edits in edit.ts, following the pattern used there of starting the response for the LLM (see the edit-template sketch after this list).
  3. Add a new value to the TemplateType type, and update the corresponding array in config_schema.json.
  4. Update the autodetectTemplateType, autodetectTemplateFunction, and autodetectPromptTemplates functions in core/llm/index.ts (see the second sketch after this list).
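
To make step 2 concrete: edit templates end by opening the response (and usually a code block) so the model only has to continue with the rewritten code. Here is a hedged sketch in the Alpaca style; the placeholder names ({{{userInput}}}, {{{codeToEdit}}}, {{{language}}}) and wording are assumptions, so follow whatever the existing templates in edit.ts actually use:

```typescript
// Sketch only: placeholder names and wording are assumptions, not copied
// from edit.ts.
const alpacaEditPrompt = `### Instruction: Rewrite the code below to satisfy this request: "{{{userInput}}}"

### Input:
\`\`\`{{{language}}}
{{{codeToEdit}}}
\`\`\`

### Response:
\`\`\`{{{language}}}
`;
// Ending the template inside an open code block "starts the response for
// the LLM": its completion is just the edited code, which is easy to parse.
```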
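
For steps 3 and 4, suppose the new template is the Alpaca format sketched earlier: the TemplateType union gains a value, config_schema.json gets the matching string, and the autodetect functions learn to map model names and the new type to the templates added above. The existing union members and model-name matching below are illustrative, not the real contents of core/llm/index.ts:

```typescript
// Sketch only: existing members and matching rules are illustrative.
type TemplateType = "llama2" | "alpaca" | "none";
// The same new string must also be added to the corresponding enum/array
// in config_schema.json so configuration validation accepts it.

function autodetectTemplateType(model: string): TemplateType | undefined {
  const lower = model.toLowerCase();
  if (lower.includes("alpaca")) {
    // New branch: route matching model names to the new template type.
    return "alpaca";
  }
  if (lower.includes("llama")) {
    return "llama2";
  }
  return undefined;
}
// autodetectTemplateFunction and autodetectPromptTemplates get an analogous
// case that returns the chat template from chat.ts and the edit template
// from edit.ts for the new TemplateType value.
```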