Portkey-AI / gateway

A Blazing Fast AI Gateway with integrated Guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
https://portkey.ai/features/ai-gateway

[Feature] Create tokenizer for applying chat template for mistral, llama etc #625

Status: Open · narengogi opened this issue 1 week ago

narengogi commented 1 week ago

What Would You Like to See with the Gateway?

Open-source models like Llama and Mistral expect instruction and completion inputs in a chat-template format for optimal text completions. For example, Llama 3.1 uses a format like:

    <|begin_of_text|>\n<|start_header_id|>user<|end_header_id|>\nCountry: United States\nCapital: <|eot_id|><|start_header_id|>assistant<|end_header_id|>\nWashington DC<|eot_id|><|start_header_id|>user<|end_header_id|>\nWhat is up my good friend?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n

Use the following for reference:

https://www.llama.com/docs/model-cards-and-prompt-formats/llama3_1
https://medium.com/@marketing_novita.ai/how-to-use-mistral-chat-template-e0b2a973f031
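
For comparison, the same conversation rendered in the Mistral instruct format would look roughly like this (based on the Mistral chat template described in the second link above):

    <s>[INST] Country: United States\nCapital: [/INST] Washington DC</s>[INST] What is up my good friend? [/INST]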

Portkey's current implementation does not apply these templates; instead, it simply prefixes each message with its role name:

    transform: (params: Params) => {
      let prompt: string = '';
      if (!!params.messages) {
        let messages: Message[] = params.messages;
        messages.forEach((msg, index) => {
          // Prefixes each message with a plain-text role label rather than
          // the model's special chat-template tokens
          if (index === 0 && msg.role === 'system') {
            prompt += `system: ${msg.content}\n`;
          } else if (msg.role === 'user') {
            prompt += `user: ${msg.content}\n`;
          } else if (msg.role === 'assistant') {
            prompt += `assistant: ${msg.content}\n`;
          } else {
            prompt += `${msg.role}: ${msg.content}\n`;
          }
        });
        prompt += 'Assistant:';
      }
      return prompt;
    },
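
A template-aware transform could instead build the model's native prompt format. Below is a minimal sketch for the Llama 3.1 template; this is not the gateway's actual implementation, and it assumes the same `Params` and `Message` types as the snippet above with string-only `content`. A real fix would presumably select the template per model, as the issue title suggests:

    // Minimal sketch: render OpenAI-style messages into the Llama 3.1 chat template.
    // Assumes the Params and Message types from the snippet above.
    const toLlama3Prompt = (params: Params): string => {
      const messages: Message[] = params.messages ?? [];
      let prompt = '<|begin_of_text|>';
      for (const msg of messages) {
        prompt +=
          `<|start_header_id|>${msg.role}<|end_header_id|>\n\n` +
          `${msg.content}<|eot_id|>`;
      }
      // Leave the assistant header open so the model generates the next turn
      prompt += '<|start_header_id|>assistant<|end_header_id|>\n\n';
      return prompt;
    };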


keshavkrishna commented 5 days ago

@narengogi can I work on this?

narengogi commented 4 days ago

I'm already working on this, @keshavkrishna.