vercel / modelfusion

The TypeScript library for building AI applications.
https://modelfusion.dev
MIT License
1.01k stars 76 forks

Ollama Mistral .withPromptTemplate Error #285

Closed joshon closed 4 months ago

joshon commented 4 months ago

Using 0.136.0 or 0.128.0

I use Ollama with Mistral like this in route.js:

const model = ollama
  .CompletionTextGenerator({
    model: "mistral",
    maxGenerationTokens: 1000,
    temperature: 0.1,
  })
  .withPromptTemplate(instruction());

Where instruction() looks like this:

const BEGIN_SEGMENT = "<s>";
const END_SEGMENT = "</s>";
const BEGIN_INSTRUCTION = "[INST] ";
const END_INSTRUCTION = " [/INST] ";

// Function for formatting an instruction prompt
export function instruction() {
  return {
    stopSequences: [END_SEGMENT],
    format(prompt) {
      const instructionStr = prompt.messages[prompt.messages.length - 1].content;
      if (prompt.system != null) {
        let final_inst_prompt = ``;
        if (prompt.messages[prompt.messages.length - 1].role == "summary") {
          final_inst_prompt = instructionStr;
        } else {
          final_inst_prompt = `${BEGIN_SEGMENT}${BEGIN_INSTRUCTION}${prompt.system}${END_INSTRUCTION}${END_SEGMENT}${BEGIN_INSTRUCTION}${instructionStr}${END_INSTRUCTION}${prompt.responsePrefix ?? ""}`;
        }
        return final_inst_prompt;
      }
    },
  };
}

It sends: <s>[INST] You are a chat bot. [/INST] </s>[INST] hello [/INST]

It connects to Mistral, but every streamed chunk causes a parseJSON error; I think it's because the value true for the key done is not in quotes:

    at validateTypes (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:917:19)
    at parseJSON (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:963:16)
    ... 2 lines matching cause stack trace ...
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  cause: ZodError: [
    {
      "code": "invalid_type",
      "expected": "number",
      "received": "undefined",
      "path": [
        "total_duration"
      ],
      "message": "Required"
    },
    {
      "code": "invalid_type",
      "expected": "number",
      "received": "undefined",
      "path": [
        "eval_count"
      ],
      "message": "Required"
    },
    {
      "code": "invalid_type",
      "expected": "number",
      "received": "undefined",
      "path": [
        "eval_duration"
      ],
      "message": "Required"
    }
  ]
      at get error [as error] (webpack-internal:///(rsc)/./node_modules/zod/lib/index.mjs:616:31)
      at ZodSchema.validate (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:903:27)
      at validateTypes (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:915:41)
      at parseJSON (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:963:16)
      at processLine (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:4136:18)
      at eval (webpack-internal:///(rsc)/./node_modules/modelfusion/index.js:4159:17)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
    issues: [ [Object], [Object], [Object] ],
    addIssue: [Function (anonymous)],
    addIssues: [Function (anonymous)],
    errors: [ [Object], [Object], [Object] ]
  },
  value: {
    model: 'mistral',
    created_at: '2024-02-14T01:06:34.61041Z',
    response: '',
    done: true
  }
}
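To make the ZodError above concrete: the final streamed chunk (the `value` in the error) simply lacks the numeric stats fields the schema marks as required. A minimal sketch of that check, using the chunk copied from the error output:

```typescript
// Final streamed chunk, copied from the `value` in the error above:
const finalChunk: Record<string, unknown> = {
  model: "mistral",
  created_at: "2024-02-14T01:06:34.61041Z",
  response: "",
  done: true,
};

// Fields the schema requires on a done-chunk, per the ZodError issues:
const requiredStats = ["total_duration", "eval_count", "eval_duration"];

// All three are absent from the chunk, matching the three "Required" issues.
const missing = requiredStats.filter((key) => !(key in finalChunk));
```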

Any ideas what might be causing this?

joshon commented 4 months ago

We were returning a string when an object was expected. callAPI needed the prompt wrapped in an object like this: { prompt: instructionStr }
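A sketch of the corrected template based on that fix: format now always returns an object with a prompt key instead of a bare string (or undefined). The prompt shape and the fallthrough branch for a missing system message are assumptions based on the original code:

```typescript
const BEGIN_SEGMENT = "<s>";
const END_SEGMENT = "</s>";
const BEGIN_INSTRUCTION = "[INST] ";
const END_INSTRUCTION = " [/INST] ";

// Assumed prompt shape, inferred from how the original code reads its input.
interface ChatLikePrompt {
  system?: string;
  responsePrefix?: string;
  messages: { role: string; content: string }[];
}

function instruction() {
  return {
    stopSequences: [END_SEGMENT],
    format(prompt: ChatLikePrompt) {
      const last = prompt.messages[prompt.messages.length - 1];

      // "summary" messages are already fully formatted; also fall back to the
      // raw text when there is no system message (assumption: the original
      // returned undefined here, which withPromptTemplate rejects).
      if (last.role === "summary" || prompt.system == null) {
        return { prompt: last.content };
      }

      return {
        prompt:
          `${BEGIN_SEGMENT}${BEGIN_INSTRUCTION}${prompt.system}${END_INSTRUCTION}${END_SEGMENT}` +
          `${BEGIN_INSTRUCTION}${last.content}${END_INSTRUCTION}${prompt.responsePrefix ?? ""}`,
      };
    },
  };
}
```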

jakedetels commented 4 months ago

Hi @joshon. It looks like your instruction function is not assignable to withPromptTemplate. The object you pass in must have a format method that returns an OllamaCompletionPrompt shaped like this:

export interface OllamaCompletionPrompt {
  /**
   * Text prompt.
   */
  prompt: string;

  /**
   * Images. Supports base64-encoded `png` and `jpeg` images up to 100MB in size.
   */
  images?: Array<string>;
}

So just make sure your format function always returns something (never undefined), and that it returns an object in the shape { prompt: 'my prompt text' }.
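A minimal sketch of the expected template shape (the interface name and generics here are assumptions sketched from this thread, not copied from the library; check ModelFusion's typings for the real ones):

```typescript
// Sketch of the template interface withPromptTemplate appears to expect.
interface TextGenerationPromptTemplate<SOURCE, TARGET> {
  stopSequences: string[];
  format(prompt: SOURCE): TARGET;
}

// Target shape, copied from the interface above.
interface OllamaCompletionPrompt {
  prompt: string;
  images?: string[];
}

// Assumed source shape for illustration.
type ChatPrompt = {
  system?: string;
  messages: { role: string; content: string }[];
};

// With these types in place, returning a bare string (or undefined) from
// format() becomes a compile-time error rather than a runtime parse failure.
const template: TextGenerationPromptTemplate<ChatPrompt, OllamaCompletionPrompt> = {
  stopSequences: ["</s>"],
  format: (p) => ({ prompt: p.messages[p.messages.length - 1].content }),
};
```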

By the way, if you consider switching from JavaScript to TypeScript, you'll get immediate feedback from the TS compiler when you're using types incorrectly (a big time saver in development). With your existing code, the TS compiler shows the error below in VS Code:

(screenshot: TypeScript compiler error in VS Code)

lgrammel commented 4 months ago

@joshon you can use the built-in Mistral prompt template; see e.g. https://github.com/lgrammel/modelfusion/blob/main/examples/basic/src/model-provider/ollama/ollama-completion-generate-text-example.ts