Closed joshon closed 4 months ago
We were returning a string when an object was expected. `callAPI` needed the prompt in an object like this: `{ prompt: "instructionStr" }`
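In code terms, the fix looks roughly like this (a minimal sketch; the `format` method name comes from the maintainer's reply below, and `instruction` is a stand-in for the actual argument):

```typescript
// Before (wrong): format returned a bare string, but callAPI expects an object.
const broken = {
  format: (instruction: string): string => instruction, // returns string, not object
};

// After (fixed): wrap the text in the object shape callAPI expects.
const fixed = {
  format: (instruction: string): { prompt: string } => ({ prompt: instruction }),
};
```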
Hi @joshon. It looks like your `instruction` function is not assignable to `withPromptTemplate`. The function you pass in must have a `format` method that returns an `OllamaCompletionPrompt` shaped like this:
```ts
export interface OllamaCompletionPrompt {
  /**
   * Text prompt.
   */
  prompt: string;

  /**
   * Images. Supports base64-encoded `png` and `jpeg` images up to 100MB in size.
   */
  images?: Array<string>;
}
```
So just make sure your `format` function always returns something (never `undefined`), and that what it returns has the shape `{ prompt: 'my prompt text' }`.
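A template satisfying those two requirements could be sketched like this (hypothetical names; the `[INST]` wrapping mirrors the Mistral prompt shown further down in this thread):

```typescript
interface OllamaCompletionPrompt {
  prompt: string;
  images?: Array<string>;
}

// Hypothetical prompt template: format always returns an object with a
// `prompt` string, never undefined and never a bare string.
const mistralTemplate = {
  format(instruction: { system?: string; text: string }): OllamaCompletionPrompt {
    const system = instruction.system ?? "";
    return {
      prompt: `<s>[INST] ${system} [/INST] </s>[INST] ${instruction.text} [/INST]`,
    };
  },
};
```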
By the way, if you consider switching from JavaScript to TypeScript, you'll get immediate feedback from the TS compiler when you use types incorrectly (a big time saver in development). With your existing code, the TS compiler shows the error below in VS Code:
@joshon you can use the built-in mistral prompt template, see e.g. https://github.com/lgrammel/modelfusion/blob/main/examples/basic/src/model-provider/ollama/ollama-completion-generate-text-example.ts
Using ModelFusion 0.136.0 or 0.128.0.
I use Ollama with Mistral like this in route.js:
Where `instruction()` looks like this:
It sends:
<s>[INST] You are a chat bot. [/INST] </s>[INST] hello [/INST]
It connects to Mistral, but every streamed chunk causes a `parseJSON` error, I think because the value `true` for the key `done` is not in quotes:
Any ideas what might be causing this?