ColtonMcInroy opened this issue 5 months ago
With Ollama, you need to specify how to map the model's text output to tool calls. There are some examples here: https://github.com/lgrammel/modelfusion/tree/main/examples/basic/src/model-provider/ollama
E.g.:
```ts
import { jsonToolCallPrompt, ollama, useTool } from "modelfusion";
import { MathJsTool } from "@modelfusion/mathjs-tool";

const { tool, args, toolCall, result } = await useTool(
  ollama
    .CompletionTextGenerator({
      model: "mistral",
      promptTemplate: ollama.prompt.Mistral,
      raw: true, // required when using a custom prompt template
      format: "json",
      temperature: 0,
      stopSequences: ["\n\n"], // prevent infinite generation
    })
    .withInstructionPrompt()
    // map the plain text output to a tool call:
    .asToolCallGenerationModel(jsonToolCallPrompt.text()),
  new MathJsTool({ name: "calculator" }),
  "What's fourteen times twelve?"
);

console.log(`Tool call`, toolCall);
console.log(`Tool: ${tool}`);
console.log(`Arguments: ${JSON.stringify(args)}`);
console.log(`Result: ${result}`);
```
That being said, chat with tools is not supported yet with Ollama, and the interface for creating tool models might change.
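For context, `jsonToolCallPrompt.text()` is what supplies that text-to-tool mapping: it builds an instruction prompt describing the tool and parses the model's JSON answer back into a tool call. Below is a minimal sketch of what such a template might look like, assuming the current two-method shape (`createPrompt` / `extractToolCall`); the names and types are illustrative and may not survive the interface changes mentioned above:

```ts
import { nanoid } from "nanoid";
import type { InstructionPrompt, ToolDefinition } from "modelfusion";

// Illustrative tool-call prompt template (an assumption based on the
// current ToolCallPromptTemplate shape; the interface may change):
// createPrompt tells the model to answer with JSON arguments for the tool,
// and extractToolCall parses that JSON back into a tool call.
const myJsonToolCallPrompt = {
  createPrompt(
    prompt: string,
    tool: ToolDefinition<string, unknown>
  ): InstructionPrompt {
    return {
      system:
        `You are calling the function "${tool.name}". ` +
        `Answer with a JSON object of the function arguments matching this schema: ` +
        JSON.stringify(tool.parameters.getJsonSchema()),
      instruction: prompt,
    };
  },
  extractToolCall(response: string) {
    // The id only needs to be unique per call.
    return { id: nanoid(), args: JSON.parse(response) };
  },
};
```

A custom template like this could then be passed to `asToolCallGenerationModel` in place of `jsonToolCallPrompt.text()`.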
Changing the example `MiddleSchoolMathAgent.ts` from `openai` to `ollama` results in the following error: `TypeError: model.doGenerateToolCalls is not a function`
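That error is consistent with the limitation above: `MiddleSchoolMathAgent.ts` drives chat with multiple tools, which calls `model.doGenerateToolCalls`, and the Ollama models do not implement that method yet. The wrapper produced by `asToolCallGenerationModel` only covers the single-tool path. A quick illustrative check (the singular method name is an assumption based on the single-tool API; the plural one comes straight from the error):

```ts
import { jsonToolCallPrompt, ollama } from "modelfusion";

const model = ollama
  .CompletionTextGenerator({
    model: "mistral",
    promptTemplate: ollama.prompt.Mistral,
    raw: true,
  })
  .withInstructionPrompt()
  .asToolCallGenerationModel(jsonToolCallPrompt.text());

// The single-tool method exists, the multi-tool one does not,
// which is why the agent's chat-with-tools loop throws.
console.log(typeof (model as any).doGenerateToolCall);  // "function"
console.log(typeof (model as any).doGenerateToolCalls); // "undefined"
```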