Would it be possible to add the `maxTokens?: number;` parameter to the Ollama class as well? Otherwise Ollama doesn't work with ChatHistory.js, because it fails this check:
if (!this.llm.metadata.maxTokens) {
throw new Error("LLM maxTokens is not set. Needed so the summarizer ensures the context window size of the LLM.");
}
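To make the failure mode concrete, here is a self-contained sketch of the guard's behavior. The `LLMMetadata` interface and `assertMaxTokens` helper below are hypothetical, written only to mirror the shapes involved, not the actual LlamaIndex.TS source:

```typescript
// Hypothetical minimal shape of the metadata an LLM class exposes.
interface LLMMetadata {
  contextWindow: number;
  maxTokens?: number; // the optional parameter the Ollama class is missing
}

// Stand-in for the ChatHistory.js check quoted above: it rejects any LLM
// whose metadata has no maxTokens, since the summarizer needs that limit
// to stay within the model's context window.
function assertMaxTokens(metadata: LLMMetadata): number {
  if (!metadata.maxTokens) {
    throw new Error(
      "LLM maxTokens is not set. Needed so the summarizer ensures the context window size of the LLM."
    );
  }
  return metadata.maxTokens;
}

// Without maxTokens the guard throws; once the field is set, it passes.
const withoutLimit: LLMMetadata = { contextWindow: 4096 };
const withLimit: LLMMetadata = { contextWindow: 4096, maxTokens: 512 };

try {
  assertMaxTokens(withoutLimit);
} catch (e) {
  console.log("throws:", (e as Error).message);
}
console.log("passes with maxTokens =", assertMaxTokens(withLimit));
```

So a constructor option that simply threads `maxTokens` through into the Ollama class's metadata would be enough to satisfy this check.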
Best regards,
Sebastian