run-llama / LlamaIndexTS

Data framework for your LLM applications, with a focus on server-side solutions.
https://ts.llamaindex.ai
MIT License

maxTokens parameter in Ollama #587

Closed · phalix closed this 8 months ago

phalix commented 8 months ago

Would it be possible to add the `maxTokens?: number` parameter to the Ollama class as well? Otherwise Ollama doesn't work with ChatHistory.js because of this check:

```ts
if (!this.llm.metadata.maxTokens) {
  throw new Error(
    "LLM maxTokens is not set. Needed so the summarizer ensures the context window size of the LLM.",
  );
}
```
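
For illustration, here is a rough sketch of the kind of change I mean, assuming the Ollama class roughly mirrors the other LLM integrations. Everything besides the `maxTokens` field itself (constructor options, defaults, the metadata shape) is a placeholder, not the actual library source:

```ts
// Illustrative sketch only: an optional maxTokens field on the Ollama class,
// surfaced through metadata so ChatHistory's summarizer check passes.
// The constructor options, defaults, and metadata shape are assumptions.
interface OllamaParams {
  model: string;
  contextWindow?: number;
  maxTokens?: number;
}

class Ollama {
  model: string;
  contextWindow: number;
  maxTokens?: number;

  constructor(params: OllamaParams) {
    this.model = params.model;
    this.contextWindow = params.contextWindow ?? 4096; // assumed default
    this.maxTokens = params.maxTokens;
  }

  get metadata() {
    return {
      model: this.model,
      contextWindow: this.contextWindow,
      // With maxTokens present, the ChatHistory check above no longer throws.
      maxTokens: this.maxTokens,
    };
  }
}
```

With something like this, `new Ollama({ model: "llama2", maxTokens: 256 }).metadata.maxTokens` would be defined and the summarizer could size the context window.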

Best regards,

Sebastian

himself65 commented 8 months ago

Thanks for the feedback