chriss1245 closed this issue 1 month ago
I'm looking into it. I'll open a PR if I come to a nice solution.
There are actually more options missing.
If you compare what's available with what is implemented, you can see that there are way more options. I have to be honest though, I don't know what half of the options even do. So I'll focus on base_url and seed for now. The URL part I have working already.
Sounds good. To be honest, neither do I. I mentioned seed and base_url because I use those. Thanks a lot!
@noggynoggy I also use base_url, and would appreciate it if you could create a PR and get it merged.
The base_url functionality was added by #24719. Maybe I'll add the other parameters later in a PR.
Can you please add the seed parameter as well? It's also kind of an essential one.
@chriss1245 you can close this issue. Both have been implemented.
The latest release does not include the changes yet, so in the meantime:
wget https://raw.githubusercontent.com/langchain-ai/langchain/master/libs/partners/ollama/langchain_ollama/chat_models.py -O .venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py
Make sure the path matches your environment (venv name, Python version).
Sure, thanks a lot!
Thanks for doing this, I've been so confused!
Checked other resources
Example Code
When running:
The parameters base_url and seed get ignored. Reviewing the code of this class, I see that its definition is missing these attributes.
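The original snippet isn't shown above, but the failure mode can be sketched with a plain-Python stand-in (ChatModelSketch and its _fields set are hypothetical illustrations, not actual langchain code): a model class that only keeps its declared fields will silently drop any constructor argument it doesn't know about, which is what happened to base_url and seed.

```python
# Hypothetical sketch of the bug, NOT langchain code: a model class whose
# __init__ keeps only declared fields silently drops everything else,
# mimicking how ChatOllama ignored base_url and seed before they were
# added as class attributes.
class ChatModelSketch:
    _fields = {"model", "temperature"}  # base_url and seed are missing here

    def __init__(self, **kwargs):
        for name, value in kwargs.items():
            if name in self._fields:
                setattr(self, name, value)
            # unknown kwargs such as base_url/seed are silently discarded

llm = ChatModelSketch(model="llama3", base_url="http://myserver:11434", seed=42)
print(hasattr(llm, "base_url"))  # False: the option was dropped without error
```

The fix in the real class is simply declaring the missing attributes so the model accepts and forwards them instead of discarding them.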
Error Message and Stack Trace (if applicable)
No response
Description
Regarding seed: in PR 249 in ollama, this feature was added to allow reproducibility of experiments. Regarding base_url: since Ollama allows us to host LLMs on our own servers, we need to be able to specify the URL of the server.
Also, OllamaFunctions from the langchain_experimental package does support this.
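As a toy illustration of why a seed parameter matters for reproducibility (pure stdlib, no Ollama involved; sample_token is a made-up stand-in for the model's sampling step):

```python
import random

def sample_token(seed=None):
    # Stand-in for an LLM sampling step: with a fixed seed the RNG,
    # and hence the "sampled token", is deterministic across runs.
    rng = random.Random(seed)
    return rng.randint(0, 50000)

a = sample_token(seed=42)
b = sample_token(seed=42)
print(a == b)  # True: same seed, same sample
```

Without a way to pass seed through ChatOllama, runs cannot be made reproducible like this.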
System Info
langchain==0.2.11
langchain-chroma==0.1.2
langchain-community==0.2.10
langchain-core==0.2.23
langchain-experimental==0.0.63
langchain-groq==0.1.6
langchain-ollama==0.1.0
langchain-text-splitters==0.2.2