Closed ujongnoh closed 2 months ago
Thanks @ujongnoh for opening the issue.
Looks like the base_url is set in the langchain package here: https://github.com/langchain-ai/langchain/blob/0dec72cab08cad712a6916368dc37c70faefb7a0/libs/community/langchain_community/llms/ollama.py#L32
Maybe there could be a TextField for the Ollama provider to allow providing another URL, similar to the OpenAI provider?
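As a rough illustration of that TextField idea, here is a minimal sketch of a provider config whose Ollama base URL can be overridden (via an environment variable) instead of being hard-coded. The class name `OllamaProviderConfig` and the `OLLAMA_BASE_URL` variable are assumptions for illustration, not Jupyter AI's actual API:

```python
from dataclasses import dataclass, field
import os

# Hypothetical sketch: a provider config whose base URL falls back to the
# usual local default when OLLAMA_BASE_URL is unset. Names are illustrative.
@dataclass
class OllamaProviderConfig:
    base_url: str = field(
        default_factory=lambda: os.environ.get(
            "OLLAMA_BASE_URL", "http://localhost:11434"
        )
    )

config = OllamaProviderConfig()
print(config.base_url)  # the override if set, otherwise the local default
```

A UI TextField would play the same role as the environment variable here: it supplies a per-deployment value without touching library source.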
Wow!! Exactly what I wanted.. Thanks!!
Hello.. I apologize for reopening this issue..
I'm trying an API test using jupyter_ai version 2.19.1, but this error occurred:
```
File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1257, in _create_direct_connection
    raise last_exc
File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1226, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1033, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]
```
I think the reason for this problem is that the base_url of the Ollama object in the langchain_community library is fixed to "http://localhost:11434", so if you changed the source code, wouldn't it look like this?
```python
class _OllamaCommon(BaseLanguageModel):
    # before:
    # base_url: str = "http://localhost:11434"
    # proposed:
    base_url: Optional[List[str]] = None
    """Base url the model is hosted under."""
```
But I don't have permission to modify this base_url code, do I? Thank you!!
@ujongnoh I also had the same issue and solved it by referring to the link below: https://github.com/langchain-ai/langchain/issues/24703. Installing langchain-ollama==0.1.0 fixed it for me. Good luck!!
@dlqqq we are still seeing the same issue: https://github.com/jupyterlab/jupyter-ai/issues/1004
Hello! Thank you to all the developers who work on the Jupyter AI service. Our team wants to connect a JupyterLab pod (deployed by KubeSpawner in Kubernetes) to an Ollama pod in the same namespace, but the Ollama provider's base_url is fixed to "localhost:11434" in the source code. Is there a way to make the base_url configurable without modifying the code? If there are no other plans for this, would you consider this request?
Thank you!!!
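For the pod-to-pod case described above, the base URL generally needs to target the Ollama Service's in-cluster DNS name rather than localhost. A small sketch of building such a URL, where the Service name "ollama" and the namespace fallback are assumptions to be replaced with your own values:

```python
import os

# Inside the cluster, reach the Ollama pod through its Service DNS name rather
# than localhost. "ollama" and the namespace default are illustrative;
# substitute your own Service name and namespace.
service = "ollama"
namespace = os.environ.get("POD_NAMESPACE", "default")
base_url = f"http://{service}.{namespace}.svc.cluster.local:11434"
print(base_url)
```

Any mechanism that lets you supply this URL at configuration time (a settings field, an environment variable) would make the provider usable in this setup without source changes.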