jupyterlab / jupyter-ai

A generative AI extension for JupyterLab
https://jupyter-ai.readthedocs.io/
BSD 3-Clause "New" or "Revised" License

Why is the Ollama provider base_url fixed to "localhost:11434"? #902

Closed ujongnoh closed 2 months ago

ujongnoh commented 2 months ago

Hello! Thank you to all the developers working on the Jupyter AI service. Our team wants to connect a JupyterLab pod (deployed by KubeSpawner in Kubernetes) to an Ollama pod in the same namespace, but the Ollama provider's base_url is fixed to "localhost:11434" in the source code. Is there a way to set the base_url without modifying the code? If not, do you have plans to support this?

Thank you!!!

jtpio commented 2 months ago

Thanks @ujongnoh for opening the issue.

Looks like the base_url is set in the langchain package here: https://github.com/langchain-ai/langchain/blob/0dec72cab08cad712a6916368dc37c70faefb7a0/libs/community/langchain_community/llms/ollama.py#L32

Maybe there could be a TextField for the Ollama provider to allow providing another URL, similar to the OpenAI provider?

https://github.com/jupyterlab/jupyter-ai/blob/12d069e19f9296401e15ca4aa7117f5062f2b62e/packages/jupyter-ai-magics/jupyter_ai_magics/partner_providers/openai.py#L55-L57
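For reference, a change along those lines might look like the sketch below. It is only illustrative: it assumes the provider fields API in jupyter_ai_magics (the `TextField` import path, model list, and overall class shape here are assumptions, not the actual implementation):

```python
# Hypothetical sketch: an Ollama provider exposing base_url as a settings
# field, mirroring the OpenAI provider's openai_api_base TextField.
from langchain_community.llms import Ollama

from jupyter_ai_magics.providers import BaseProvider, TextField


class OllamaProvider(BaseProvider, Ollama):
    id = "ollama"
    name = "Ollama"
    models = ["llama2", "mistral"]  # illustrative model ids
    model_id_key = "model"
    # Rendered as a text input in the settings UI; the value would be passed
    # through to the underlying langchain Ollama client as base_url.
    fields = [
        TextField(key="base_url", label="Base API URL (optional)", format="text"),
    ]
```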

ujongnoh commented 2 months ago

Wow!! Exactly what I wanted. Thanks!!

ujongnoh commented 1 month ago

Hello, I apologize for reopening this issue. I'm trying to run an API test with jupyter_ai version 2.19.1, but this error occurred:

```
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1257, in _create_direct_connection
    raise last_exc
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1226, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1033, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]
```

I think the cause of this problem is that the base_url of the Ollama object in the langchain_community library is fixed to "localhost:11434". If the source code were changed, wouldn't it look like this?

```python
class _OllamaCommon(BaseLanguageModel):
    # before:
    base_url: str = "http://localhost:11434"
    # after:
    base_url: Optional[List[str]] = None
    """Base url the model is hosted under."""
```

But we don't have permission to modify this base_url code ourselves, do we? Thank you!!
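For what it's worth, the langchain_community Ollama class already accepts base_url as a constructor argument, so the default can be overridden when constructing the object directly (outside jupyter-ai). A minimal sketch, where the in-cluster service DNS name is a made-up example:

```python
from langchain_community.llms import Ollama

# base_url defaults to "http://localhost:11434" but can be overridden per
# instance; "ollama.my-namespace.svc.cluster.local" is a hypothetical
# Kubernetes service address for the Ollama pod.
llm = Ollama(
    model="llama2",
    base_url="http://ollama.my-namespace.svc.cluster.local:11434",
)
print(llm.invoke("Hello from JupyterLab"))
```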

taehee commented 1 month ago

@ujongnoh I had the same issue and solved it by referring to the link below: https://github.com/langchain-ai/langchain/issues/24703. Installing langchain-ollama==0.1.0 fixed it for me. Good luck!!
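In case it helps others, usage with the langchain-ollama package looks roughly like this sketch (the model name and the "ollama" service hostname are placeholders):

```python
from langchain_ollama import OllamaLLM

# base_url points at the Ollama pod's service instead of localhost;
# "ollama" here is a placeholder Kubernetes service name.
llm = OllamaLLM(
    model="llama3",
    base_url="http://ollama:11434",
)
print(llm.invoke("ping"))
```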

sqlreport commented 1 day ago

@dlqqq we are still seeing the same issue: https://github.com/jupyterlab/jupyter-ai/issues/1004