aiplanethub / beyondllm

Build, evaluate and observe LLM apps
https://beyondllm.aiplanet.com/
Apache License 2.0
264 stars · 40 forks

Unable to change the default Ollama server URL under LLM #56

Closed · shikhardadhich closed this issue 4 months ago

shikhardadhich commented 5 months ago

**Describe the bug**
My Ollama server is running on a different machine, and I am unable to provide its base URL in the current code, since the URL is hard-coded to localhost:11434.

**To Reproduce**
Steps to reproduce the behavior: in https://github.com/aiplanethub/beyondllm/blob/main/src/beyondllm/llms/ollama.py, a parameter needs to be added to accept the base URL. Currently the client is created with a hard-coded host:

```python
import ollama
self.client = ollama.Client(host='http://localhost:11434')
```
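For illustration, here is a minimal sketch of what the reporter is asking for: exposing the host as a constructor parameter, with the current localhost value kept as the default. The class and attribute names here (`OllamaModel`, `base_url`, `predict`) are assumptions for this sketch, not necessarily beyondllm's actual API; only `ollama.Client(host=...)` and `client.chat(...)` are the real `ollama` library calls.

```python
# Hypothetical sketch: make the Ollama host configurable instead of
# hard-coded. Names are illustrative, not beyondllm's actual API.
from dataclasses import dataclass

import ollama


@dataclass
class OllamaModel:
    model: str = "llama2"
    # Default preserves the old localhost behavior for existing users.
    base_url: str = "http://localhost:11434"

    def __post_init__(self):
        # Pass the user-supplied URL through to the ollama client.
        self.client = ollama.Client(host=self.base_url)

    def predict(self, prompt: str) -> str:
        # Forward the prompt to the configured Ollama server.
        response = self.client.chat(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response["message"]["content"]
```

With the localhost default, existing local setups keep working unchanged, while users with a remote server can pass, e.g., `OllamaModel(model="llama2", base_url="http://my-gpu-box:11434")`.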

tarun-aiplanet commented 5 months ago

Understood; in the initial version we only added support for localhost. Would you like to work on this issue by adding `client_url` as a user input? @shikhardadhich

shikhardadhich commented 5 months ago

Sure, I will add it. @tarun-aiplanet

tarun-aiplanet commented 5 months ago

Thank you, I will assign this task to you. You can open a PR and link it to this issue.

tarun-aiplanet commented 4 months ago

Closing this issue, as it's been resolved.