jaluoma / pruju-ai

An AI teaching assistant that allows students to interact with the teacher's course materials
MIT License

remote ollama support via API #8

Open adatepitesz opened 4 months ago

adatepitesz commented 4 months ago

I would like to use Ollama as the LLM provider, but hosted by a remote cloud provider. I was unable to find where to put the URL for this. Please provide instructions on what to modify!

jaluoma commented 4 months ago

This is not supported, I'm afraid, but it should definitely be a feature. It looks like it's possible.

https://api.python.langchain.com/en/latest/chat_models/langchain_community.chat_models.ollama.ChatOllama.html

I believe

chat = ChatOllama(model=default_model, temperature=0)

should be:

chat = ChatOllama(model=default_model, temperature=0, base_url=custom_url)

with custom_url set as your model endpoint.
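A minimal sketch of what that could look like. The `ollama_base_url` helper, the host name, and the model name are all placeholders for illustration, not part of the project; `ChatOllama` does accept a `base_url` parameter (it defaults to the local `http://localhost:11434`):

```python
from urllib.parse import urlparse


def ollama_base_url(host: str, port: int = 11434, scheme: str = "http") -> str:
    """Build and sanity-check the base_url for a remote Ollama server.

    Hypothetical helper: Ollama listens on port 11434 by default, but a
    cloud provider may expose a different port or require https.
    """
    url = f"{scheme}://{host}:{port}"
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        raise ValueError(f"invalid Ollama endpoint: {url!r}")
    return url


custom_url = ollama_base_url("my-ollama-host.example.com")

# With langchain_community installed, the call in the app would then be
# (placeholder model name, swap in whatever the remote server hosts):
#
#   from langchain_community.chat_models import ChatOllama
#   chat = ChatOllama(model="llama3", temperature=0, base_url=custom_url)
```

The only change to the existing code is threading the `base_url` value through to the `ChatOllama` constructor; everything else stays the same.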