TransformerOptimus / SuperAGI

<⚡️> SuperAGI - A dev-first open source autonomous AI agent framework. Enabling developers to build, manage & run useful autonomous agents quickly and reliably.
https://superagi.com/
MIT License
15.32k stars 1.84k forks

Ollama support? #1365

Closed: BluePhi09 closed this issue 1 day ago

BluePhi09 commented 10 months ago

Is there any way to use Ollama to host the llm models?

MikeyBeez commented 9 months ago

This should be easy:

```python
from langchain.llms import Ollama

ollama = Ollama(base_url='http://localhost:11434', model="llama2")
response = ollama(prompt)
```

Cheers!

shaileshjswl commented 9 months ago

```python
# Import the Ollama class from langchain.llms
from langchain.llms import Ollama

# Create an instance of the Ollama class with specified parameters
ollama = Ollama(base_url='http://localhost:11434', model="llama2")

# Define the prompt you want to send to the Ollama model
prompt = "Your prompt goes here"

# Make a request to the Ollama model with the provided prompt
response = ollama(prompt)

# Print the response from the Ollama model
print(response)
```

sydfernandes commented 1 month ago

Where do we add this?

motoxxx138 commented 1 month ago

Same question; how do we utilize this to use ollama with superagi? Thank you.

BluePhi09 commented 1 day ago

You can now just use Ollama's OpenAI-compatible API.
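A minimal sketch of what calling that endpoint looks like, using only the Python standard library. The helper names, the `llama2` model name, and the default base URL are illustrative assumptions here, not part of SuperAGI; Ollama serves the OpenAI-compatible routes under `/v1` on its default port 11434.

```python
import json
from urllib import request

def build_chat_payload(prompt, model="llama2"):
    """Build an OpenAI-style chat-completions request body (hypothetical helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt, model="llama2", base_url="http://localhost:11434/v1"):
    """POST the prompt to Ollama's OpenAI-compatible /chat/completions endpoint."""
    req = request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_payload(prompt, model)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama does not check the API key, but OpenAI-style clients send one.
            "Authorization": "Bearer ollama",
        },
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Extract the assistant's reply from the standard response shape
    return body["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, any OpenAI client library should also work by pointing its base URL at `http://localhost:11434/v1`.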