AgentEra / Agently-Daily-News-Collector

An open-source, LLM-based automatic daily news collecting workflow showcase, powered by the Agently AI application development framework.
http://agently.tech
Apache License 2.0

Can the Ollama service replace OpenAI? #9

Closed: YetoL closed this issue 6 months ago

YetoL commented 6 months ago

I want to replace OpenAI with a local Ollama instance. Is there any plan to iterate in this direction? Supporting Ollama would lower the barrier to using this project.

Maplemx commented 6 months ago

If your Ollama model can be requested with OpenAI-style messages, then YES.
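
You can quickly verify that from Python before touching the project settings (a minimal sketch, assuming Ollama's default OpenAI-compatible endpoint at http://localhost:11434/v1 and a model you have already pulled; "llama3" and the api_key value are placeholders, since Ollama does not validate keys):

from openai import OpenAI

# Assumption: a default local Ollama install exposing the
# OpenAI-compatible API under /v1 on port 11434
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # placeholder; Ollama ignores the key
)
response = client.chat.completions.create(
    model="llama3",  # placeholder model id; use one you have pulled
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
)
print(response.choices[0].message.content)

If that request returns a normal chat completion, the same endpoint can be plugged into this project.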

You can simply switch your model request client to OAIClient (we customized some message-format processing to ensure messages follow the rules: the system message must come first, there must be only one system message, and user and assistant messages must appear in strict order...) and set your base URL to your local model server address to use it.

If you're using the Agently framework in Python code, you can switch to models that follow the OpenAI style like this:

import Agently
agent = (
    Agently.create_agent()
        # use the general OpenAI-compatible request client
        .set_settings("current_model", "OAIClient")
        .set_settings("model.OAIClient.auth", { "api_key": "<Your-API-Key>" })
        # point the client at any OpenAI-style endpoint, e.g. Mistral's API
        .set_settings("model.OAIClient.url", "https://api.mistral.ai/v1")
        .set_settings("model.OAIClient.options", { "model": "open-mistral-7b" })
)
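
For a local Ollama server specifically, the same settings pattern applies (a sketch under the assumptions above: Ollama's default endpoint at http://localhost:11434/v1, a dummy api_key, and "llama3" standing in for whatever model you have pulled):

import Agently
agent = (
    Agently.create_agent()
        .set_settings("current_model", "OAIClient")
        # dummy key; Ollama does not check it
        .set_settings("model.OAIClient.auth", { "api_key": "ollama" })
        # assumption: default local Ollama OpenAI-compatible endpoint
        .set_settings("model.OAIClient.url", "http://localhost:11434/v1")
        # placeholder model id; use a model you have pulled
        .set_settings("model.OAIClient.options", { "model": "llama3" })
)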

If you're trying to edit SETTINGS.yaml in this project:

MODEL_PROVIDER: OAIClient
MODEL_URL: <Your Local Model URL>
MODEL_AUTH:
  api_key: "<Your API Key if needed>"
MODEL_OPTIONS:
  model: <Your model id if needed>
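
Filled in for a local Ollama server, that could look like this (same assumptions as above: default endpoint, dummy key, placeholder model id):

MODEL_PROVIDER: OAIClient
MODEL_URL: http://localhost:11434/v1
MODEL_AUTH:
  api_key: "ollama"  # dummy value; Ollama does not validate keys
MODEL_OPTIONS:
  model: llama3  # any model you have pulled locally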

Also, you can read this Jupyter notebook to see how to switch to a local model served by Xinference: https://github.com/Maplemx/Agently/blob/main/playground/using_local_open_source_model_to_drive_agents.ipynb