OpenBMB / ChatDev

Create Customized Software using Natural Language Idea (through LLM-powered Multi-Agent Collaboration)
https://arxiv.org/abs/2307.07924
Apache License 2.0

Add Ollama integration instructions in the README #352

Closed hemangjoshi37a closed 3 months ago

hemangjoshi37a commented 4 months ago

Could anyone who has write access to this repo please provide documentation on how to replace the OpenAI models with Ollama models? Thanks.

XZH-HZX commented 4 months ago

Hi:) You can adjust the relevant settings in model_backend.py, including the model type, your API key, etc.
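To make the suggestion above concrete: since Ollama exposes an OpenAI-compatible API under `/v1`, the settings in model_backend.py boil down to pointing the client at a local base URL and supplying a placeholder key. A minimal sketch (the helper name `local_client_kwargs` and the `BASE_URL` default are illustrative, not part of ChatDev):

```python
import os

# Hypothetical helper: builds the keyword arguments you would pass to
# openai.OpenAI() so requests go to a local OpenAI-compatible server
# (such as Ollama) instead of api.openai.com.
def local_client_kwargs(default_base="http://localhost:11434/v1"):
    return {
        # Ollama ignores the key, but the OpenAI client library requires one.
        "api_key": os.environ.get("OPENAI_API_KEY", "ollama"),
        # Ollama serves its OpenAI-compatible API under the /v1 path.
        "base_url": os.environ.get("BASE_URL", default_base),
    }

print(local_client_kwargs()["base_url"])
```

These kwargs would then be used wherever model_backend.py constructs its client, e.g. `client = openai.OpenAI(**local_client_kwargs())`.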

thinh9e commented 3 months ago

If you want to use Ollama in a local setup, you can follow these steps:

  1. Set environment variables:
    export OPENAI_API_KEY=ollama  # any value
    export BASE_URL=http://localhost:11434/v1  # your Ollama API server
  2. Replace the model parameter with your Ollama model name: https://github.com/OpenBMB/ChatDev/blob/bbb145048ef1a1be727cd4f176b8bafe1a5f15db/camel/model_backend.py#L100 Example:
    response = client.chat.completions.create(*args, **kwargs, model="gemma:2b-instruct",
                                              **self.model_config_dict)
  3. Run:
    python3 run.py --task "[description_of_your_idea]" --name "[project_name]"
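One refinement of step 2, so the model name isn't hardcoded: read it from an environment variable and fall back to the name used above. This is a sketch, not part of ChatDev; the `OLLAMA_MODEL` variable and `resolve_model` helper are assumptions introduced here for illustration:

```python
import os

# Hypothetical helper: pick the Ollama model from the environment,
# falling back to the model name hardcoded in step 2.
def resolve_model(default="gemma:2b-instruct"):
    return os.environ.get("OLLAMA_MODEL", default)

# In camel/model_backend.py, the call from step 2 would then become:
# response = client.chat.completions.create(
#     *args, **kwargs, model=resolve_model(), **self.model_config_dict)

print(resolve_model())
```

With this in place, switching models is just `export OLLAMA_MODEL=llama2:7b` before running step 3, with no further code edits.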
hemangjoshi37a commented 3 months ago

@thinh9e OK, thanks. But this should be added to the README file so that it is accessible to everyone.