VRSEN / agency-swarm-lab


Open Source Swarm: Ollama running in separate Docker Container #24

Open · kartguru opened this issue 3 months ago

kartguru commented 3 months ago

Despite being new to Docker, I believe others will, like me, already have Ollama running in a separate Docker container.

Could you please provide instructions on how to have the agency-swarm container interact with an Ollama instance running in a separate container?

Separate 'Open Source Swarm' instructions may also be needed, since `docker run -it -v ./:/app --rm -p 7860:7860 -e OPENAI_API_KEY=<YourOpenAIKey> vrsen/agency-swarm` seems somewhat redundant if you aren't using an OpenAI API key.
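For context, something like the sketch below is what I'm imagining. The network and container names are just examples of mine, and I'm assuming the agency-swarm image respects the standard OpenAI SDK environment variables (`OPENAI_BASE_URL` in particular), which may not be the case. Ollama does expose an OpenAI-style API under `/v1` on port 11434.

```bash
# Shared user-defined bridge network so the containers can resolve
# each other by name (names here are illustrative).
docker network create swarm-net

# Ollama in its own container, joined to the shared network.
docker run -d --name ollama --network swarm-net -p 11434:11434 ollama/ollama

# Agency Swarm container on the same network, pointed at Ollama's
# OpenAI-compatible endpoint. Whether the image actually reads
# OPENAI_BASE_URL is an assumption on my part.
docker run -it --rm -v ./:/app -p 7860:7860 \
  --network swarm-net \
  -e OPENAI_API_KEY=ollama \
  -e OPENAI_BASE_URL=http://ollama:11434/v1 \
  vrsen/agency-swarm
```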

Thanks, and thanks for your enthusiasm ;)

kartguru commented 3 months ago

My efforts to run and interact using Docker networks haven't been fruitful. However, while trying to run LiteLLM, I noticed that my separate Open WebUI container appears to offer a way to manage this under Settings > Models > Manage LiteLLM Models. Can the Open WebUI container be used instead?
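In case it helps with debugging, this is the kind of connectivity check I've been running before involving LiteLLM or Open WebUI at all (network and container names carried over from my earlier sketch; `/api/tags` is Ollama's native model-listing endpoint):

```bash
# Spin up a throwaway container on the same network and ask Ollama
# for its model list; if this fails, the problem is the Docker
# networking rather than agency-swarm or LiteLLM.
docker run --rm --network swarm-net busybox \
  wget -qO- http://ollama:11434/api/tags
```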