microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

Magentic-One local LLM (Ollama) support? #4135

mehulgupta2016154 opened this issue 4 days ago (status: Open)

mehulgupta2016154 commented 4 days ago

What feature would you like to be added?

How can Magentic-One be used with local LLMs, e.g. via Ollama?

Why is this needed?

This would enable users to run Magentic-One with open-source LLMs instead of only the OpenAI API.

009topersky commented 3 days ago

This video seems to demonstrate using Ollama with Magentic-One:

https://www.youtube.com/watch?v=-WqHY3uE_K0

OminousIndustries commented 3 days ago

That is my video posted above :) I have a semi-functional fork of this repo that works with Ollama and was tested with llama-3.2-11b-vision. Here is a link to the repo: https://github.com/OminousIndustries/autogen-llama3.2

The install steps should be the same as the regular Magentic-One install. You can ignore the "Environment Configuration for Chat Completion Client" section, since the model info is hard-coded into utils.py in my repo (a current limitation, as it ties the fork to llama-3.2-11b-vision), but since I was using that model for testing, it worked for my purposes!
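
For reference, the general shape of such a configuration is to point an OpenAI-compatible chat completion client at Ollama's local endpoint and declare the model's capabilities explicitly. The sketch below is an untested assumption, not the fork's actual utils.py: it assumes Ollama's OpenAI-compatible endpoint at `http://localhost:11434/v1` and the `OpenAIChatCompletionClient` from `autogen_ext`; exact import paths and parameter names vary between AutoGen preview releases.

```python
# Hedged sketch only -- import path and parameter names differ across AutoGen
# preview releases (older previews use `model_capabilities` instead of
# `model_info`); adjust to match the version you have installed.
from autogen_ext.models.openai import OpenAIChatCompletionClient

ollama_client = OpenAIChatCompletionClient(
    model="llama3.2-vision",               # whatever tag you pulled with `ollama pull`
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; Ollama does not check the key
    model_info={
        "vision": True,             # llama-3.2-11b-vision accepts images
        "function_calling": False,  # declare only what the local model actually supports
        "json_output": False,
        "family": "unknown",
    },
)
```

A client built this way could then be passed to the Magentic-One agents in place of the environment-based configuration the README describes; whether a given local model is capable enough for the orchestrator's prompts is a separate question.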