BuilderIO / micro-agent

An AI agent that writes (actually useful) code for you
https://www.builder.io/blog/micro-agent
MIT License

Question: Is it possible to add Ollama support? #32

Closed velteyn closed 2 months ago

velteyn commented 2 months ago

Hi, great project here. I would like to ask if it is possible to have Ollama support for local LLMs like Mistral Instruct 8b that can run on a laptop. Thanks.

tpaulshippy commented 2 months ago

See https://github.com/BuilderIO/micro-agent/issues/12

It should already have this to some extent, although we may need to add support for more models.

gvzq commented 2 months ago

@velteyn here is the config that worked for me.

Run micro-agent config and set all of the options as follows:

endpoint=http://localhost:11434/v1/
model=mixtral:8x7b
key=ollama
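The config above works because Ollama exposes an OpenAI-compatible API under /v1, so micro-agent's OpenAI client can point at it by swapping the endpoint; the key can be any non-empty string since Ollama does not check it. As a rough sketch of what such a request looks like on the wire (the helper name and prompt are illustrative, not from micro-agent's source):

```python
import json

# Values taken from the config in this thread. The key is a placeholder;
# Ollama accepts any string in the Authorization header.
ENDPOINT = "http://localhost:11434/v1/"
MODEL = "mixtral:8x7b"
API_KEY = "ollama"


def build_chat_request(prompt: str) -> tuple[str, dict, str]:
    """Assemble the URL, headers, and JSON body for an OpenAI-compatible
    chat-completions call against a local Ollama server."""
    url = ENDPOINT.rstrip("/") + "/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",  # accepted but ignored by Ollama
    }
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body


url, headers, body = build_chat_request("Write a fizzbuzz function")
print(url)  # http://localhost:11434/v1/chat/completions
```

Sending that body with any HTTP client to a running Ollama instance (ollama serve, with the model pulled) returns a standard OpenAI-style chat completion, which is why no code changes are needed on micro-agent's side.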