BuilderIO / micro-agent

An AI agent that writes (actually useful) code for you
https://www.builder.io/blog/micro-agent
MIT License
2.52k stars 200 forks

Ollama support #12

Closed · steve8708 closed this issue 2 months ago

steve8708 commented 2 months ago

integrate with ollama-js for fully local development

note: in the code there is a USE_ASSISTANT flag; just treat that as always off (false) for ollama since it has no assistants API (also, fwiw, i don't think the assistants direction is necessary or essential anyway)

in short, you would make sure any call to openai for chat completions goes to ollama-js instead
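A rough sketch of what that routing could look like with ollama-js (the function and model name here are illustrative, not the actual micro-agent code):

```ts
import ollama from 'ollama';

// Illustrative sketch: send a chat completion to a local Ollama model
// instead of OpenAI. The messages shape mirrors OpenAI's chat format.
async function localChat(prompt: string): Promise<string> {
  const response = await ollama.chat({
    model: 'llama3', // example model; any locally pulled model works
    messages: [{ role: 'user', content: prompt }],
  });
  return response.message.content;
}
```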

steve8708 commented 2 months ago

added it - it's pretty dumb/unreliable. highly recommend gpt-4o instead

tpaulshippy commented 2 months ago

Curious - what were you seeing that was dumb/unreliable?

tpaulshippy commented 2 months ago

Did you try a variety of models? I'm seeing a lot of specialized models for code, like codellama, starcoder2, or codegemma.

This line indicates maybe not.

steve8708 commented 2 months ago

I only tried llama3 and phi, models i've had good success with in the past or heard the most good things about, but would definitely be game for help testing other alternatives

the main "dumb" thing was that no matter what, it would always output stuff like this:

function parse(str: string) {
  const parsed = ts.parse(str);
  // rest of code here
}

it would never output the full code and would only output parts of it with "rest of code here" type comments throughout

gvzq commented 2 months ago

@steve8708 if your hardware is good enough you can get some neat results. For example, you can use mixtral:8x22b with Ollama to override the OpenAI endpoints locally:

endpoint=http://localhost:11434/v1/
model=mixtral:8x22b
key=ollama

Alternatively, Groq's mixtral-8x7b-32768 has amazing speeds and a huge context window.

endpoint=https://api.groq.com/openai/v1
model=mixtral-8x7b-32768
key=gsk_KEY
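Both configs point micro-agent at an OpenAI-compatible endpoint. A minimal sketch of what those three settings map onto, assuming the standard openai npm package (not micro-agent's actual client code):

```ts
import OpenAI from 'openai';

// Illustrative only: an OpenAI-compatible client pointed at a custom endpoint.
const client = new OpenAI({
  baseURL: 'http://localhost:11434/v1/', // or https://api.groq.com/openai/v1
  apiKey: 'ollama',                      // or a gsk_ Groq key
});

async function complete(prompt: string): Promise<string | null> {
  const completion = await client.chat.completions.create({
    model: 'mixtral:8x22b', // or mixtral-8x7b-32768 on Groq
    messages: [{ role: 'user', content: prompt }],
  });
  return completion.choices[0].message.content;
}
```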

My code generated perfectly but failed at GET /openai/v1/assistants. 👎

steve8708 commented 2 months ago

ah! just pushed a fix for that. try version 0.1.1 and use micro-agent config to change USE_ASSISTANT to false

steve8708 commented 2 months ago

perhaps in a future version we should just automatically turn assistants off if a custom endpoint is used 🤔

steve8708 commented 2 months ago

yeah i'll just turn assistant off anytime an endpoint is used ✅
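Something like this hypothetical check (field names are illustrative, not micro-agent's actual config shape):

```ts
// Hypothetical sketch of the behavior described above: any custom endpoint
// (Ollama, Groq, etc.) disables the assistants API path.
function shouldUseAssistant(config: { USE_ASSISTANT?: boolean; endpoint?: string }): boolean {
  if (config.endpoint) return false;
  return config.USE_ASSISTANT ?? false;
}
```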

gvzq commented 2 months ago

@steve8708 you're moving at lightning speed. It's working now; thank you for this project.

AlonJoshua commented 2 months ago

> yeah i'll just turn assistant off anytime an endpoint is used ✅

Hey @steve8708 have you already updated this behavior in the package?

Right now, config set USE_ASSISTANT=false works only when running the micro-agent project locally:

[Screenshot 2024-06-23 at 12:44:09]

But it doesn't recognize that config when using the micro-agent package:

[Screenshot 2024-06-23 at 12:47:04]

steve8708 commented 2 months ago

ah, i may not have released recently enough! I'll push a release now

steve8708 commented 2 months ago

release done, try 0.1.3 (e.g. by running micro-agent update). still seeing the issue with that?

AlonJoshua commented 2 months ago

The issue is fixed, thanks!