statelyai / agent

Create state-machine-powered LLM agents using XState
https://stately.ai/docs/agents

Feature request: ollama-js adapter to work locally #15

Closed otrebu closed 3 months ago

otrebu commented 7 months ago

Create an ollama-js adapter to have the option to use Ollama instead of OpenAI.

If you are not working on this I might give it a go.

davidkpiano commented 7 months ago

@otrebu I would gladly welcome you to give this a try - it should be as simple as copying from the OpenAI adapter.

otrebu commented 7 months ago

I started on it, but I have a question: do you have any suggestions on how to deal with tools? Ollama doesn't currently seem to support the same functionality/API for that.

davidkpiano commented 7 months ago

> I started on it, but I have a question: do you have any suggestions on how to deal with tools? Ollama doesn't currently seem to support the same functionality/API for that.

Let's just stub this for now.

otrebu commented 6 months ago

I have done my best so far, but I'm a bit stuck on some TypeScript types.

What's the best way to get a hand with this? Should I still open a pull request?

This is the commit on the fork: https://github.com/otrebu/agent/commit/334b077e1e495cebcceaff453f238f4010184cdc

I haven't tested the code yet.

davidkpiano commented 6 months ago

Yes please @otrebu, open a pull request and I will gladly work on this with you.

otrebu commented 6 months ago

Amazing, thanks @davidkpiano.

Hopefully I created it correctly: https://github.com/statelyai/agent/pull/24 (first one ever for me 😄)

otrebu commented 4 months ago

@davidkpiano did you have a chance to have a look? Otherwise I will try again after I rebase.

davidkpiano commented 4 months ago

> @davidkpiano did you have a chance to have a look? Otherwise I will try again after I rebase.

I have recently added the Vercel AI SDK, which abstracts over multiple model providers. I will see if/how Ollama can be used with that, but it's a much more scalable solution than trying to build our own adapters 😅

otrebu commented 4 months ago

Oh ok, thank you! Sounds good.

airtonix commented 3 months ago

yep we don't need this.

this exists:

```ts
import { createAgent } from '@statelyai/agent';
import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({ baseURL: process.env.MY_AMAZEBALLS_OLLAMA_BASEURL });

// found with `ollama list`
const myCustomOllamaModelName = process.env.MY_AMAZEBALLS_OLLAMA_MODELNAME || 'mixtral';

const model = ollama(myCustomOllamaModelName);

const agent = createAgent({
  name: 'my-foo-bar-agent',
  model,
  events: {
    //... my amazing events
  },
});
```

otrebu commented 3 months ago

Thank you, I will close.