kolbytn / mindcraft

Add support for APIs similar to OpenAI (Grok, Koboldcpp) #310

Open pansutodeus opened 2 weeks ago

pansutodeus commented 2 weeks ago

The openai package supports a different base URL just by setting an environment variable. With this, we can use any OpenAI-compatible API; for example, this is for the Grok API:

export OPENAI_BASE_URL="https://api.x.ai/v1/" ; node main.js
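
For reference, the official openai Node package reads both OPENAI_BASE_URL and OPENAI_API_KEY from the environment when the client is constructed without options, which is why the single export above is enough to redirect every request. A minimal standalone sketch of that behavior (not mindcraft code, just the package defaults):

    // Minimal sketch of the openai package's env-var defaults: with
    // OPENAI_BASE_URL and OPENAI_API_KEY exported, no constructor options are
    // needed and every request goes to the alternate endpoint (e.g. api.x.ai).
    import OpenAI from 'openai';

    const client = new OpenAI(); // baseURL and apiKey picked up from the environment

    const res = await client.chat.completions.create({
        model: 'grok-beta',
        messages: [{ role: 'user', content: 'Say hello.' }],
    });
    console.log(res.choices[0].message.content);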

The only modifications necessary to make this work are:

Adding grok to the following lines: https://github.com/kolbytn/mindcraft/blob/a6edd8fc44813e01c4decfda3d6b6ee018671620/src/agent/prompter.js#L37

        else if (chat.model.includes('gpt') || chat.model.includes('o1') || chat.model.includes('grok'))

https://github.com/kolbytn/mindcraft/blob/a6edd8fc44813e01c4decfda3d6b6ee018671620/src/models/gpt.js#L29

        if (this.model_name.includes('o1') || this.model_name.includes('grok')) {
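
To make the intent of those two edits concrete, the check being extended is just a substring test on the model name; here it is pulled out as a standalone predicate (illustration only, not the repo's actual structure):

    // Illustration only: the substring routing test with 'grok' added, pulled
    // out of context. Model names matching any of these go through the
    // OpenAI-compatible wrapper in src/models/gpt.js.
    function usesGptWrapper(modelName) {
        return modelName.includes('gpt')
            || modelName.includes('o1')
            || modelName.includes('grok');
    }

    console.log(usesGptWrapper('grok-beta'));     // true
    console.log(usesGptWrapper('gpt-4o-mini'));   // true
    console.log(usesGptWrapper('llama-3.1-70b')); // false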

Then, just change the model name in andy.json:

    "model": "grok-beta",

and add the Grok API key to keys.json:

    "OPENAI_API_KEY": "xai-HS...",

With this, Grok is 100% functional and works flawlessly. I assume this isn't the proper way to implement it, but hopefully it is of help to someone.

For Koboldcpp, I found an issue with the following line: it always triggers, making Koboldcpp unusable. Just changing 'length' to anything else makes it work, but I assume that breaks it for other models. I'm not sure why Koboldcpp always triggers it; maybe it's because I was using a 22B model, not sure. https://github.com/kolbytn/mindcraft/blob/a6edd8fc44813e01c4decfda3d6b6ee018671620/src/models/gpt.js#L39
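
For what it's worth, that line appears to treat a finish_reason of 'length' as a context overflow. My guess (only a guess) is that Koboldcpp reports finish_reason: 'length' whenever generation stops at its configured output limit ("Amount to Generate"), not only when the prompt overflows the context window, which would make the check fire on nearly every reply. A quick standalone probe to see what the backend actually reports (URL, port and model name are just examples for a local Koboldcpp instance):

    // Standalone probe against a local Koboldcpp instance: print the
    // finish_reason the backend reports for a short reply. 'stop' means the
    // model finished on its own; 'length' means it hit a token limit, which is
    // what trips the check in gpt.js. URL, port and model name depend on your
    // local setup.
    import OpenAI from 'openai';

    const client = new OpenAI({
        baseURL: 'http://127.0.0.1:5001/v1/',
        apiKey: 'koboldcpp', // assumption: Koboldcpp ignores the key unless a password is configured
    });

    const res = await client.chat.completions.create({
        model: 'koboldcpp/Mistral-Small-22B-ArliAI-RPMax-v1.1-q8_0',
        messages: [{ role: 'user', content: 'Reply with a single word.' }],
    });
    console.log(res.choices[0].finish_reason, '-', res.choices[0].message.content);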

pansutodeus commented 2 weeks ago

I'm so dumb, this is already implemented.

But the change in gpt.js is still necessary; it doesn't work without it.

Just using this and the API key makes it work:

"model": {
  "api": "openai",
  "url": "https://api.x.ai/v1/",
  "model": "grok-beta"
},
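
For anyone replicating this outside of mindcraft: the url field of the profile corresponds to the client's baseURL option (the profile-to-client mapping is my assumption about mindcraft's internals; baseURL itself is a documented option of the openai package):

    // Rough equivalent of the profile above, done directly with the openai
    // package. Assumption: mindcraft passes the profile's "url" through as
    // baseURL; baseURL itself is a standard option of the client.
    import OpenAI from 'openai';

    const client = new OpenAI({
        baseURL: 'https://api.x.ai/v1/',
        apiKey: process.env.OPENAI_API_KEY, // the xai-... key from keys.json
    });

    const res = await client.chat.completions.create({
        model: 'grok-beta',
        messages: [{ role: 'user', content: 'ping' }],
    });
    console.log(res.choices[0].message.content);
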
pansutodeus commented 2 weeks ago

After further testing, it seems like the reason Koboldcpp constantly gets "Context length exceeded, trying again with shorter context." is my Koboldcpp settings, or maybe the model's limits? Not sure.

"model": {
     "api": "openai",
     "url": "http://127.0.0.1:5001/v1/",
    "model": "koboldcpp/Mistral-Small-22B-ArliAI-RPMax-v1.1-q8_0"
},

The only current issue when running with Kobold is https://github.com/kolbytn/mindcraft/issues/306