itsPreto / baby-code

100% Private & Simple. OSS 🐍 Code Interpreter for LLMs 🦙
https://github.com/itsPreto/baby-code

Hook-up to existing server #12

molander opened this issue 1 year ago (status: Open)

molander commented 1 year ago

Hey! Thanks for doing this, it is fantastic and I have been having a blast running it on a FreeBSD box.

Silly question and probably an easy one that I am simply fumbling on...how do I hook baby-code.py up to an existing llama.cpp server? Rather than having it start its own.

Thanks for throwing this out there and I think the UI is pretty good, btw :D

Cheers, -matt

itsPreto commented 1 year ago

Hey, thanks for reaching out! Glad you're making use of this :)

All baby_code.py really does is wrap llama.cpp's server.cpp and bridge it with my own endpoints to execute and interpret the code.

If you want to start the server as a standalone application, just remove that logic (it's in the main function at the end of the file) and run ./server yourself before running baby_code.py!
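A minimal sketch of the standalone workflow described above, assuming the llama.cpp server binary; the model filename, port, and exact static directory are placeholders, not taken from the repo:

```shell
# Start llama.cpp's server yourself instead of letting baby_code.py spawn it.
# --path points the server's web root at the baby-code directory so its UI
# is served instead of the default llama.cpp page (exact directory may vary).
# Model path and port below are placeholders for your own setup.
./server -m ./models/your-model.gguf --path ./baby-code --port 8080

# Then, in another terminal, run baby-code with its server-launch logic removed:
python baby_code.py
```

With this split, baby_code.py only needs to reach the already-running server over HTTP, which is what makes hooking up to an existing server possible.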

Let me know if this clarifies your question!

molander commented 1 year ago

Ah, gotcha! I was almost there, but I didn't pass the --path arg to ./server when running it outside the /baby-code dir, so I got confused when the standard llama.cpp page kept coming up. Which makes total sense :D

I have some videos on my other box I'll have to post for you, they are interesting and funny. This seems to be a really neat way to teach an LLM about coding and the env it is on. At one point, I had a 34B Wizard Python Coder model plugged into my local Home Assistant instance controlling everything easily. Super neat!

Yup, got it going! Thanks :D https://imgur.com/a/ocvyavT

itsPreto commented 1 year ago

I would recommend looking into a more full-fledged version of what I was going for: https://github.com/KillianLucas/open-interpreter

This would be way easier to integrate with Home Assistant.

molander commented 1 year ago

Haha, maybe. I have it, but it is just not as fun for some reason. There is no confetti! ;)

itsPreto commented 1 year ago

haha the confetti definitely keeps me entertained too lol glad you're getting a kick out of it!

itsPreto commented 1 year ago

@molander anything in particular you would like to see implemented? I kind of put it off to the side because I don't have any fun ideas to go off of, but if you've got anything interesting, pls do share!

itsPreto commented 1 year ago

I'll be syncing with the llama.cpp server UI implementation, as they've integrated multimodality via LLaVA 7B/13B.

molander commented 10 months ago

Sorry, missed that one! That is fantastic!! Haha, this is still my fav and I use him to help with a lot of sysadmin tasks from my FreeBSD box.

For feature requests... well, you could go big and do a MemGPT integration and see if BabyCoder could become a MemGPT agent. That sounds like a lot of work lol. An easy one would be some transparency/UI-level control over the system prompt. Or maybe being able to save some responses/prompts, since that is primarily what I end up doing: reminding him of things like "psst, we are working in that directory, remember?" Haha, so MemGPT might be worthwhile after all; I know they'd get a hoot out of this over there.

Have you looked at Gorilla OpenFunctions? Now that could be super interesting... plug that guy in and suddenly BabyCode can do stuff like Zapier! Haha!