This is caused by the lack of a /tmp folder on Windows. I'll try to push a fix in the next few days.
In the meantime you could launch it inside WSL.
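For reference, the usual portable fix is to resolve the temp directory at runtime instead of hardcoding /tmp. A minimal sketch using Python's standard tempfile module (the "graphllm" subfolder is just an illustration, not necessarily the path the code will use):

```python
import os
import tempfile

# tempfile.gettempdir() resolves to /tmp on Linux/macOS and to the
# user's %TEMP% directory on Windows, so nothing is hardcoded.
work_dir = os.path.join(tempfile.gettempdir(), "graphllm")
os.makedirs(work_dir, exist_ok=True)
print(f"Using scratch directory: {work_dir}")
```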
I pushed an update that should fix the issue. Let me know if it solves the problem of launching the software on Windows.
Yes, everything is working, thank you!
An amazing project! Previously I used thinking only as part of the prompt, having the LLM enclose it in tags, but this is a whole new level!
It would be nice to build a community of node developers around this frontend, like the one around ComfyUI. I'll try to contribute as much as possible once I figure things out myself.
I have a couple more questions: will it be possible to connect GraphLLM to KoboldCPP and Oobabooga just by changing the address? Especially if it's the address of an OpenAI-format API tunnel from Google Colab?
Yeah, the long-term plan is to accept community-made nodes. Right now it's too early: I'm still making breaking changes to the code, and that would force contributors to rewrite their nodes each time.
I'm slowly adding support for other APIs. Right now only llama.cpp and groq are supported. The next one will be the OpenAI API and compatible servers. I opened an issue here to track progress: https://github.com/matteoserva/GraphLLM/issues/3
Thank you for your support. :)
EDIT: OpenAI API added
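For anyone who wants to try it: OpenAI-compatible servers (llama.cpp's server, KoboldCPP, Oobabooga's OpenAI extension, or a Colab tunnel) all expose the same /v1/chat/completions route, so in principle only the base URL changes. A minimal sketch of a raw request, with a placeholder address and model name rather than GraphLLM's actual configuration:

```python
import requests

# Any OpenAI-compatible server exposes /v1/chat/completions;
# only the base URL differs between backends or tunnels.
BASE_URL = "http://127.0.0.1:5000"  # placeholder: your server or tunnel address

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; many local servers ignore it
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])
```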
Hi! I'm very interested in your project because LLMs really lack modular frontends. However, for some reason I can't run the editor; it gives an error:
I installed everything through Miniconda, and I usually don't have problems with this. By the way, chat.py works fine and gives answers from the model.