matteoserva / GraphLLM


FileNotFoundError #2

Closed Bubareh closed 2 days ago

Bubareh commented 1 week ago

Hi! I'm very interested in your project because LLMs really lack modular frontends. However, for some reason I can't run the editor; it gives this error:

Exception occurred during processing of request from ('127.0.0.1', 54024)
Traceback (most recent call last):
  File "C:\Users\User\miniconda3\Lib\socketserver.py", line 318, in _handle_request_noblock
    self.process_request(request, client_address)
  File "C:\Users\User\miniconda3\Lib\socketserver.py", line 349, in process_request
    self.finish_request(request, client_address)
  File "C:\Users\User\miniconda3\Lib\socketserver.py", line 362, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "G:\GPT\Soft\GraphLLM\modules\server\web_app.py", line 201, in __init__
    super().__init__(*args, **kwargs)
  File "C:\Users\User\miniconda3\Lib\socketserver.py", line 761, in __init__
    self.handle()
  File "C:\Users\User\miniconda3\Lib\http\server.py", line 436, in handle
    self.handle_one_request()
  File "C:\Users\User\miniconda3\Lib\http\server.py", line 424, in handle_one_request
    method()
  File "G:\GPT\Soft\GraphLLM\modules\server\web_app.py", line 293, in do_POST
    res = op(post_data)
          ^^^^^^^^^^^^^
  File "G:\GPT\Soft\GraphLLM\modules\server\web_app.py", line 162, in exec
    with open("/tmp/graph.json", "w") as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/graph.json'
----------------------------------------
127.0.0.1 - - [22/Oct/2024 16:41:44] "GET /editor/imgs/icon-stop.png HTTP/1.1" 200 -

I installed everything through Miniconda; I usually don't have problems with this. By the way, chat.py works fine and returns answers from the model.

matteoserva commented 1 week ago

This is caused by the lack of a /tmp folder on Windows. I'll try to push a fix in the next few days.

In the meantime, you could launch it inside WSL.
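For anyone hitting this before the fix lands, here is a minimal sketch of the portable pattern (not necessarily the exact patch that was pushed): use Python's tempfile module to resolve the platform's temp directory instead of hardcoding /tmp. The graph_data variable is a hypothetical stand-in for whatever web_app.py actually serializes.

```python
import os
import tempfile

# Hypothetical stand-in for the serialized graph that web_app.py writes out.
graph_data = '{"nodes": []}'

# Build the path in the platform's temp directory instead of hardcoding
# "/tmp": tempfile.gettempdir() returns e.g.
# C:\Users\<user>\AppData\Local\Temp on Windows and /tmp on Linux.
graph_path = os.path.join(tempfile.gettempdir(), "graph.json")

with open(graph_path, "w") as f:
    f.write(graph_data)
```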

matteoserva commented 6 days ago

I pushed an update that should fix the issue. Let me know if it solves the problem of launching the software on Windows.

Bubareh commented 5 days ago

Yes, everything is working, thank you!

An amazing project! Previously, I only used "thinking" as part of the prompt, having the LLM enclose it in tags, but this is a whole new level!

It would be nice to build a community of node developers around this frontend, like the one around ComfyUI. I'll try to contribute to that as much as possible once I've figured things out myself.

I have a couple more questions: will it be possible to connect GraphLLM to KoboldCPP and Oobabooga just by changing the address? Especially if it's the address of an OpenAI-format API tunnel from Google Colab?

matteoserva commented 5 days ago

Yeah, the long-term plan is to accept community-made nodes. Right now it's too early: I'm still making breaking changes to the code, and that would force contributors to rewrite their code each time.

I'm slowly adding support for other APIs. Right now only llama.cpp and groq are supported; the next one will be the OpenAI API and compatible servers. I opened an issue to track the progress: https://github.com/matteoserva/GraphLLM/issues/3
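For context on why "just change the address" generally works: OpenAI-compatible servers differ mostly in their base URL. A minimal sketch with the openai Python package, assuming default local ports (KoboldCPP's and Oobabooga's defaults vary by version and launch flags); GraphLLM's own client code may look different.

```python
from openai import OpenAI

# Point an OpenAI-style client at a local OpenAI-compatible server.
# Example base URLs (defaults vary by server and configuration):
#   KoboldCPP:                          http://localhost:5001/v1
#   Oobabooga (text-generation-webui):  http://localhost:5000/v1
client = OpenAI(
    base_url="http://localhost:5001/v1",  # swap in a Colab tunnel URL if needed
    api_key="sk-no-key-required",  # local servers usually ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # many local servers accept any model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```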

Thank you for your support. :)

EDIT: OpenAI API added