grencez opened this issue 1 year ago
Since I plan to use WebRTC for the remote server, it would make sense to pull in the https://github.com/paullouisageneau/libdatachannel/ library now and use its WebSocket functionality for the local server. It also pulls in a JSON library, which could be convenient.
Not sure what to do about serving the webpage, though. I could roll a simple HTTP server, but it would have to hand off to the WebSocket (on the same port), I guess? Unclear.
In the long term, it's probably best for the local server to behave like the remote server and use WebRTC. So in the short term, I once again prefer the original idea of a dumb HTTP-only interface that responds to JavaScript polling.
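A minimal sketch of what that dumb HTTP-only interface could look like, using only Python's standard library. The endpoint paths (/api/transcript, /api/input), the port, and the in-memory transcript buffer are placeholders made up for illustration, not anything the repo defines; the served page just fetches the transcript on an interval.

```python
# Hypothetical sketch: HTTP-only interface that a page polls with JavaScript.
# Endpoint names and the transcript buffer are made up for illustration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

TRANSCRIPT = []  # chat lines accumulated so far

PAGE = b"""<!doctype html>
<pre id="log"></pre>
<script>
setInterval(async () => {
  const lines = await (await fetch('/api/transcript')).json();
  document.getElementById('log').textContent = lines.join('\\n');
}, 1000);  // poll once per second
</script>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/api/transcript':
            self._reply(200, 'application/json', json.dumps(TRANSCRIPT).encode())
        else:
            self._reply(200, 'text/html', PAGE)

    def do_POST(self):
        # User input posted by the page; here it just goes into the transcript.
        length = int(self.headers.get('Content-Length', 0))
        TRANSCRIPT.append(self.rfile.read(length).decode())
        self._reply(204, 'text/plain', b'')

    def _reply(self, code, ctype, body):
        self.send_response(code)
        self.send_header('Content-Type', ctype)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == '__main__':
    HTTPServer(('127.0.0.1', 8080), Handler).serve_forever()
```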
Code layout can be:
Alternatively... it would be even easier to use a simple Python or Node.js process that spawns the chat executable and serves some HTML & JavaScript content. Might as well start there...
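For the spawning side, something along these lines could work. The invocation arguments and the lines written to stdin are placeholders; the real coprocess command syntax is whatever the chat CLI / assistant_coprocess example defines, so treat this purely as a shape sketch.

```python
# Hypothetical sketch: spawn the chat CLI as a coprocess and talk to it over pipes.
# Real command-line arguments and coprocess command syntax come from the chat
# executable; the strings below are placeholders.
import subprocess

def spawn_chat(chat_cmd):
    """Start the chat executable with pipes on stdin/stdout."""
    return subprocess.Popen(
        chat_cmd,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,   # line-oriented text instead of raw bytes
        bufsize=1,   # line-buffered
    )

def send_line(proc, line):
    """Write one coprocess command (or user message) to the chat process."""
    proc.stdin.write(line + "\n")
    proc.stdin.flush()

if __name__ == "__main__":
    # Placeholder invocation; the actual arguments depend on the chat CLI.
    proc = spawn_chat(["./chat"])
    send_line(proc, "hello from the web ui")
```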
The coprocess functionality was added in https://github.com/rendezqueue/rendezllama/issues/22.
I only know basic web stuff, so anyone is free to pick this up. See updated description.
A local HTTP server would be a nice way to dockerize and prototype an interface. My rough design would be a server (code under src/) that spawns the ./chat CLI and communicates with it via coprocess commands (see the assistant_coprocess example).
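To connect that design to the polling interface sketched earlier, the chat process's stdout would need to be drained continuously so the HTTP handler always has an up-to-date transcript to return. A background reader thread is one simple way to do that; again just a sketch, and the names are made up.

```python
# Hypothetical glue: drain the spawned chat process's stdout on a thread so a
# GET /api/transcript handler can return everything produced so far.
import threading

class TranscriptBuffer:
    def __init__(self):
        self._lines = []
        self._lock = threading.Lock()

    def append(self, line):
        with self._lock:
            self._lines.append(line)

    def snapshot(self):
        with self._lock:
            return list(self._lines)

def drain_stdout(proc, buffer):
    """Read chat output line by line until the process exits."""
    for line in proc.stdout:  # blocks until the process closes stdout
        buffer.append(line.rstrip("\n"))

# Usage sketch, assuming spawn_chat() from the earlier snippet:
#   buffer = TranscriptBuffer()
#   proc = spawn_chat(["./chat"])
#   threading.Thread(target=drain_stdout, args=(proc, buffer), daemon=True).start()
#   ...and the /api/transcript handler returns json.dumps(buffer.snapshot()).
```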