Closed greenido closed 7 months ago
Absolutely, thanks for making an issue. Is the REST API you have in mind llama.cpp/examples/server, or something else? I made #28 because a few people wanted that.
Ahh... yes please! Something like the server example would be awesome.
Cool! Also, maybe I misunderstood your original question. Right now, while you're chatting with FreeChat, you can hit the llama.cpp/examples/server instance running on localhost:8690 (at some point I want to bind that port dynamically, because FreeChat will break if the port is already taken). The server includes a tiny HTML front-end as well as a streaming completion API: https://github.com/ggerganov/llama.cpp/tree/master/examples/server
Architecturally, FreeChat is basically a 1-click runner and alternate front-end for server.cpp.
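To illustrate the setup described above: a minimal sketch of querying the llama.cpp server that FreeChat runs. This assumes the server's `/completion` endpoint and its JSON fields (`prompt`, `n_predict`, `stream`) as documented in the llama.cpp server README, and uses port 8690 from this thread; FreeChat (or the server) must actually be running for the request to succeed.

```python
import json
import urllib.request

# Port FreeChat binds the llama.cpp server to (per this thread);
# may differ if the binding ever becomes dynamic.
SERVER = "http://localhost:8690"

def build_payload(prompt, n_predict=128, stream=False):
    """Build the JSON body for the server's /completion endpoint."""
    return {"prompt": prompt, "n_predict": n_predict, "stream": stream}

def complete(prompt, n_predict=128):
    """POST a non-streaming completion request and return the generated text."""
    body = json.dumps(build_payload(prompt, n_predict)).encode("utf-8")
    req = urllib.request.Request(
        SERVER + "/completion",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The server replies with a JSON object whose "content" field
        # holds the completion text.
        return json.loads(resp.read())["content"]

if __name__ == "__main__":
    print(complete("Name one planet:", n_predict=16))
```

Setting `"stream": true` instead returns server-sent events, which is what FreeChat's own front-end consumes.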
Sounds good - I'll use this one for now. Thank you!
It would be great to be able to chat with / query this via a REST API so we can bind it into other projects.