cmp-nct / ggllm.cpp

Falcon LLM ggml framework with CPU and GPU support
242 stars 21 forks

Is there any GUI or Web UI for ggllm.cpp? #52

Open JohnClaw opened 1 year ago

chrisbward commented 1 year ago

or an HTTP API?

mirek190 commented 1 year ago

I'd like that awesome, simple web server from llama.cpp as well.

[Screenshot: the llama.cpp built-in web server UI, 2023-07-09]

cmp-nct commented 1 year ago

We'll definitely need something like that. Though I have a ton of features and ideas to try, and only 15 hours a day.

It will have to wait for a bit, or someone else ports that.

matthoffner commented 1 year ago

Thanks @cmp-nct

I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

I've been building Spaces on HF using ggml FastAPI servers; there's a boilerplate repo I'm working from here: https://github.com/matthoffner/ggml-fastapi

hiwudery commented 1 year ago

A modified falcon_server.cpp is attached below. It can help you build a web UI on top of an HTTP API. server_code.zip
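For anyone wiring a frontend to such a server, here is a minimal sketch of posting a prompt to a completion endpoint over HTTP, using only the Python standard library. The `/completion` route and the `prompt`/`n_predict`/`temperature` field names are assumptions modeled on the llama.cpp server; check the routes actually exposed by the modified falcon_server.cpp.

```python
import json
from urllib import request

# Hypothetical endpoint and payload shape - verify against the routes
# and field names in the modified falcon_server.cpp.
SERVER = "http://localhost:8080/completion"

payload = {
    "prompt": "Write a haiku about falcons.",
    "n_predict": 64,      # max tokens to generate (assumed field name)
    "temperature": 0.7,
}

body = json.dumps(payload).encode("utf-8")
req = request.Request(
    SERVER,
    data=body,
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is running:
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

Any browser-based UI can then talk to the same endpoint with `fetch()` from a static HTML page.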

linuxmagic-mp commented 1 year ago

> I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

FYI, it just errors out at the moment.

matthoffner commented 1 year ago

> I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

> FYI, it just errors out at the moment.

Feel free to open an issue; it might scale down when it's not being used.

JohnClaw commented 1 year ago

I meant a Windows GUI app, or a local offline web UI that can be opened in Microsoft Edge, etc.

cmp-nct commented 1 year ago

Looking at my current roadmap: performance optimizations are the next step, and once that is done I'll look into the best way to quickly add accessibility through a web frontend.

sirajperson commented 1 year ago

I started working on a fork of llama-cpp-python for ggllm.cpp, but it's not working yet. Anyone who wants to help is more than welcome: falcon-cpp-python

linuxmagic-mp commented 1 year ago

> I started working on a fork of llama-cpp-python for ggllm.cpp, but it's not working yet. Anyone who wants to help is more than welcome: falcon-cpp-python

I think that rather than more forks and confusion, you might just pull from llama-cpp-python and help make that work for both llama and falcon models. They are already working on that. Just a suggestion. I will be testing some of the prerequisites later this week.

cmp-nct commented 1 year ago

A word of caution: we'll see huge changes with the next release, more than all previous updates combined. If time permits, it will already include a minimal web-based GUI that can then be further developed and extended. I expect it to be finished within a week, though my time planning is usually off.

djmaze commented 1 year ago

It seems LocalAI already has support for ggllm. (I haven't tried this out yet.) As it offers an OpenAI-compatible API, you can use it in conjunction with any web-based client such as chatbot-ui.
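Since LocalAI speaks the OpenAI wire format, any OpenAI-style client can point at it by overriding the base URL. A minimal stdlib-only sketch of building a chat-completions request; the port and the "falcon-7b" model name are assumptions and should match whatever your LocalAI instance has configured.

```python
import json
from urllib import request

# LocalAI serves the OpenAI-compatible /v1/chat/completions route;
# only the base URL differs from the hosted OpenAI API.
BASE_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "falcon-7b",  # assumed model name - match your LocalAI config
    "messages": [
        {"role": "user", "content": "Hello from a ggllm-backed model!"},
    ],
}

req = request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once LocalAI is running:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

This is also why clients like chatbot-ui work unmodified: they only need the base URL swapped.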