KlausSchaefers / quant-ux

Quant-UX - Prototype, Test and Learn
GNU General Public License v3.0

AI helper #263

Open Alex-work-1 opened 10 months ago

Alex-work-1 commented 10 months ago

New AI support to generate UI

There are open-source .gguf models available that are well optimized to run locally on CPU only. For example, I can run Mistral-7B on my MacBook Air 2017 (processor: Intel Core i5 1.8 GHz; RAM: 8 GB) at about 2 words per second. It took around 4 GB of disk space (the .safetensors version would take about 15 GB and would possibly be slower, as .safetensors files are more geared toward GPU usage).
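The size gap above roughly matches a back-of-envelope calculation (assumptions: ~7.24 billion parameters for Mistral-7B, fp16 weights in the .safetensors file, and ~4.5 bits per parameter for a typical 4-bit gguf quantization including overhead):

```python
# Rough file-size estimate for a 7B model in fp16 vs. a 4-bit gguf quantization.
# Parameter count and bits-per-parameter are approximations, not exact figures.
PARAMS = 7.24e9  # approximate parameter count of Mistral-7B

fp16_gb = PARAMS * 2 / 1e9      # fp16: 2 bytes per parameter -> ~14.5 GB
q4_gb = PARAMS * 4.5 / 8 / 1e9  # ~4.5 bits per parameter     -> ~4.1 GB

print(round(fp16_gb, 1))
print(round(q4_gb, 1))
```

This lines up with the ~15 GB vs. ~4 GB observed above.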

Integration of open-source AI models through API

The GPT4All app from Nomic AI can run a local API server to share AI access with other open-source applications like Quant-UX.
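Since that local server speaks the OpenAI wire format, a request to it can be built exactly like a request to ChatGPT. A minimal stdlib-only sketch (the port 4891 base URL is GPT4All's assumed default, and the model name is a placeholder for whatever model is loaded locally):

```python
import json
import urllib.request

# Assumed default address of GPT4All's local API server; adjust if configured differently.
GPT4ALL_BASE_URL = "http://localhost:4891/v1"

def build_chat_request(prompt, model="mistral-7b-instruct", base_url=GPT4ALL_BASE_URL):
    """Build an OpenAI-style chat completion request for a local server.

    The payload shape (model, messages, temperature) is the same one the
    hosted ChatGPT API uses, so only the base URL differs.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        url=base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("Generate a login screen layout")
    print(req.full_url)
    # Actually sending it requires a running GPT4All server:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
```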

Feature

Would it be possible to add access through the GPT4All API (open-source), or other custom APIs, instead of ChatGPT (proprietary) to generate UIs?

KlausSchaefers commented 10 months ago

Hi,

that is a cool suggestion. I haven't tried to run a local LLM yet. My guess is that changing the API is not so easy, as I did some prompt engineering to get good output. Most likely these prompts would also need to be updated for every model.
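The per-model prompt concern could be isolated behind a small template layer, so the engineered prompt content stays shared and only the wrapper changes. A minimal sketch (model names and formats are illustrative, not Quant-UX code; the Mistral instruct wrapper is its documented `[INST]` format):

```python
# Hypothetical per-model prompt templates: each local model expects its own
# instruction wrapper, while the engineered instruction text stays the same.
PROMPT_TEMPLATES = {
    # Mistral instruct models wrap the user turn in [INST] ... [/INST]
    "mistral-7b-instruct": "[INST] {instruction} [/INST]",
    # Generic Alpaca-style fallback for models without a special wrapper
    "default": "### Instruction:\n{instruction}\n\n### Response:\n",
}

def format_prompt(instruction, model="default"):
    """Wrap a shared instruction in the template for the given model."""
    template = PROMPT_TEMPLATES.get(model, PROMPT_TEMPLATES["default"])
    return template.format(instruction=instruction)
```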

I will play with it a little, but my bandwidth is limited. If you want to implement the feature, the scope is limited to just a handful of files. I can give you a short intro if you like.

Cheers,

Klaus

Alex-work-1 commented 10 months ago

Hi Klaus,

Sure, I can try to test and modify these prompts if that will help add this feature in future updates. Please let me know what the prompts for ChatGPT were and how I can check whether an AI response is correct.

Possibly the API will not need to change, as GPT4All states that its API is OpenAI-compliant.
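If that holds, the backend could be swapped by configuration alone: the request shape stays the same and only the base URL changes. A sketch of that idea (the environment variable name and default URLs are illustrative assumptions, not existing Quant-UX config):

```python
import os

def resolve_endpoint():
    """Pick the chat completion endpoint from the environment.

    OPENAI_BASE_URL would point either at the hosted OpenAI API or at a
    local OpenAI-compatible server such as GPT4All's; the variable name
    is hypothetical.
    """
    base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
    return base.rstrip("/") + "/chat/completions"
```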

Cheers, Alex

@KlausSchaefers