nomic-ai / gpt4all

GPT4All: Chat with Local LLMs on Any Device
https://gpt4all.io
MIT License

Documentation for building the project from source code #2441

Open t-dasun opened 3 weeks ago

t-dasun commented 3 weeks ago

Can someone guide me through setting up GPT4All from source code?

I have used this link https://github.com/nomic-ai/gpt4all/blob/main/gpt4all-chat/build_and_run.md

I installed Qt as described there, then built gpt4all-backend and gpt4all-chat:
cmake ../../gpt4all-backend/ -DLLMODEL_CUDA=OFF -DLLMODEL_KOMPUTE=OFF
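For context, the full sequence I followed looked roughly like this (a sketch, not a verified recipe: paths follow the layout implied by build_and_run.md, and Qt 6 plus CMake are assumed to be installed already):

```shell
# Rough build sequence (sketch; adjust paths/Qt location to your setup).
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
cd gpt4all/gpt4all-chat
mkdir -p build && cd build
# Configure with both GPU backends disabled, as above.
cmake ../../gpt4all-backend/ -DLLMODEL_CUDA=OFF -DLLMODEL_KOMPUTE=OFF
cmake --build . --parallel
# The chat executable ends up under bin/.
./bin/chat
```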

The chat executable is generated in gpt4all-chat/build/bin, but when I run it I get the following:

constructGlobalLlama: could not find Llama implementation for backend: kompute
constructGlobalLlama: could not find Llama implementation for backend: cuda
[Warning] (Thu Jun 13 23:20:20 2024): QQmlApplicationEngine failed to load component
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/main.qml:66:5: Type ChatView unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/ChatView.qml:483:5: Type SettingsDialog unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/SettingsDialog.qml:104:9: Type MySettingsStack unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/MySettingsStack.qml:67:30: IconLabel is not a type
[Warning] (Thu Jun 13 23:20:20 2024): QIODevice::read (QNetworkReplyHttpImpl): device not open
[Warning] (Thu Jun 13 23:20:20 2024): ERROR: Couldn't parse:  "" "illegal value"

Can someone point me to the proper documentation for using the bindings and for setting the model we want to run? Alternatively, if anyone can explain the process or help me build this locally, I can start working on the documentation myself.

I need to know the build and run steps for gpt4all-backend, gpt4all-bindings, gpt4all-chat, and gpt4all-training.
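For the bindings part, the Python package lives under gpt4all-bindings/python; a minimal sketch of installing it from the source tree and running a prompt might look like this (the model filename is only an example and would be downloaded on first use; I have not confirmed this works against a locally built backend):

```shell
# Sketch: install the Python bindings in editable mode and exercise them.
cd gpt4all/gpt4all-bindings/python
pip install -e .
python -c "
from gpt4all import GPT4All
# Example model name (an assumption); fetched automatically on first use.
model = GPT4All('orca-mini-3b-gguf2-q4_0.gguf')
print(model.generate('Hello', max_tokens=32))
"
```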

t-dasun commented 3 weeks ago

@AndriyMulyar