nathanlesage / local-chat

LocalChat is a ChatGPT-like chat that runs on your computer
https://nathanlesage.github.io/local-chat/
GNU General Public License v3.0

setChatHistory() in node-llama-cpp-v3 #5

Closed scenaristeur closed 4 months ago

scenaristeur commented 4 months ago

Hi @nathanlesage, thanks for your help on https://github.com/withcatai/node-llama-cpp/pull/105#issuecomment-1981826680

You use session.setChatHistory() (https://github.com/nathanlesage/local-chat/blob/45c7d6b80c3e1bb566778ac195d8e87ec46558db/src/main/LlamaProvider.ts#L210), but I cannot find where this function is defined. Could you help me, please?

nathanlesage commented 4 months ago

I basically just searched the node-llama-cpp source tree. It's a bit difficult, but manageable. In short, node-llama-cpp treats models as follows:

  1. You instantiate a session using a context and a loaded model.
  2. During instantiation, you can already provide a system prompt, unless you want to use the default one.
  3. Afterwards, you need to provide any messages that precede the upcoming prompt, in the form I use in my source code. There is an issue with function calling that makes this iffy, but if you don't use function calling it's relatively straightforward; I have commented the relevant part of the code.
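To make step 3 concrete, here is a minimal sketch of building the history array that session.setChatHistory() expects in node-llama-cpp v3. The ChatHistoryItem shape below is a local type written to mirror the library's exported type, and the buildChatHistory helper and SimpleMessage shape are hypothetical names used only for illustration, not part of either codebase:

```typescript
// Local stand-in mirroring node-llama-cpp v3's ChatHistoryItem union:
// a system message, a user message, or a model response.
type ChatHistoryItem =
  | { type: "system"; text: string }
  | { type: "user"; text: string }
  | { type: "model"; response: string[] };

// Hypothetical plain message shape, as an app might store it.
interface SimpleMessage {
  role: "user" | "assistant";
  content: string;
}

// Convert stored messages into the history array, prepending the
// system prompt, as described in the steps above.
function buildChatHistory(
  systemPrompt: string,
  messages: SimpleMessage[]
): ChatHistoryItem[] {
  const history: ChatHistoryItem[] = [{ type: "system", text: systemPrompt }];
  for (const msg of messages) {
    if (msg.role === "user") {
      history.push({ type: "user", text: msg.content });
    } else {
      // Model responses are arrays because they can interleave text
      // with function calls; a plain text reply is a one-element array.
      history.push({ type: "model", response: [msg.content] });
    }
  }
  return history;
}

// With a real session you would then restore the conversation via:
//   session.setChatHistory(buildChatHistory(systemPrompt, priorMessages));
```

The array-valued "response" field is also where the function-calling wrinkle mentioned above lives: once tool calls are interleaved with text, the entries are no longer plain strings.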

Lastly, please don't open issues here if you have questions about the source code. Instead, please use another channel -- for quicker communication, feel free to join the Zettlr Discord, where you can reach me more quickly: https://go.zettlr.com/discord