withcatai / catai

Run AI ✨ assistant locally! with simple API for Node.js 🚀
https://withcatai.github.io/catai/
MIT License
450 stars · 29 forks

add systemPrompt in Model Settings #64

Closed scenaristeur closed 9 months ago

scenaristeur commented 9 months ago

fix #63

scenaristeur commented 9 months ago

As you can see in the demo picture of the chat, there is a surprising behaviour: after the line "I be a pirate..." the model adds ": what are you doing?". I think it is due to the model I used (Dolphin), which may not follow the same chat wrapper — or is it something else?

ido-pluto commented 9 months ago

It is not only the model's fault. In the next beta of @catai/node-llama-cpp there will be some optimizations to the model wrapper, so that models better understand the wrapper's syntax through the use of Unicode characters reserved for this purpose.

You can also try a model with more parameters; it may follow the wrapper syntax better.
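To make the discussion above concrete, here is a minimal sketch of how a chat wrapper with a systemPrompt might assemble the final prompt. This is an illustration only, not CatAI's or node-llama-cpp's actual implementation: the function name, the token strings, and the history shape are all hypothetical. It shows why a model that was not trained on a wrapper's delimiter tokens can "leak" wrapper-like text (such as a stray ": what are you doing?") into its answer.

```javascript
// Hypothetical chat-wrapper sketch (NOT CatAI's real wrapper).
// The systemPrompt is prepended once; each turn is framed by marker tokens.
// A model unfamiliar with these markers may echo or imitate them in its output.
const USER_TOKEN = "<|user|>";
const ASSISTANT_TOKEN = "<|assistant|>";

function buildPrompt(systemPrompt, history, userMessage) {
    // Replay prior turns with explicit role markers.
    const turns = history
        .map(({ user, assistant }) => `${USER_TOKEN}${user}\n${ASSISTANT_TOKEN}${assistant}`)
        .join("\n");
    // End with an open assistant marker so the model continues from there.
    return [systemPrompt, turns, `${USER_TOKEN}${userMessage}\n${ASSISTANT_TOKEN}`]
        .filter(Boolean) // drop empty segments (e.g. no history yet)
        .join("\n");
}

console.log(buildPrompt("Talk like a pirate.", [], "Hello!"));
// → Talk like a pirate.
// → <|user|>Hello!
// → <|assistant|>
```

Reserving dedicated Unicode (or tokenizer-level special tokens) for the markers, as mentioned above, reduces the chance the model confuses wrapper syntax with ordinary conversation text.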

ido-pluto commented 9 months ago

Please change your pull request according to my comments, and I will merge it right away :)

scenaristeur commented 9 months ago

Which comments? What should I change?

ido-pluto commented 9 months ago

@scenaristeur sorry for the delay, I noticed I did not publish the review

github-actions[bot] commented 9 months ago

:tada: This PR is included in version 3.1.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket: