withcatai / node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema on the model's output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

feat(model): support emoji #1

Closed · ido-pluto closed this pull request 1 year ago

ido-pluto commented 1 year ago

Description of change

Enable the model to output emoji with the default chat session
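A likely reason emoji need special handling (an illustrative sketch, not the actual PR code): an emoji's UTF-8 bytes can be split across multiple model tokens, so decoding each token's bytes independently produces U+FFFD replacement characters. A `TextDecoder` in streaming mode buffers the incomplete sequence until it completes:

```typescript
// "😀" is 4 bytes in UTF-8: F0 9F 98 80
const emojiBytes = new TextEncoder().encode("😀");

// Hypothetical scenario: two successive tokens each carry half the bytes.
const tokenChunks = [emojiBytes.slice(0, 2), emojiBytes.slice(2)];

// Naive approach: decode each chunk on its own.
// Both halves are invalid UTF-8, so the result is garbled.
const naive = tokenChunks
    .map((chunk) => new TextDecoder().decode(chunk))
    .join("");

// Streaming approach: one decoder buffers incomplete sequences
// across chunks and emits the emoji once all 4 bytes arrive.
const decoder = new TextDecoder();
const streamed = tokenChunks
    .map((chunk) => decoder.decode(chunk, {stream: true}))
    .join("");

console.log(JSON.stringify(naive));    // replacement characters
console.log(JSON.stringify(streamed)); // "😀"
```

The same buffering idea applies to any multi-byte character, not just emoji.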

Pull-Request Checklist

github-actions[bot] commented 1 year ago

🎉 This PR is included in version 1.2.0 🎉

The release is available on:

Your semantic-release bot 📦🚀