withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

fix: Llama3_1ChatWrapper types #350

Closed by vlamanna 1 week ago

vlamanna commented 1 week ago

Description of change

Fix the type mismatch in the Llama3_1ChatWrapper constructor.

Fixes #349


giladgd commented 1 week ago

Thanks for the PR!

The _specialTokensTextForPreamble parameter is internal and should not be exposed. I fixed this issue by hiding the _specialTokensTextForPreamble option from the function signature in the transpiled types in #351. I'll release it in the next few days.
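One common way to hide an internal option from transpiled types, which may be what the fix in #351 does, is to tag it with the `@internal` JSDoc marker and compile with `"stripInternal": true` in tsconfig.json, so the option is dropped from the emitted `.d.ts` while remaining usable inside the library. The sketch below is hypothetical and not the actual node-llama-cpp source; the class name, option type, and field are illustrative, though the `_specialTokensTextForPreamble` option name comes from this thread.

```typescript
// Hypothetical sketch of hiding an internal constructor option from the
// public type declarations. With `"stripInternal": true` in tsconfig.json,
// the `@internal`-tagged member is omitted from the emitted .d.ts, so the
// transpiled types only show the documented options.

type ExampleChatWrapperOptions = {
    cutoffDate?: Date;

    /** @internal */
    _specialTokensTextForPreamble?: boolean;
};

class ExampleChatWrapper {
    public readonly usePreambleSpecialTokens: boolean;

    public constructor({
        cutoffDate,
        _specialTokensTextForPreamble = false
    }: ExampleChatWrapperOptions = {}) {
        // Internal code can still pass the hidden flag; public callers
        // only ever see `cutoffDate` in the published type declarations.
        this.usePreambleSpecialTokens = _specialTokensTextForPreamble;
        void cutoffDate;
    }
}

const publicInstance = new ExampleChatWrapper({});
const internalInstance = new ExampleChatWrapper({_specialTokensTextForPreamble: true});
console.log(publicInstance.usePreambleSpecialTokens, internalInstance.usePreambleSpecialTokens);
```

The runtime behavior is unchanged either way; stripping the option only affects the declaration files consumers see, which avoids the type mismatch without exposing internal API surface.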