- feat: `specialTokens` parameter for `model.detokenize`
- fix: `FunctionaryChatWrapper` bugs
- fix: function calling syntax bugs
- fix: show GPU layers in the `Model` line in CLI commands
- refactor: rename `LlamaChatWrapper` to `Llama2ChatWrapper`
Fixes #204
### Pull-Request Checklist
- [x] Code is up-to-date with the `master` branch
- [x] `npm run format` to apply eslint formatting
- [x] `npm run test` passes with this change
- [x] This pull request links relevant issues as `Fixes #0000`
- [x] There are new or updated unit tests validating the change
- [ ] Documentation has been updated to reflect this change
- [x] The new commits and pull request title follow conventions explained in pull request guidelines (PRs that do not follow this convention will not be merged)
### Description of change
- feat: `--gpu` flag in generation CLI commands
- feat: `specialTokens` parameter for `model.detokenize`
- fix: `FunctionaryChatWrapper` bugs
- fix: show GPU layers in the `Model` line in CLI commands
- refactor: rename `LlamaChatWrapper` to `Llama2ChatWrapper`
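As an illustration of what a `specialTokens` parameter for detokenization typically controls, here is a minimal self-contained sketch with a mock tokenizer (the token IDs, maps, and standalone `detokenize` function are hypothetical and not the library's actual implementation): when the flag is off, special tokens such as `<s>`/`</s>` are omitted from the output text; when it is on, they are rendered in their text form.

```typescript
// Hypothetical mock illustrating specialTokens semantics; not node-llama-cpp's real API.
type Token = number;

// Assumed token tables for the sketch.
const specialTokenText = new Map<Token, string>([
    [1, "<s>"],
    [2, "</s>"]
]);
const plainTokenText = new Map<Token, string>([
    [100, "Hello"],
    [101, " world"]
]);

function detokenize(tokens: Token[], specialTokens = false): string {
    return tokens
        .map((token) => {
            if (specialTokenText.has(token))
                // Render special tokens only when explicitly requested.
                return specialTokens ? specialTokenText.get(token)! : "";

            return plainTokenText.get(token) ?? "";
        })
        .join("");
}

console.log(detokenize([1, 100, 101, 2])); // → "Hello world"
console.log(detokenize([1, 100, 101, 2], true)); // → "<s>Hello world</s>"
```

Defaulting the flag to `false` keeps the common case (plain user-visible text) unchanged while letting callers opt in to an exact round-trip of the token stream.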