Closed: giladgd closed this pull request 6 months ago
:tada: This PR is included in version 3.0.0-beta.2 :tada:
The release is available on: v3.0.0-beta.2
Your semantic-release bot :package::rocket:
:tada: This PR is included in version 3.0.0-beta.4 :tada:
The release is available on: v3.0.0-beta.4
Your semantic-release bot :package::rocket:
Description of change

- `LlamaChat`
- `LlamaText` util
- No Metal support by default on the `x64` arch: Metal on macOS isn't supported well by llama.cpp on Intel Macs ATM, so disabling it by default will be better for most Intel macOS users.

Resolves #101
Fixes #114
Related: https://github.com/langchain-ai/langchainjs/pull/3588 (`LlamaChat`)

How to use function calling
Support for function calling is still a work in progress, but for now, Functionary models are supported.
Pull-Request Checklist

- Code is up-to-date with the `master` branch
- `npm run format` to apply eslint formatting
- `npm run test` passes with this change
- This pull request links relevant issues as `Fixes #0000`