withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

feat: fall back to build from source if prebuilt binary loading fails #54

Closed: giladgd closed this pull request 11 months ago

giladgd commented 11 months ago

Description of change

Pull-Request Checklist
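The pattern this PR describes can be sketched as a small helper: attempt to load the prebuilt native binary first, and if that throws (wrong ABI, missing file, incompatible CPU), trigger a source build and retry. This is a hedged illustration of the general technique, not the actual node-llama-cpp implementation; the `loadWithFallback` name and its parameters are hypothetical.

```typescript
/**
 * Hypothetical sketch of the fallback technique named in this PR's title:
 * try the prebuilt binary, and only build from source if loading fails.
 *
 * loadPrebuilt    - loads the native addon; throws on failure
 * buildFromSource - compiles the addon locally (e.g. invokes cmake/node-gyp)
 */
function loadWithFallback<T>(
    loadPrebuilt: () => T,
    buildFromSource: () => void
): T {
    try {
        // Fast path: the shipped prebuilt binary works on this platform.
        return loadPrebuilt();
    } catch (err) {
        // Slow path: prebuilt binary failed to load, so compile locally
        // and attempt the load again.
        console.warn("Prebuilt binary failed to load, building from source…");
        buildFromSource();
        return loadPrebuilt();
    }
}
```

A caller would pass a loader that `require()`s the compiled addon and a build step that shells out to the project's build tooling; the second `loadPrebuilt()` call then picks up the freshly built artifact.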

github-actions[bot] commented 11 months ago

:tada: This PR is included in version 2.5.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket: