withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

feat: get embedding for text #123

Closed giladgd closed 7 months ago

giladgd commented 8 months ago

Related: https://github.com/langchain-ai/langchainjs/issues/3626

jlabrada71 commented 8 months ago

Hi, I created a PR for this feature. It needs to be carefully reviewed, since it's the first C++ code I've written in years and also my first time doing anything with the Node API, so there may be some nonsense in the code.

github-actions[bot] commented 7 months ago

:tada: This issue has been resolved in version 3.0.0-beta.3 :tada:

The release is available on:

Your semantic-release bot :package::rocket:
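For anyone landing here from search: a rough sketch of how the shipped embedding feature is used, and what you typically do with the resulting vectors. The library calls below (`getLlama`, `createEmbeddingContext`, `getEmbeddingFor`) reflect the v3 beta API as I understand it and may differ in later releases; the model path is a placeholder. Comparing two embeddings is usually done with cosine similarity, which is plain arithmetic and independent of the library:

```typescript
// Hedged sketch — API names are from the node-llama-cpp v3 beta and may change.
// Requires a local GGUF model file (path below is a placeholder):
//
//   import {getLlama} from "node-llama-cpp";
//
//   const llama = await getLlama();
//   const model = await llama.loadModel({modelPath: "path/to/model.gguf"});
//   const context = await model.createEmbeddingContext();
//   const embedding = await context.getEmbeddingFor("Hello world");
//   // embedding.vector is a readonly array of numbers

// Cosine similarity between two embedding vectors — pure TypeScript,
// no model needed. Returns a value in [-1, 1]; 1 means identical direction.
function cosineSimilarity(a: readonly number[], b: readonly number[]): number {
    let dot = 0;
    let normA = 0;
    let normB = 0;
    for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1, orthogonal vectors score 0:
console.log(cosineSimilarity([1, 2, 3], [1, 2, 3])); // → 1
console.log(cosineSimilarity([1, 0], [0, 1])); // → 0
```

In practice you would embed a corpus of documents once, then embed each query and rank documents by cosine similarity to find the closest matches.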