withcatai / node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level.
https://withcatai.github.io/node-llama-cpp/
MIT License

get embedding for text #128

Closed. jlabrada71 closed this pull request 7 months ago.

jlabrada71 commented 7 months ago

Description of change

Fixes #123

Implements a method that returns the embedding of a text. It is an adapted copy of embedding.cpp from llama.cpp/examples.
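
To give a sense of how such a method might be used, here is a minimal sketch against the v2-era node-llama-cpp API (`LlamaModel`, `LlamaContext`). The method name `getEmbeddingFor`, its return type, and the model path are assumptions for illustration only; the actual signature added by this PR is not shown in this thread.

```typescript
import path from "path";
import {fileURLToPath} from "url";
import {LlamaModel, LlamaContext} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load a local GGUF model (the path here is just an example)
const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "model.gguf")
});
const context = new LlamaContext({model});

// Hypothetical method added by this PR: returns the embedding vector for a text.
// The real name and return type may differ.
const embedding: number[] = await context.getEmbeddingFor("Hello world");
console.log(`Got a ${embedding.length}-dimensional embedding`);
```
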

Pull-Request Checklist

giladgd commented 7 months ago

@jlabrada71 I'm already working on support for embeddings, which will be available soon in the version 3 beta. The implementation example in the llama.cpp repo requires some more adaptations to provide stable support for this without side effects, and version 3 is a major refactor that changes the structure of the project, so I prefer to wait for my own implementation for this one.

I predict it'll be available in the version 3 beta in about 3 weeks.

Anyway, thank you for your contribution :)