ollama / ollama-js

Ollama JavaScript library
https://ollama.com
MIT License

ollama.embed is returning different embeddings compared to vanilla Ollama #170

Closed eugeniosegala closed 8 hours ago

eugeniosegala commented 10 hours ago

The following request:

import ollama from 'ollama'

const response = await ollama.embed({
  model: 'nomic-embed-text:v1.5',
  input: 'hello',
})

returns very different embeddings:

0.017930806, -0.0058585233, -0.17534214, -0.013760389, ...

Compared to vanilla Ollama:

curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text:v1.5",
  "prompt": "hello"
}'

0.42144104838371277, -0.13775105774402618, -4.1203837394714355, -0.3233441710472107, ...

Why?

Is Ollama JS applying some sort of normalisation?

hopperelec commented 10 hours ago

You're comparing the embed function to the embeddings endpoint. If you want ollama-js to give the same output as you showed from the Ollama API, use the embeddings function. For the other way round, use the embed endpoint.
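Roughly like this (a quick sketch, untested, but the request/response shapes match this library's types: embeddings takes a single prompt and returns one embedding, embed takes input and returns a list of embeddings):

import ollama from 'ollama'

// Old API, mirrors POST /api/embeddings: one prompt in, { embedding: number[] } out
const legacy = await ollama.embeddings({
  model: 'nomic-embed-text:v1.5',
  prompt: 'hello',
})
console.log(legacy.embedding.slice(0, 4))

// New API, mirrors POST /api/embed: input can be a string or string[],
// and the response is { embeddings: number[][] }
const current = await ollama.embed({
  model: 'nomic-embed-text:v1.5',
  input: 'hello',
})
console.log(current.embeddings[0].slice(0, 4))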

eugeniosegala commented 9 hours ago

Thank you @hopperelec.

I will try this.

What is the difference between the two? I had not noticed the embeddings function in the docs.

hopperelec commented 8 hours ago

I believe embeddings is the older, deprecated version, so generally you should use embed.
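That should also answer your normalisation question: as far as I can tell, the newer embed endpoint returns L2-normalised vectors, while the old embeddings endpoint returns raw values, which would explain the difference in magnitude you posted. A quick sketch to check (untested):

import ollama from 'ollama'

// Euclidean (L2) norm of a vector
const norm = (v) => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0))

const { embeddings } = await ollama.embed({
  model: 'nomic-embed-text:v1.5',
  input: 'hello',
})
const { embedding } = await ollama.embeddings({
  model: 'nomic-embed-text:v1.5',
  prompt: 'hello',
})

// If embed normalises its output, the first value should be ~1.0;
// the legacy vector's norm will generally be much larger.
console.log(norm(embeddings[0]), norm(embedding))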