langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Deprecated Ollama Embeddings work (path "@langchain/community/embeddings/ollama"), but the "@langchain/ollama" import doesn't work #6564

Closed Rafael-Julio-2242 closed 1 month ago

Rafael-Julio-2242 commented 2 months ago

Checked other resources

Example Code

I was testing embeddings for a future project I need to create, and I found that the newer version of the "OllamaEmbeddings" class doesn't quite work.

The following code:


import { OllamaEmbeddings } from "@langchain/ollama";

(async () => {
  const model = new OllamaEmbeddings({
    model: "nomic-embed-text",
    onFailedAttempt: (err: any) => {
      console.log("[Error]: ", err);
    },
  });

  const embeddedText = await model.embedQuery("test");

  console.log(embeddedText);
})();

This returns a Not Found error: the client reaches the Ollama API, but the server responds that the endpoint is not found.

The error:

ResponseError: 404 page not found
    at checkOk (/Users/user/Documents/projetos/testProject/node_modules/ollama/dist/shared/ollama.384eb0a9.cjs:72:9)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async post (/Users/user/Documents/projetos/testProject/node_modules/ollama/dist/shared/ollama.384eb0a9.cjs:119:3)
    at async Ollama.embed (/Users/user/Documents/projetos/testProject/node_modules/ollama/dist/shared/ollama.384eb0a9.cjs:392:22)
    at async RetryOperation._fn (/Users/user/Documents/projetos/testProject/node_modules/p-retry/index.js:50:12) {
  error: '404 page not found',
  status_code: 404,
  attemptNumber: 7,
  retriesLeft: 0
}
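
The stack trace points at Ollama.embed inside the ollama npm client, so the 404 seems to come back from the Ollama server itself rather than from LangChain. As a quick way to confirm that, here is a minimal sketch (assuming the ollama client's embed() API, which the trace suggests the wrapper calls under the hood) that talks to the server directly:

import { Ollama } from "ollama";

(async () => {
  // Talk to the local Ollama server directly, bypassing LangChain entirely.
  // Assumes the default server address; adjust the host if yours differs.
  const client = new Ollama({ host: "http://127.0.0.1:11434" });

  try {
    // embed() is what the stack trace above shows being called; if this also
    // fails with "404 page not found", the server itself does not expose the
    // endpoint the newer client relies on.
    const res = await client.embed({ model: "nomic-embed-text", input: "test" });
    console.log(res.embeddings[0].slice(0, 5));
  } catch (err) {
    console.log("[Direct client error]: ", err);
  }
})();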

But when I use axios or the older version, everything works just fine.

The same code, but using the older version (the code that works fine):


import { OllamaEmbeddings } from "@langchain/community/embeddings/ollama";

(async () => {
  const model = new OllamaEmbeddings({
    model: "nomic-embed-text",
    onFailedAttempt: (err: any) => {
      console.log("[Error]: ", err);
    },
  });

  const embeddedText = await model.embedQuery("test");

  console.log(embeddedText);
})();

The test I made with axios also works, and it is a simple one:

import axios from "axios";
import { OllamaEmbeddings } from "@langchain/ollama";

(async () => {
  const res = await axios.post(
    "http://localhost:11434/api/embeddings",
    {
      model: "nomic-embed-text",
      prompt: "testing",
    },
    {
      validateStatus: () => true, // Don't want axios to throw errors
    }
  );

  if (res.status !== 200 && res.status !== 201) {
    console.log("[ERROR]: ", res.data);
    return;
  }

  console.log(res.data);
})();

Error Message and Stack Trace (if applicable)

No response

Description

The docs and the code say that the "@langchain/community/embeddings/ollama" entrypoint is deprecated, but that deprecated version works while the new one does not.

System Info

Node version: v22.6.0
npm version: 10.8.2
Platform: macOS

Package.json file:

{
  "name": "test",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "dev": "ts-node src/index.ts"
  },
  "devDependencies": {
    "typescript": "^5.5.4"
  },
  "dependencies": {
    "@langchain/community": "^0.2.28",
    "@langchain/core": "^0.2.27",
    "@langchain/ollama": "^0.0.4",
    "@langchain/openai": "^0.2.7",
    "@types/node": "^22.4.0",
    "axios": "^1.7.4",
    "cheerio": "^1.0.0",
    "langchain": "^0.2.16"
  }
}
dosubot[bot] commented 2 months ago

Hey, @Rafael-Julio-2242! I'm here to help you with your bug reports, answer your questions, and guide you on how to contribute. Let's solve this issue together!

The newer version of the "OllamaEmbeddings" class from "@langchain/ollama" returns a 404 error because the base URL might be incorrectly set or the endpoint might not exist on the server. The deprecated version from "@langchain/community/embeddings/ollama" works fine because it defaults to "http://localhost:11434" and includes a fallback to "http://127.0.0.1:11434" for Node 18 compatibility [1][2].

Here is the relevant code snippet from the deprecated version that handles the base URL:

let formattedBaseUrl = baseUrl;
if (formattedBaseUrl.startsWith("http://localhost:")) {
  // Node 18 has issues with resolving "localhost"
  // See https://github.com/node-fetch/node-fetch/issues/1624
  formattedBaseUrl = formattedBaseUrl.replace(
    "http://localhost:",
    "http://127.0.0.1:"
  );
}

This ensures that the base URL is correctly formatted, which might be missing or different in the newer version, leading to the 404 error.

Additionally, the newer version includes a migration warning and re-exports everything from the deprecated version, as shown in the following code snippet:

import { logVersion010MigrationWarning } from "../util/entrypoint_deprecation.js";

/* #__PURE__ */ logVersion010MigrationWarning({
  oldEntrypointName: "embeddings/ollama",
});
export * from "@langchain/community/embeddings/ollama";

This indicates that the newer version is essentially a wrapper around the deprecated version, but the migration warning or other changes might be causing the 404 error.

Moreover, the newer version of the Ollama class in the "@langchain/ollama" package might not be correctly setting the base URL or other parameters. The constructor in the deprecated version ensures that the base URL is correctly formatted and defaults to "http://localhost:11434":

constructor(fields: OllamaInput & BaseLLMParams) {
  super(fields);
  this.model = fields.model ?? this.model;
  this.baseUrl = fields.baseUrl?.endsWith("/")
    ? fields.baseUrl.slice(0, -1)
    : fields.baseUrl ?? this.baseUrl;
  this.keepAlive = fields.keepAlive ?? this.keepAlive;
  // other initializations
}

If the newer version does not include similar logic, it could lead to incorrect base URLs and subsequent 404 errors [3].
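
One way to rule this out is to pass baseUrl explicitly when constructing the new class. Here is a minimal sketch, assuming OllamaEmbeddings from "@langchain/ollama" accepts a baseUrl option like its deprecated counterpart:

import { OllamaEmbeddings } from "@langchain/ollama";

(async () => {
  const model = new OllamaEmbeddings({
    model: "nomic-embed-text",
    // Point the client at the local server explicitly, using 127.0.0.1 to
    // avoid the Node 18 "localhost" resolution issue described above.
    baseUrl: "http://127.0.0.1:11434",
  });

  console.log(await model.embedQuery("test"));
})();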

To continue talking to Dosu, mention @dosu.

jacoblee93 commented 1 month ago

Apologies for missing this; this should now be present!

chaostracker commented 1 month ago

I still have this exact same issue with 0.1.0.

bracesproul commented 1 month ago

@chaostracker what version of ollama do you have installed? You can check via:

ollama -v

I have version 0.3.11 installed, and it's working properly.
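
If I recall correctly, the newer @langchain/ollama package goes through the ollama npm client's embed() call, which posts to the /api/embed endpoint, while the deprecated community class posts to /api/embeddings. Older servers that predate /api/embed will answer 404 to the new package but still serve the old one, which would match what you're seeing. A quick sketch (assuming a default local server on port 11434) to check which endpoints your server exposes:

(async () => {
  // Probe both embedding endpoints and report their HTTP status codes.
  // Assumes a default Ollama install listening on http://127.0.0.1:11434.
  for (const path of ["/api/embeddings", "/api/embed"]) {
    const res = await fetch(`http://127.0.0.1:11434${path}`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "nomic-embed-text",
        prompt: "test", // read by /api/embeddings
        input: "test", // read by /api/embed
      }),
    });
    console.log(path, "->", res.status);
  }
})();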

chaostracker commented 1 month ago

It was ollama version 0.2.1. I upgraded to 0.3.11 and now it works as expected, so thanks.