langchain-ai / langchainjs


OpenAIEmbeddings configuration of baseURL is not possible #3086

Closed truedat101 closed 10 months ago

truedat101 commented 10 months ago

Reviewing the source code and investigating the API docs: https://js.langchain.com/docs/api/embeddings_openai/classes/OpenAIEmbeddings

and the source: https://github.com/langchain-ai/langchainjs/blob/0.0.175/langchain/src/embeddings/openai.ts

for OpenAIEmbeddings in JS, I was trying to understand how to provide my own baseURL instead of using the default openai.com endpoint. This doesn't seem to be possible unless you are using Azure services, but there are enough ambiguities in the docs that I wasn't willing to give up.

I was confused about why I can't set my own baseURL / apiUrl directly (in case I am not using openai.com), so I asked the docs chatbot:

Response was:


const { OpenAIEmbeddings } = require('langchain');

β€’ Create an instance of the OpenAIEmbeddings class, passing the necessary configuration options:

const embeddings = new OpenAIEmbeddings({
  apiKey: 'your-api-key',
  apiUrl: 'https://your-api-url',
  apiVersion: 'your-api-version',
  organizationId: 'your-organization-id',
  proxy: 'http://your-proxy-server',
  timeout: 5000,
});

If I run the above code, it doesn't do anything. It still calls api.openai.com. If I provide { configuration: { some config } } I can provide an API key, but anything I put in there related to baseURL doesn't work. To sort out the confusion, I looked through the source and saw that it might be possible if one can specify the OpenAI client, but I cannot figure out how to pass in a client object (from the OpenAI module) without modifying the langchain source.

Seeking clarification on how to set the baseURL / apiUrl or the OpenAI client for OpenAIEmbeddings in JS.

dosubot[bot] commented 10 months ago

πŸ€–

Hello @truedat101,

I'm here to assist you with your question about configuring the baseURL for the OpenAIEmbeddings in the langchainjs framework.

Based on the similar issues I found in the langchainjs repository, it seems that you can set the baseURL by passing it in the configuration object when creating a new instance of the OpenAIEmbeddings model. Here's an example:

import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const embeddings = new OpenAIEmbeddings({
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  batchSize: 512, // Default value if omitted is 512. Max is 2048
  configuration: {
    baseURL: "https://your_custom_url.com",
  },
});

In this code, the baseURL is set to "https://your_custom_url.com". You can replace this with your own custom URL. The model will then use this URL for all API requests. If you're part of an organization, you can set process.env.OPENAI_ORGANIZATION to your OpenAI organization id, or pass it in as organization when initializing the model.
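For illustration, here is a minimal sketch combining the custom baseURL with the organization option (the key, URL, and organization id below are placeholders, not values from this issue):

import { OpenAIEmbeddings } from "langchain/embeddings/openai";

// Sketch only: the key, URL, and organization id are placeholders.
const embeddings = new OpenAIEmbeddings({
  openAIApiKey: "YOUR-API-KEY",
  configuration: {
    baseURL: "https://your_custom_url.com/v1", // custom endpoint instead of api.openai.com
    organization: "org-your-organization-id",  // or set process.env.OPENAI_ORGANIZATION instead
  },
});

// Quick usage check: embed a string and inspect the resulting vector.
const vector = await embeddings.embedQuery("hello world");
console.log(vector.length);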

You can find more information about this in the following sources:

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Best, Dosu

Sources


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a πŸ‘ if this is helpful and πŸ‘Ž if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

jacoblee93 commented 10 months ago

Bot response is correct, closing for now, although we should document this better.

truedat101 commented 10 months ago

Hmm, so apiUrl is correct to use and so is baseURL? I will retest and verify whether this is indeed working.
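
For my own retest, this is roughly the check I have in mind (a sketch, assuming a local OpenAI-compatible server is listening at the hypothetical http://localhost:8000/v1):

import { OpenAIEmbeddings } from "langchain/embeddings/openai";

// If this request shows up on the local server instead of api.openai.com,
// the baseURL override passed via `configuration` is taking effect.
const embeddings = new OpenAIEmbeddings({
  openAIApiKey: "not-a-real-key", // the local server in this sketch doesn't validate keys
  configuration: {
    baseURL: "http://localhost:8000/v1",
  },
});

const vector = await embeddings.embedQuery("hello baseURL");
console.log(vector.length);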