langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Connection error when using langchain within VS Code Extension (Host) #6806

Open jeanibarz opened 2 months ago

jeanibarz commented 2 months ago

Checked other resources

Example Code

// This is not easily reproducible because it has to run inside a VS Code extension to fail...
// I don't know enough about creating VS Code extensions to build a minimal working example,
// but this is the implementation I use to generate a completion.

// Retrieve proxy settings
const httpConfig = vscode.workspace.getConfiguration('http');
const proxy = httpConfig.get('proxy');

logger.info('Retrieved proxy settings:', proxy);

if (proxy) {
    logger.info('Proxy detected. Configuring Axios to use the proxy.');

    const agent = new HttpsProxyAgent(proxy);
    axios.defaults.proxy = false; // Disable default Axios proxy behavior
    axios.defaults.httpsAgent = agent;

    logger.info('Axios proxy configuration complete.');
} else {
    logger.info('No proxy detected. Proceeding without proxy configuration.');
}

try {
    model = new ChatOpenAI({
        modelName: 'gpt-4o-mini', // same issue with gpt-4o, same issue also with GroqChat
        maxRetries: 1,
    });

    try {
        const response = await model.invoke(
            [
                [
                    "system",
                    "You are a helpful assistant that translates English to French. Translate the user sentence.",
                ],
                [
                    "human",
                    "I love programming.",
                ],
            ]
        );
        logger.info(`LLM response: ${response}`);
    } catch (error) {
        logger.error(`Error message: ${error.message}`);
        logger.error(`Error stack trace: ${error.stack}`);
    }
} catch (error) {
    logger.info(`chatgpt.model: ${provider.modelManager.model} response: ${error}`);
    throw error;
}

Error Message and Stack Trace (if applicable)

2024-09-15T09:55:11.354Z - INFO Retrieved proxy settings:
2024-09-15T09:55:11.354Z - INFO No proxy detected. Proceeding without proxy configuration.
2024-09-15T09:55:13.101Z - ERROR Error message: Connection error.
2024-09-15T09:55:13.101Z - ERROR Error stack trace: Error: Connection error.
    at OpenAI.makeRequest (/home/jean/git/chatgpt-copilot/out/Extension.js:56648:13)
    at async /home/jean/git/chatgpt-copilot/out/Extension.js:71714:21
    at async RetryOperation._fn (/home/jean/git/chatgpt-copilot/out/Extension.js:22167:19)

Description

I'm trying to use LangChain within a VS Code extension I'm working on. I can get a valid response from OpenAI when using the @ai-sdk library, but not when using LangChain. I have tried several things: a minimal example, streaming, and non-streaming. I checked many times that my environment variables are set up correctly, and I tried passing the "model" or "modelName" arguments plus an apiKey, but nothing works. When I send the request, it hangs for some time and then I get the connection error shown above.

When I use OpenAI through ai-sdk I don't have the issue. When I use LangChain outside my VS Code extension I don't have the issue either. Also, the call to OpenAI is successfully logged in my LangSmith account, and I can run the request manually from there without any problem, so I'm wondering whether this is a proxy issue or something else entirely.
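
For reference, this is roughly how I would expect to hand the proxy agent directly to ChatOpenAI instead of going through Axios defaults (LangChain's OpenAI integration does not use Axios, so the axios.defaults settings above should not affect it). I have not confirmed that forwarding httpAgent through the configuration option changes anything inside the extension host; it is just a sketch:

import * as vscode from 'vscode';
import { HttpsProxyAgent } from 'https-proxy-agent';
import { ChatOpenAI } from '@langchain/openai';

// Sketch: pass the VS Code proxy to the underlying OpenAI client options
// instead of configuring Axios, which LangChain does not use.
const proxy = vscode.workspace.getConfiguration('http').get<string>('proxy');

const model = new ChatOpenAI({
    model: 'gpt-4o-mini',
    maxRetries: 1,
    configuration: proxy
        ? { httpAgent: new HttpsProxyAgent(proxy) } // assumes the openai client accepts httpAgent here
        : undefined,
});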

System Info

"yarn info langchain": yarn_info.txt platform (windows / linux / mac): WSL2 on windows Node version: v18.20.4 yarn version: v1.22.22

dosubot[bot] commented 2 months ago

Hey @jeanibarz! I'm here to help you out with any bugs, questions, or contributions you have in mind. Let's squash those bugs together! 🤖

I found some similar issues and discussions that might help you resolve the connection error when using LangChain within a VS Code extension.

  1. Similar Issue: Issues when trying to use in a VisualStudio Code Extension

    • Status: Closed
    • Solution:
      1. Copy two files vocab.bpe and encoder.json from node_modules/gpt3-encoder to out.
      2. Add node-fetch to dependencies and follow these instructions [1].
  2. Related Issue: VSCODE extensions program development, using LangChain will report an error, it is recommended that LangChain TS does not directly use the API provided in VSCODE, it may not work.

    • Status: Closed
    • Solution:

      1. Introduce the node-fetch dependency:
        npm install --save node-fetch @types/node-fetch
      2. Import the following package in axios-fetch-adapter.js:
        import fetch, { Headers, Request } from 'node-fetch';
      3. Adjust the event-source-parse.ts method:

        async function getBytes(stream, onChunk) {
            stream.on('data', (chunk) => {
                onChunk(chunk);
            });

            stream.on('end', () => {
                onChunk(new Uint8Array(), true);
            });
        }
      4. Add the following to src/extension.ts to fix "ReadableStream is not defined":
        
        import { ReadableStream } from 'web-streams-polyfill';
        global.ReadableStream = ReadableStream;
      (See [2]: https://github.com/langchain-ai/langchainjs/issues/1884)

These solutions should help you address the connection error when using LangChain within a VS Code extension. If the issue persists, you might want to ensure that your proxy settings are correctly configured and that there are no network restrictions affecting the connection.
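
Putting those suggestions together, here is a rough sketch of what the setup could look like in a VS Code extension's activate function. Treat it as an illustration only: the node-fetch and web-streams-polyfill packages are the ones mentioned above, but whether patching the globals this way is enough for your bundler and extension host is an assumption you would need to verify:

import * as vscode from 'vscode';
import fetch, { Headers, Request, Response } from 'node-fetch';
import { ReadableStream } from 'web-streams-polyfill';

export function activate(context: vscode.ExtensionContext) {
    // Sketch: expose fetch-compatible globals before any LangChain / OpenAI
    // client code runs inside the extension host.
    const g = globalThis as any;
    g.fetch = g.fetch ?? fetch;
    g.Headers = g.Headers ?? Headers;
    g.Request = g.Request ?? Request;
    g.Response = g.Response ?? Response;
    g.ReadableStream = g.ReadableStream ?? ReadableStream;

    // ...register the commands that create and invoke the ChatOpenAI model here...
}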

To continue talking to Dosu, mention @dosu.