jeanibarz opened this issue 2 months ago
Hey @jeanibarz! I'm here to help you out with any bugs, questions, or contributions you have in mind. Let's squash those bugs together! 🤖
I found some similar issues and discussions that might help you resolve the connection error when using LangChain within a VS Code extension.
Similar Issue: Issues when trying to use in a VisualStudio Code Extension

Solution: Copy `vocab.bpe` and `encoder.json` from `node_modules/gpt3-encoder` to `out`, add `node-fetch` to dependencies, and follow these instructions [1].

Solution:

Install the `node-fetch` type definitions:

```
npm install --save @types/node-fetch
```

Update the imports in `axios-fetch-adapter.js`:

```
import fetch, { Headers, Request } from 'node-fetch';
```

Adjust the `getBytes` method in `event-source-parse.ts`:
```
async function getBytes(stream, onChunk) {
  stream.on('data', (chunk) => {
    onChunk(chunk);
  });
  stream.on('end', () => {
    onChunk(new Uint8Array(), true);
  });
}
```
Add the following to `src/extension.ts` to fix `ReadableStream is not defined`:

```
import { ReadableStream } from 'web-streams-polyfill';
global.ReadableStream = ReadableStream;
```

<sup>[[2]](https://github.com/langchain-ai/langchainjs/issues/1884)</sup>
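On newer Node runtimes (such as the v18 reported below) a global `ReadableStream` may already exist, so a guarded variant of the step above avoids clobbering it. This sketch assumes Node's built-in `stream/web` as a stand-in for `web-streams-polyfill`:

```javascript
// Guarded ReadableStream polyfill: only install one if the global is missing.
// In a bundled VS Code extension you would require('web-streams-polyfill')
// here instead; 'stream/web' is Node's built-in equivalent (Node >= 16.5).
if (typeof globalThis.ReadableStream === 'undefined') {
  const { ReadableStream } = require('stream/web');
  globalThis.ReadableStream = ReadableStream;
}
console.log(typeof globalThis.ReadableStream); // 'function'
```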
These solutions should help you address the connection error when using LangChain within a VS Code extension. If the issue persists, you might want to ensure that your proxy settings are correctly configured and that there are no network restrictions affecting the connection.
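If the error persists, one quick sanity check is to dump the conventional proxy environment variables as seen by the extension host process. This is generic Node diagnostics, not a langchain API:

```javascript
// Print the proxy-related environment variables the Node process sees.
// An unexpectedly set HTTPS_PROXY here can explain hung OpenAI requests
// even when the extension itself reports "No proxy detected".
const proxyVars = [
  'HTTP_PROXY', 'HTTPS_PROXY', 'NO_PROXY',
  'http_proxy', 'https_proxy', 'no_proxy',
];
for (const name of proxyVars) {
  console.log(`${name}=${process.env[name] ?? '(unset)'}`);
}
```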
To continue talking to Dosu, mention @dosu.
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
```
2024-09-15T09:55:11.354Z - INFO Retrieved proxy settings:
2024-09-15T09:55:11.354Z - INFO No proxy detected. Proceeding without proxy configuration.
2024-09-15T09:55:13.101Z - ERROR Error message: Connection error.
2024-09-15T09:55:13.101Z - ERROR Error stack trace: Error: Connection error.
    at OpenAI.makeRequest (/home/jean/git/chatgpt-copilot/out/Extension.js:56648:13)
    at async /home/jean/git/chatgpt-copilot/out/Extension.js:71714:21
    at async RetryOperation._fn (/home/jean/git/chatgpt-copilot/out/Extension.js:22167:19)
```
Description
I'm trying to use `langchain` within a VS Code extension I'm working on. I can get a valid response from OpenAI when using the @ai-sdk library, but not when using langchain. I tried several things: a minimal example, streaming, non-streaming. I checked many times that my environment variables were set up correctly, and tried passing the "model" or "modelName" argument plus the apiKey; nothing works. When I send the request, it hangs for some time and then I get a connection error. When I use OpenAI via ai-sdk I don't have the issue. When I use langchain outside my VS Code extension I don't have the issue either. Also, the call to OpenAI is successfully logged in my Langsmith account, and I can re-run the request manually from there successfully, so I'm wondering whether it's a proxy issue or something else.
System Info
"yarn info langchain": yarn_info.txt
platform (windows / linux / mac): WSL2 on Windows
Node version: v18.20.4
yarn version: v1.22.22