langchain-ai / open-canvas

📃 A better UX for chat, writing content, and coding with LLMs.
https://opencanvas.langchain.com/
MIT License
2.72k stars · 399 forks

TypeError: Failed to parse URL from /assistants #66

Closed shelltea closed 1 month ago

shelltea commented 1 month ago
  1. Add a new .env file with the following content:
# LangSmith tracing
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=lsv2_xxx

# LLM API keys
# Anthropic used for reflection
ANTHROPIC_API_KEY=sk-ant-xxxx
# OpenAI used for content generation
OPENAI_API_KEY=sk-xxxx

# Vercel KV stores. Used for system prompt storage.
KV_REST_API_URL=
KV_REST_API_TOKEN=

# LangGraph Deployment, or local development server via LangGraph Studio.
LANGGRAPH_API_URL=
  2. Execute the following commands:
yarn install
yarn dev
  3. Open http://localhost:3000/; the console reports the following error:
Error in proxy
TypeError: Failed to parse URL from /assistants
    at node:internal/deps/undici/undici:11754:11
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  [cause]: TypeError: Invalid URL
      at new URL (node:internal/url:783:36)
      at new Request (node:internal/deps/undici/undici:5287:25)
      at fetch (node:internal/deps/undici/undici:9533:25)
      at Object.fetch (node:internal/deps/undici/undici:11753:10)
      at fetch (node:internal/process/pre_execution:315:27)
      at N (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:52416)
      at D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:54620
      at D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\trace\tracer.js:140:36
      at NoopContextManager.with (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:7062)
      at ContextAPI.with (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:518)
      at NoopTracer.startActiveSpan (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:18093)
      at ProxyTracer.startActiveSpan (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:18854)
      at D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\trace\tracer.js:122:103
      at NoopContextManager.with (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:7062)
      at ContextAPI.with (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\@opentelemetry\api\index.js:1:518)
      at NextTracerImpl.trace (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\trace\tracer.js:122:28)
      at n (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:48532)
      at handleRequest (webpack-internal:///(rsc)/./src/app/api/[..._path]/route.ts:42:27)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
      at async D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:55038
      at async ek.execute (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:45808)
      at async ek.handle (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\compiled\next-server\app-route.runtime.dev.js:6:56292)
      at async doRender (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:1357:42)
      at async cacheEntry.responseCache.get.routeKind (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:1567:40)
      at async DevServer.renderToResponseWithComponentsImpl (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:1487:28)
      at async DevServer.renderPageComponent (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:1911:24)
      at async DevServer.renderToResponseImpl (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:1949:32)
      at async DevServer.pipeImpl (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:916:25)
      at async NextNodeServer.handleCatchallRenderRequest (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\next-server.js:272:17)
      at async DevServer.handleRequestImpl (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\base-server.js:812:17)
      at async D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\dev\next-dev-server.js:339:20
      at async Span.traceAsyncFn (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\trace\trace.js:154:20)
      at async DevServer.handleRequest (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\dev\next-dev-server.js:336:24)
      at async invokeRender (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\router-server.js:173:21)
      at async handleRequest (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\router-server.js:350:24)
      at async requestHandlerImpl (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\router-server.js:374:13)
      at async Server.requestListener (D:\Development\Projects\GitHub\langchain-ai\open-canvas\node_modules\next\dist\server\lib\start-server.js:141:13) {
    code: 'ERR_INVALID_URL',
    input: '/assistants'
  }
}

END ERROR

Error in proxy
TypeError: Failed to parse URL from /threads
    [stack trace identical to the /assistants error above] {
    code: 'ERR_INVALID_URL',
    input: '/threads'
  }
}

END ERROR

I am using a Windows computer for local deployment. Is the error due to configuration?
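For context, here is a minimal sketch of why this happens (illustrative code, not the project's actual proxy route — only the route file name comes from the trace): the proxy joins LANGGRAPH_API_URL with the request path, and Node's built-in fetch (undici) only accepts absolute URLs, so with LANGGRAPH_API_URL left empty the request becomes a bare /assistants and throws ERR_INVALID_URL.

```typescript
// Node's built-in fetch (undici) rejects relative URLs, which is why an
// empty LANGGRAPH_API_URL surfaces as "Failed to parse URL from /assistants".
function buildTargetUrl(base: string | undefined, path: string): string {
  if (!base) {
    // This is the failure mode in the logs above: no base, no absolute URL.
    throw new TypeError(`Failed to parse URL from ${path}`);
  }
  // new URL(path, base) resolves the path against an absolute base.
  return new URL(path, base).toString();
}

// With LANGGRAPH_API_URL set (port here is just an example), the proxy
// can build a real target:
// buildTargetUrl("http://localhost:8123", "/assistants")
//   -> "http://localhost:8123/assistants"
```

So the empty LANGGRAPH_API_URL= line in the .env above is the direct trigger; the question is what valid value to point it at, which the replies below address.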

snifhex commented 1 month ago

To run this project, you'll need LangGraph Studio. There are three ways to do this:

  • Run LangGraph Studio locally (available only on macOS).
  • Use a LangSmith Plus plan (the free plan won't work, but if you already have a Plus plan, you can use that).
  • Create your own API using a framework like FastAPI, Django, or similar. (I haven't tried this, so I'm not sure whether it will actually work.)

shelltea commented 1 month ago

To run this project, you'll need LangGraph Studio. There are three ways to do this:

  • Run LangGraph Studio locally (available only on macOS).
  • Use a LangSmith plus plan (the free plan won’t work, but if you already have a plus plan, you can use that).
  • Create your own API using a framework like FastAPI, Django, or similar. (I haven't tried this, so I'm not sure whether it will actually work.)
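The third option could be sketched like this (entirely hypothetical and untested, matching the caveat above — the real LangGraph API surface is much larger, and the port and handler names are assumptions):

```typescript
import http from "node:http";

// Hypothetical stub for option 3: answer the two paths the Open Canvas
// proxy was seen requesting in the error logs (/assistants and /threads).
// The real LangGraph API has many more endpoints; this only shows the idea.
function routeStub(path: string): { status: number; body: string } {
  if (path.startsWith("/assistants") || path.startsWith("/threads")) {
    return { status: 200, body: JSON.stringify([]) };
  }
  return { status: 404, body: JSON.stringify({ error: "not found" }) };
}

const server = http.createServer((req, res) => {
  const { status, body } = routeStub(req.url ?? "/");
  res.writeHead(status, { "Content-Type": "application/json" });
  res.end(body);
});

// server.listen(8123);
// Then set LANGGRAPH_API_URL=http://localhost:8123 in .env
```

Whether the frontend works against such a stub depends on how closely it mimics the real server's responses, which is exactly the uncertainty noted above.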

Thank you for your support. I believe the issue is related to LangGraph. I am using a Windows computer and do not have a LangSmith plus plan.

bedney commented 1 month ago

@snifhex - Thanks for your comment here. I'm still coming up to speed on all of the LangChain/LangGraph tooling. Is there a reason why LangGraph Studio itself is necessary? From what I read here, it looks like it's an "IDE" of sorts for LangGraph. Why doesn't it just require LangGraph itself (which, I'm thinking, is the "runtime")?

Am I completely off base here?

Cheers!

bracesproul commented 1 month ago

@bedney LangGraph Studio is a desktop app, but it spins up a full API server & database you can connect to on localhost to interact with your app. So yes, it acts as an IDE, but it also exposes a server for interacting with your app programmatically, the same way LangGraph Cloud exposes a full API server (Cloud being intended for production and connections from outside your local machine, and Studio being for local use).

bracesproul commented 1 month ago

@shelltea for now, the only way to run this is either by using LangGraph Studio on Mac, or upgrading your LangSmith plan to get access to Cloud. We are however working on making it easier to run on Windows/Linux locally, but this won't be ready for a couple weeks.

bedney commented 1 month ago

> @bedney LangGraph studio is a desktop app, but the app will spin up a full API server & database you can connect to on localhost to interact with your app. […]

Brace -

Thanks for your quick answer!

Is there an intent at some point to offer (let's call it) a "LangGraph Server" separate from the IDE, so that we can run it on our own server in our own cloud without running the IDE there, which would be useless in a production environment?

Sorry, I won't hijack this bug anymore with basic questions ;-).

bracesproul commented 1 month ago

@bedney this already exists via LangGraph Cloud! (As for running on your own servers, we offer self-hosting in your own cloud for enterprise customers.)

shelltea commented 1 month ago

Thanks for the update, @bracesproul! I'll wait for the Windows version and close this issue for now. Looking forward to giving it a try in a few weeks!