Hi, I know the thread is closed, but I wanted to share the approach I successfully used on Windows. I referenced the two links below.
Here are the steps:
Install the langgraph CLI:

```
pip install -U langgraph-cli
```
Use the langgraph CLI to create a Dockerfile:

```
langgraph dockerfile -c langgraph.json Dockerfile
```

In the directory containing `langgraph.json`, a `Dockerfile` will be created with the following content:
```dockerfile
FROM langchain/langgraphjs-api:20
ADD . /deps/open-canvas
RUN cd /deps/open-canvas && yarn install --frozen-lockfile
ENV LANGSERVE_GRAPHS='{"agent": "./src/agent/open-canvas/index.ts:graph", "reflection": "./src/agent/reflection/index.ts:graph", "thread_title": "./src/agent/thread-title/index.ts:graph"}'
WORKDIR /deps/open-canvas
```
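For reference, the `langgraph.json` that drives this looks roughly like the sketch below. This is my reconstruction from the `LANGSERVE_GRAPHS` value above, not a copy of the file that ships with open-canvas, so check the actual file in the repo:

```json
{
  "node_version": "20",
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent/open-canvas/index.ts:graph",
    "reflection": "./src/agent/reflection/index.ts:graph",
    "thread_title": "./src/agent/thread-title/index.ts:graph"
  },
  "env": ".env"
}
```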
Build the Docker image with the command below:

```
docker build -t open-canvas .
```
Run the Docker container with the command below (the backticks are PowerShell line continuations):

```powershell
docker run `
  --env-file .env `
  -p 57318:8000 `
  -e REDIS_URI="redis://host.docker.internal:6379/0" `
  -e DATABASE_URI="postgresql://myuser:mypassword@host.docker.internal:5432/mydatabase" `
  -e LANGSMITH_API_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxx" `
  open-canvas
```
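The `REDIS_URI` and `DATABASE_URI` above assume Redis and Postgres are already running on the host. If you don't have them yet, something like the following starts two throwaway containers whose ports and credentials match those URIs (the container names and image tags are just my choices, nothing open-canvas requires):

```powershell
# Redis on the host's port 6379 (matches redis://host.docker.internal:6379/0)
docker run -d --name oc-redis -p 6379:6379 redis:7

# Postgres on the host's port 5432 with the user/password/database from DATABASE_URI
docker run -d --name oc-postgres -p 5432:5432 `
  -e POSTGRES_USER=myuser `
  -e POSTGRES_PASSWORD=mypassword `
  -e POSTGRES_DB=mydatabase `
  postgres:16
```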
Note:

- `57318` is the port number, which must match the one in `constants.ts`.
- `host.docker.internal` is used to access the host from within the Docker container.
- `LANGSMITH_API_KEY` is required.

Additionally, I noticed there are three places where the model name is hardcoded and cannot be selected from the UI (a possible workaround is sketched after this list):

- `src/agent/open-canvas/nodes/updateArtifact.ts` and `src/agent/open-canvas/nodes/updateHighlightedText.ts` are hardcoded to use `gpt-4o`.
- `src/agent/reflection/index.ts` is hardcoded to use `claude-3-5-sonnet-20240620`.
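If you want to make those configurable without waiting for UI support, the simplest workaround is to read the model name from an environment variable. The sketch below is hypothetical (the env var names and constructor options are mine, and the real open-canvas nodes may build their models differently), but it shows the kind of change:

```typescript
// Hypothetical sketch only; adapt to how the open-canvas nodes actually construct their models.
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";

// In updateArtifact.ts / updateHighlightedText.ts, instead of a fixed "gpt-4o":
const artifactModel = new ChatOpenAI({
  model: process.env.OPEN_CANVAS_ARTIFACT_MODEL ?? "gpt-4o",
  temperature: 0,
});

// In reflection/index.ts, instead of a fixed "claude-3-5-sonnet-20240620":
const reflectionModel = new ChatAnthropic({
  model: process.env.OPEN_CANVAS_REFLECTION_MODEL ?? "claude-3-5-sonnet-20240620",
  temperature: 0,
});
```

The env vars can then be passed to the container with additional `-e` flags on the `docker run` command above.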
@lanesky Do you know why the LangSmith API key is needed if LangGraph is open source and can be run locally?
I'm working on a project where privacy is important, so we need to run everything locally, including the LLM (hopefully this is possible with open-canvas). Any insights would be greatly appreciated!
While LangGraph is indeed open source, it appears that not all components are included in the public codebase. Here's what I found:
Looking at their documentation (https://langchain-ai.github.io/langgraph/concepts/deployment_options/), they provide a Self-Hosted solution that uses Redis as a realtime message broker. However, when examining the code in their public GitHub repository (https://github.com/langchain-ai/langgraph), I noticed that the Redis-related code isn't included.
Interestingly, if you look at the Dockerfile layers of the Self-Hosted plan's Docker image `langchain/langgraphjs-api:20` (https://hub.docker.com/layers/langchain/langgraphjs-api/20/images/sha256-25ff0937c4803c271430376c66d9f4eae4e17b448efb917797431fe22220486f?context=explore), you'll see:
```dockerfile
COPY /api /api # buildkit
COPY /storage /storage # buildkit
```
I think these parts are licensed, closed-source code.
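If you want to check this yourself, pulling the image and listing its layer history shows those COPY steps; these are just standard Docker commands, nothing LangGraph-specific:

```powershell
docker pull langchain/langgraphjs-api:20
# Each layer's creating command is printed; the /api and /storage COPY layers
# don't correspond to anything in the public langgraph repository.
docker history --no-trunc langchain/langgraphjs-api:20
```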
@Julsgaard I'm also trying to find a way to run this locally to preserve privacy. If you find a way, please share. Thanks!
We're working on adding better support for running LangGraph Cloud apps on Windows/Linux in the coming weeks! I'll post an update in this repo and update the README once that's ready.