langchain-ai / open-canvas

📃 A better UX for chat, writing content, and coding with LLMs.
https://opencanvas.langchain.com/
MIT License

Is there any way to run this on windows? #114

Closed · inweb-pro closed 3 weeks ago

bracesproul commented 3 weeks ago

We're working on adding better support for running LangGraph Cloud apps on Windows/Linux in the coming weeks! I'll post an update in this repo and update the README once that's ready.

lanesky commented 1 week ago

Hi, I know the thread is closed, but I wanted to share the approach I successfully used on Windows; I pieced it together from two external references.

Here are the steps:

  1. Install the langgraph CLI:

    pip install -U langgraph-cli
  2. Use the langgraph CLI to create a Dockerfile:

    langgraph dockerfile -c langgraph.json Dockerfile

     In the directory containing langgraph.json, a Dockerfile will be created with the following content (a rough sketch of the langgraph.json the CLI reads is shown after this list):

    FROM langchain/langgraphjs-api:20
    
    ADD . /deps/open-canvas
    
    RUN cd /deps/open-canvas && yarn install --frozen-lockfile
    
    ENV LANGSERVE_GRAPHS='{"agent": "./src/agent/open-canvas/index.ts:graph", "reflection": "./src/agent/reflection/index.ts:graph", "thread_title": "./src/agent/thread-title/index.ts:graph"}'
    
    WORKDIR /deps/open-canvas
  3. Build the Docker image with the command below:

    docker build -t open-canvas .
  4. Run the Docker container with the command below:

    docker run `
       --env-file .env `
       -p 57318:8000 `
       -e REDIS_URI="redis://host.docker.internal:6379/0" `
       -e DATABASE_URI="postgresql://myuser:mypassword@host.docker.internal:5432/mydatabase" `
       -e LANGSMITH_API_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxx" `
       open-canvas

    Note:

    • 57318 is the host port published by -p 57318:8000; it must match the port configured in constants.ts.
    • host.docker.internal is used to access the host from within the Docker container.
    • Redis can be installed in WSL, or you can run Redis and Postgres as containers on the host (see the sketch after this list).
    • A LANGSMITH_API_KEY is required.
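
For reference, the langgraph.json that the CLI reads should look roughly like the sketch below. This is inferred from the LANGSERVE_GRAPHS line in the generated Dockerfile, so treat it as an approximation; the actual file in the repo may contain additional fields.

    {
      "node_version": "20",
      "dependencies": ["."],
      "graphs": {
        "agent": "./src/agent/open-canvas/index.ts:graph",
        "reflection": "./src/agent/reflection/index.ts:graph",
        "thread_title": "./src/agent/thread-title/index.ts:graph"
      },
      "env": ".env"
    }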
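
If you'd rather not install Redis in WSL, you can run Redis and Postgres as containers on the host instead. The commands below are only a rough sketch: the container names, image tags, and credentials are placeholders I picked, and they must line up with the REDIS_URI and DATABASE_URI passed to the open-canvas container.

    # Redis, published on the host's default port 6379
    docker run -d --name oc-redis -p 6379:6379 redis:7

    # Postgres, with credentials matching the DATABASE_URI example above
    docker run -d --name oc-postgres `
       -e POSTGRES_USER=myuser `
       -e POSTGRES_PASSWORD=mypassword `
       -e POSTGRES_DB=mydatabase `
       -p 5432:5432 `
       postgres:16

Because both ports are published on the host, the host.docker.internal URIs from step 4 will resolve to these containers.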

Additionally, I noticed there are three places where the model name is hardcoded and cannot be selected from the UI.

Julsgaard commented 1 week ago

@lanesky Do you know why the LangSmith API key is needed if LangGraph is open source and can be run locally?

I'm working on a project where privacy is important, so we need to run everything locally, including the LLM (hopefully this is possible with open-canvas). Any insights would be greatly appreciated!

lanesky commented 1 week ago

While LangGraph is indeed open source, it appears that not all components are included in the public codebase. Here's what I found:

Their documentation (https://langchain-ai.github.io/langgraph/concepts/deployment_options/) describes a Self-Hosted option that uses Redis as a real-time message broker. However, when I examined the code in the public GitHub repository (https://github.com/langchain-ai/langgraph), the Redis-related code isn't there. Interestingly, if you look at the image layers of the Self-Hosted plan's Docker image langchain/langgraphjs-api:20 (https://hub.docker.com/layers/langchain/langgraphjs-api/20/images/sha256-25ff0937c4803c271430376c66d9f4eae4e17b448efb917797431fe22220486f?context=explore), you'll see:

COPY /api /api # buildkit
COPY /storage /storage # buildkit

I think these parts are proprietary, licensed code.
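
If you want to verify this yourself, you can inspect the published image locally. This is just a quick check with standard Docker commands; the second command assumes the image ships a usable ls, which may not hold for every tag.

    # Show the image's build history, including the COPY /api and COPY /storage layers
    docker history --no-trunc langchain/langgraphjs-api:20

    # List the copied directories inside the image (assumes ls exists in the image)
    docker run --rm --entrypoint ls langchain/langgraphjs-api:20 /api /storage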

IsraelShok commented 1 week ago

@Julsgaard I'm also trying to find a way to run this locally to preserve privacy. If you find a way, please share. Thanks!