lfnovo / open-notebook

An Open Source implementation of NotebookLM with more flexibility and features
MIT License

error using docker on ubuntu 22.04 #10

Closed ralyodio closed 5 days ago

ralyodio commented 6 days ago

I followed the instructions in the setup guide and get this error in the browser:

ConnectionRefusedError: [Errno 111] Connection refused
Traceback:

File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
             ^^^^^^
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 579, in code_to_exec
    exec(code, module.__dict__)
File "/app/app_home.py", line 7, in <module>
    check_version()
File "/app/open_notebook/repository.py", line 57, in check_version
    raise e
File "/app/open_notebook/repository.py", line 43, in check_version
    result = repo_query("SELECT * FROM open_notebook:database_info;")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/open_notebook/repository.py", line 31, in repo_query
    with db_connection() as connection:
File "/usr/local/lib/python3.11/contextlib.py", line 137, in __enter__
    return next(self.gen)
           ^^^^^^^^^^^^^^
File "/app/open_notebook/repository.py", line 15, in db_connection
    connection = SurrealSyncConnection(
                 ^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/sblpy/connection.py", line 61, in __init__
    self.socket = connect(self.url, max_size=max_size)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/websockets/sync/client.py", line 253, in connect
    sock = socket.create_connection((wsuri.host, wsuri.port), **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/socket.py", line 851, in create_connection
    raise exceptions[0]
File "/usr/local/lib/python3.11/socket.py", line 836, in create_connection
    sock.connect(sa)
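The traceback bottoms out in a plain TCP connect, which means the Streamlit app simply cannot reach SurrealDB at the host/port it was given. A quick way to confirm that outside the app is a small probe like the one below (the hostname and port in the usage comment are the defaults from this thread, not anything the project ships):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    ConnectionRefusedError ([Errno 111]) is a subclass of OSError,
    so the except clause below covers the error in the traceback.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. from inside the open_notebook container:
# can_connect("surrealdb", 8000)
```

If this returns False for the address/port pair in your docker.env, the problem is the connection settings or the network, not the app itself.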

lfnovo commented 6 days ago

Hey @ralyodio

Are you running with docker-compose or docker run? Did you create the docker.env file? If so, please share your docker-compose.yaml or the docker command you are running so I can take a look. Most likely some DB configuration is missing or broken. Here is a docker-compose.yaml that's running on my Ubuntu 22.04:

version: '3'

services:
  surrealdb:
    image: surrealdb/surrealdb:v2
    ports:
      - "7777:8000"
    volumes:
      - surreal_data:/mydata
    command: start --log trace --user root --pass root rocksdb:/mydata/mydatabase.db
    pull_policy: always
    user: root

  open_notebook:
    image: lfnovo/open_notebook:latest
    ports:
      - "8080:8502"
    volumes:
      - langgraph_cache:/app/sqlite-db
      - notebook_data:/app/data
    env_file:
      - ./docker.env
    depends_on:
      - surrealdb
    pull_policy: always

volumes:
  surreal_data:
  langgraph_cache:
  notebook_data:

And here is the docker.env file that would work here:

OPENAI_API_KEY=
DEFAULT_MODEL=openai/gpt-4o-mini
SUMMARIZATION_MODEL=openai/gpt-4o-mini
SURREAL_ADDRESS=surrealdb #(or the IP of the server)
SURREAL_USER=root
SURREAL_PASS=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=whatever
SURREAL_PORT=8000
GEMINI_API_KEY=
ELEVENLABS_API_KEY=
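For reference, the SURREAL_ADDRESS/SURREAL_PORT pair in docker.env is what the app turns into the websocket endpoint it dials. A rough sketch of that derivation (the ws:// scheme and the defaults here are assumptions for illustration, not the project's actual code):

```python
import os

def surreal_url() -> str:
    # SURREAL_ADDRESS should be the compose service name ("surrealdb"),
    # and SURREAL_PORT the container-internal port (8000), because
    # containers talk to each other over the compose network, not via
    # the host-mapped ports.
    address = os.environ.get("SURREAL_ADDRESS", "surrealdb")
    port = os.environ.get("SURREAL_PORT", "8000")
    return f"ws://{address}:{port}"
```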

henilmalaviya commented 5 days ago

I had the same issue when I copied the .env.example file into docker.env and added an OPENAI_API_KEY. But when I replaced docker.env with the following (from @lfnovo's comment):

OPENAI_API_KEY=
DEFAULT_MODEL=openai/gpt-4o-mini
SUMMARIZATION_MODEL=openai/gpt-4o-mini
SURREAL_ADDRESS=surrealdb #(or the IP of the server)
SURREAL_USER=root
SURREAL_PASS=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=whatever
SURREAL_PORT=8000
GEMINI_API_KEY=
ELEVENLABS_API_KEY=

and added my OPENAI_API_KEY, the error was gone.

ralyodio commented 5 days ago

Here is my setup:

version: '3'

services:
  surrealdb:
    image: surrealdb/surrealdb:v2
    ports:
      - "9110:8000"
    volumes:
      - surreal_data:/mydata
    command: start --log trace --user root --pass root rocksdb:/mydata/mydatabase.db
    pull_policy: always
    user: root

  open_notebook:
    image: lfnovo/open_notebook:latest
    ports:
      - "9112:8502"
    env_file:
      - ./docker.env
    depends_on:
      - surrealdb
    pull_policy: always
    volumes:
      - notebook_data:/app/data

volumes:
  surreal_data:
  notebook_data:

docker.env:


# DEFAULT MODEL_CONFIGURATIONS
#DEFAULT_MODEL="openai/gpt-4o-mini"
#SUMMARIZATION_MODEL="openai/gpt-4o-mini"
#RETRIEVAL_MODEL="openai/gpt-4o-mini"
DEFAULT_MODEL="ollama/llama3.2:3b"
SUMMARIZATION_MODEL="ollama/llama3.2:3b"
RETRIEVAL_MODEL="ollama/llama3.2:3b"

# OPENAI
# USE MODEL NAMES AS "openai/<modelname>"
# EXAMPLE - openai/gpt-4o-mini
OPENAI_API_KEY=

# ANTHROPIC
# USE MODEL NAMES AS "anthropic/<modelname>"
# EXAMPLE - anthropic/claude-3-5-sonnet-20240620
ANTHROPIC_API_KEY=

# GEMINI
# USE MODEL NAMES AS "gemini/<modelname>"
# EXAMPLE - gemini/gemini-1.5-pro-002
GEMINI_API_KEY=

# VERTEXAI
# USE MODEL NAMES AS "vertexai/<modelname>"
# EXAMPLE - vertexai/gemini-1.5-pro-002
VERTEX_PROJECT=my-google-cloud-project-name
GOOGLE_APPLICATION_CREDENTIALS=./google-credentials.json

# OLLAMA
# USE MODEL NAMES AS "ollama/<modelname>"
# EXAMPLE - ollama/gemma2
OLLAMA_API_BASE="https://ai.profullstack.com/ollamaapi"

# OPEN ROUTER
# USE MODEL NAMES AS "openrouter/<modelname>"
# EXAMPLE - openrouter/nvidia/llama-3.1-nemotron-70b-instruct
OPENROUTER_BASE_URL="https://openrouter.ai/api/v1"
OPENROUTER_API_KEY=

# ELEVENLABS
# Used only by the podcast feature
ELEVENLABS_API_KEY=sk_1

# USE THIS IF YOU WANT TO DEBUG THE APP ON LANGSMITH
# LANGCHAIN_TRACING_V2=true
# LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
# LANGCHAIN_API_KEY=
# LANGCHAIN_PROJECT="Open Notebook"

# CONNECTION DETAILS FOR YOUR SURREAL DB
SURREAL_ADDRESS="surrealdb"
SURREAL_PORT=9114
SURREAL_USER="root"
SURREAL_PASS="root"
SURREAL_NAMESPACE="open_notebook"
SURREAL_DATABASE="staging"

# This is used for the summarization feature when the content is too big to fit in a single context window
# It is measured in characters, not tokens.
SUMMARY_CHUNK_SIZE=200000
SUMMARY_CHUNK_OVERLAP=1000

# This is used for vector embeddings
# It is measured in characters, not tokens.
EMBEDDING_CHUNK_SIZE=1000
EMBEDDING_CHUNK_OVERLAP=50

ralyodio commented 5 days ago

I'm running with docker compose up --build

lfnovo commented 5 days ago

@ralyodio, you should be using port 8000 in docker.env for SURREAL_PORT. You are currently using 9114. Let me know if that does it.

Also, you don't really need --build, since you're working with pre-built images. :)
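To spell out the port rule: in compose, "ports" maps HOST:CONTAINER, and that mapping only applies to traffic coming from the host. Service-to-service traffic inside the compose network always uses the container port. So with the mapping from the setup above:

```yaml
services:
  surrealdb:
    ports:
      - "9110:8000"   # host port 9110 -> container port 8000
```

the host reaches SurrealDB on 9110, but open_notebook (a sibling container) must use SURREAL_PORT=8000. The value 9114 matches neither side of the mapping, hence the refused connection.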

lfnovo commented 5 days ago

> I had the same issue when I copied the .env.example file into docker.env and added an OPENAI_API_KEY. But when I replaced docker.env with the following (from @lfnovo's comment):
>
> OPENAI_API_KEY=
> DEFAULT_MODEL=openai/gpt-4o-mini
> SUMMARIZATION_MODEL=openai/gpt-4o-mini
> SURREAL_ADDRESS=surrealdb #(or the IP of the server)
> SURREAL_USER=root
> SURREAL_PASS=root
> SURREAL_NAMESPACE=open_notebook
> SURREAL_DATABASE=whatever
> SURREAL_PORT=8000
> GEMINI_API_KEY=
> ELEVENLABS_API_KEY=
>
> and added my OPENAI_API_KEY, the error was gone.

Thanks for pointing that out, @henilmalaviya. I will make some changes to the .env.example to make it more clear.

ralyodio commented 5 days ago

That worked, thanks.