SelfhostedPro / Yacht

A web interface for managing Docker containers, with an emphasis on templating to provide 1-click deployments. Think of it as a decentralized app store for servers that anyone can make packages for.
MIT License
3.45k stars 169 forks

[Bug Report] Projects Screen Empty But Projects Exist #448

Closed realandrew closed 3 months ago

realandrew commented 2 years ago

Describe the bug The projects screen says "No projects available", but projects do exist. I can manually reach one of them via its specific URL, although some others are inaccessible even with the URL. The console shows a 500 error when trying to GET /api/compose, which explains why the projects aren't loading in the UI.

To Reproduce Steps to reproduce the behavior:

  1. Go to projects
  2. See 'No projects available' and a 500 error in the console.

Expected behavior Projects screen loads and shows projects.

Additional context I noticed that when there are slight problems with the docker-compose files, the errors trickle into Yacht and break it. More than once I've gotten a black screen when clicking on edit project, and had to manually go to the URL to edit the compose file and fix the error in order for Yacht to show the UI correctly again.

Logs As mentioned, the browser console shows a 500 Internal Server Error for the GET request to /api/compose.

SelfhostedPro commented 2 years ago

Are there any errors in the container?

SelfhostedPro commented 2 years ago

Try using the :devel branch of yacht as it has better error handling for projects.

realandrew commented 2 years ago

I do get an error in the container:

INFO:      - "GET /api/compose/ HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 390, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/usr/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/usr/lib/python3.8/site-packages/fastapi/applications.py", line 199, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/lib/python3.8/site-packages/starlette/applications.py", line 111, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in __call__
    raise exc from None
  File "/usr/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in __call__
    await self.app(scope, receive, _send)
  File "/usr/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
    raise exc from None
  File "/usr/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in __call__
    await self.app(scope, receive, sender)
  File "/usr/lib/python3.8/site-packages/starlette/routing.py", line 566, in __call__
    await route.handle(scope, receive, send)
  File "/usr/lib/python3.8/site-packages/starlette/routing.py", line 227, in handle
    await self.app(scope, receive, send)
  File "/usr/lib/python3.8/site-packages/starlette/routing.py", line 41, in app
    response = await func(request)
  File "/usr/lib/python3.8/site-packages/fastapi/routing.py", line 201, in app
    raw_response = await run_endpoint_function(
  File "/usr/lib/python3.8/site-packages/fastapi/routing.py", line 150, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/usr/lib/python3.8/site-packages/starlette/concurrency.py", line 34, in run_in_threadpool
    return await loop.run_in_executor(None, func, *args)
  File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/./api/routers/compose.py", line 22, in get_projects
    return get_compose_projects()
  File "/./api/actions/compose.py", line 189, in get_compose_projects
    loaded_compose = yaml.load(compose, Loader=yaml.SafeLoader)
  File "/usr/lib/python3.8/site-packages/yaml/__init__.py", line 114, in load
    return loader.get_single_data()
  File "/usr/lib/python3.8/site-packages/yaml/constructor.py", line 49, in get_single_data
    node = self.get_single_node()
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 36, in get_single_node
    document = self.compose_document()
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 55, in compose_document
    node = self.compose_node(None, None)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 84, in compose_node
    node = self.compose_mapping_node(anchor)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 133, in compose_mapping_node
    item_value = self.compose_node(node, item_key)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 84, in compose_node
    node = self.compose_mapping_node(anchor)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 133, in compose_mapping_node
    item_value = self.compose_node(node, item_key)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 84, in compose_node
    node = self.compose_mapping_node(anchor)
  File "/usr/lib/python3.8/site-packages/yaml/composer.py", line 127, in compose_mapping_node
    while not self.check_event(MappingEndEvent):
  File "/usr/lib/python3.8/site-packages/yaml/parser.py", line 98, in check_event
    self.current_event = self.state()
  File "/usr/lib/python3.8/site-packages/yaml/parser.py", line 438, in parse_block_mapping_key
    raise ParserError("while parsing a block mapping", self.marks[-1],
yaml.parser.ParserError: while parsing a block mapping
  in "/config/compose/ditd-discourse/docker-compose.yml", line 28, column 5
expected <block end>, but found '<block sequence start>'
  in "/config/compose/ditd-discourse/docker-compose.yml", line 45, column 7

I'll switch to the devel branch and see what happens.

realandrew commented 2 years ago

The same problem/error appears in the devel branch as well. It seems that whenever a project's docker-compose file has an error in it, it causes Yacht to freak out and just throw a 500 error.

realandrew commented 2 years ago

If I manually go into the Yacht container and remove the bad project folder from /config/compose, the error goes away. So the fix would likely be better error handling around parsing invalid docker-compose.yml files.
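A rough sketch of what that error handling could look like (function and field names here are hypothetical, not Yacht's actual code): parse each project's compose file individually and surface a per-project error, instead of letting one bad file fail the whole /api/compose route.

```python
# Hypothetical sketch of per-project error handling; names are made up
# for illustration and are not Yacht's actual API.
import os
import yaml

def get_compose_projects(base="/config/compose"):
    projects = []
    for name in sorted(os.listdir(base)):
        path = os.path.join(base, name, "docker-compose.yml")
        if not os.path.isfile(path):
            continue
        try:
            with open(path) as f:
                content = yaml.safe_load(f)
        except yaml.YAMLError as exc:
            # Report the broken project instead of letting the exception
            # bubble up as a 500 for every project.
            projects.append({"name": name, "error": str(exc)})
            continue
        projects.append({"name": name, "content": content})
    return projects
```

With this shape, a single malformed file shows up as one broken entry in the UI while the remaining projects still load.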

SelfhostedPro commented 2 years ago

I’ll go ahead and work on the error handling. Do you have the specific docker-compose that was incorrect?

realandrew commented 2 years ago

Sure, it was this (although I replaced sensitive values with placeholders).

version: '2'

services:

  postgresql:
    image: docker.io/bitnami/postgresql:11
    restart: unless-stopped
    environment:
      - ALLOW_EMPTY_PASSWORD=no
      - POSTGRESQL_PASSWORD=<root_db_password>
    volumes:
      - postgresql_data:/bitnami/postgresql

  redis:
    image: docker.io/bitnami/redis:6.0
    environment:
      # - REDIS_DISABLE_COMMANDS=FLUSHDB,FLUSHALL,CONFIG
      - REDIS_PASSWORD=<redis_password>
    volumes:
      - redis_data:/bitnami/discourse
    #command: /opt/bitnami/scripts/redis/run.sh

  discourse:
    image: docker.io/bitnami/discourse:2
    hostname: discourse
    ports:
      - '82:3000'
    depends_on:
      - postgresql
      - redis
    volumes:
      - discourse_data:/bitnami/discourse
    environment:
      # DEBUG
      #- DISCOURSE_SKIP_BOOTSTRAP=yes                # For debugging plugins incompatibility
      # - BITNAMI_DEBUG=true
      VIRTUAL_HOST=forums.divingintothedeep.com
      # This is the database and user Discord will use for the application.
      # It does not exist yet, we are going to create it with other environment variables
      - DISCOURSE_DATABASE_HOST=postgresql
      - DISCOURSE_DATABASE_PORT_NUMBER=5432
      - DISCOURSE_DATABASE_USER=<db_user>
      - DISCOURSE_DATABASE_PASSWORD=<db_password>
      - DISCOURSE_DATABASE_NAME=<db_name>
      # These are the credentials for the superuser
      # so that the Discourse container can create a new user and database
      - POSTGRESQL_CLIENT_POSTGRES_USER=postgres
      - POSTGRESQL_CLIENT_POSTGRES_PASSWORD=<root_db_password>
      # This is the database and the user that it is going to create and use for the application
      - POSTGRESQL_CLIENT_CREATE_DATABASE_NAME=<db_name>
      - POSTGRESQL_CLIENT_CREATE_DATABASE_USER=<db_user>
      - POSTGRESQL_CLIENT_CREATE_DATABASE_PASSWORD=<db_password>
      - POSTGRESQL_CLIENT_CREATE_DATABASE_EXTENSIONS=hstore,pg_trgm
      - ALLOW_EMPTY_PASSWORD=no
      # REDIS
      - DISCOURSE_REDIS_HOST=redis   # DB host
      - DISCOURSE_REDIS_PASSWORD=<redis_password> # DB password
      # DISCOURSE CONFIG
      - DISCOURSE_USERNAME=<discourse_user>    # Default admin username
      - DISCOURSE_PASSWORD=<discourse_password>    # Default admin password
      - DISCOURSE_EMAIL=<discourse_user_email>        # Default admin email
      - DISCOURSE_SITENAME="Diving Into The Deep Forums"
      - DISCOURSE_HOST=forums.divingintothedeep.com            # Default site URL
      - DISCOURSE_ENABLE_HTTPS=yes                  # HTTPS by default
      - DISCOURSE_PORT_NUMBER=3000                  # Default running port
      # - DISCOURSE_ENV=test                          # Allowed values are development, production, test
      # - DISCOURSE_ENABLE_CONF_PERSISTENCE=yes       # Whether to enable persistence of the Discourse discourse.conf
      # SMTP CONFIG
      #- DISCOURSE_SMTP_HOST=smtppro.zoho.com
      #- DISCOURSE_SMTP_PORT=587
      #- DISCOURSE_SMTP_USER=${DISCOURSE_SMTP_USER}
      #- DISCOURSE_SMTP_PASSWORD=${DISCOURSE_SMTP_PASS}
      #- DISCOURSE_SMTP_TLS=yes
      #- DISCOURSE_SMTP_AUTH=login

  sidekiq:
    image: docker.io/bitnami/discourse:2
    depends_on:
      - discourse
    volumes:
      - discourse_sidekiq:/bitnami/discourse
    command: /opt/bitnami/scripts/discourse-sidekiq/run.sh
    environment:
      # - BITNAMI_DEBUG=true
      # POSTGRESQL
      - DISCOURSE_DATABASE_HOST=postgresql
      - DISCOURSE_DATABASE_PORT_NUMBER=5432
      - DISCOURSE_DATABASE_USER=<db_user>
      - DISCOURSE_DATABASE_PASSWORD=<db_password>
      - DISCOURSE_DATABASE_NAME=<db_name>
      # REDIS
      - DISCOURSE_REDIS_HOST=redis   # DB host
      - DISCOURSE_REDIS_PASSWORD=<redis_password> # DB password
      # DISCOURSE CONFIG
      - DISCOURSE_USERNAME=<discourse_user>   # Default admin username
      - DISCOURSE_PASSWORD=<discourse_password>   # Default admin password
      - DISCOURSE_EMAIL=<discourse_user_email>         # Default admin email
      - DISCOURSE_SITENAME="Diving Into The Deep Forums"
      - DISCOURSE_HOST=forums.divingintothedeep.com           # Default site URL
      - DISCOURSE_ENABLE_HTTPS=yes                  # HTTPS by default
      - DISCOURSE_PORT_NUMBER=3000                  # Default running port
      # - DISCOURSE_ENV=test                          # Allowed values are development, production, test
      # - DISCOURSE_ENABLE_CONF_PERSISTENCE=yes       # Whether to enable persistence of the Discourse discourse.conf
      # SMTP CONFIG
      #- DISCOURSE_SMTP_HOST=smtppro.zoho.com
      #- DISCOURSE_SMTP_PORT=587
      #- DISCOURSE_SMTP_USER=${DISCOURSE_SMTP_USER}
      #- DISCOURSE_SMTP_PASSWORD=${DISCOURSE_SMTP_PASS}
      #- DISCOURSE_SMTP_TLS=yes
      #- DISCOURSE_SMTP_AUTH=login

volumes:
  postgresql_data:
    driver: local
  redis_data:
    driver: local
  discourse_data:
    driver: local
  discourse_sidekiq:
    driver: local
SelfhostedPro commented 2 years ago

Thanks! I’ll go ahead and work on getting the error handling setup better.

realandrew commented 2 years ago

I realized the specific error in this file: the VIRTUAL_HOST variable under the discourse service was missing the hyphen in front of it.

wickedyoda commented 2 years ago

Please update the ticket with the issue's current status; if resolved, please let us know and close the ticket.

mbaiti commented 2 years ago
Screenshot 2022-05-31 at 09 46 16

Same here! How did you find the errors in the docker-compose.yml file? @realandrew

@SelfhostedPro I cannot install Yacht on :devel because I've used the installer via openmediavault.

wickedyoda commented 2 years ago

I would recommend removing the Yacht container and running/installing it via CLI, outside of OMV, at the devel level. OMV will still see it and won't be affected.

mbaiti commented 2 years ago

I installed yacht:devel via CLI outside of OMV; still the same problem.

wickedyoda commented 2 years ago

> I realized the specific error in this file: the VIRTUAL_HOST variable under the discourse service was missing the hyphen in front of it.

> I installed yacht:devel via CLI outside of OMV; still the same problem.

It looks like there are two issues here under one GitHub ticket; if so, can one of you please open a new ticket? If this is the same issue then I am confused, because they look different.

@mbaiti Can you please post your exact docker-compose, or the command you're running to get Yacht installed? I am booting up my docker-debian test machine now and am going to run my docker-compose. Here is the docker-compose I use; you are welcome to try it and let me know, but please still post yours.

I am running Yacht installed via CLI (below). I also use Portainer on another host, with the agent, to control the containers on the system alongside Yacht; none of this was installed through OMV, I go straight to the CLI for initial setup.

OMV: 6.0.27-1 (Shaitan)
OS: Debian GNU/Linux 11 (bullseye)

mbaiti commented 2 years ago

I have to correct myself: with yacht:devel installed outside of OMV, it works fine now. Looks like it was a caching problem.

realandrew commented 2 years ago
> Screenshot 2022-05-31 at 09 46 16
>
> Same here! How did you find the errors in the docker-compose.yml file? @realandrew
>
> @SelfhostedPro I cannot install Yacht on :devel because I've used the installer via openmediavault.

@mbaiti I used a YAML validator. IIRC it was this one: https://jsonformatter.org/yaml-validator/. Obviously, you may want to remove any sensitive values from your compose file before pasting it into an online service such as the one I linked.
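If you'd rather not paste a compose file into an online tool at all, the same syntax check can be done locally with PyYAML (the library shown in Yacht's traceback); a small sketch:

```python
# Validate a compose file's YAML syntax locally instead of using an
# online service.
import sys
import yaml

def validate(path):
    """Return True if the file is parseable YAML, else print the error."""
    try:
        with open(path) as f:
            yaml.safe_load(f)
    except yaml.YAMLError as exc:
        print(f"{path}: INVALID\n{exc}")
        return False
    print(f"{path}: OK")
    return True

if __name__ == "__main__":
    validate(sys.argv[1])
```

Note this only checks YAML syntax, not whether the file is a semantically valid compose file; `docker-compose config` checks both.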