bitwarden / self-host

Bitwarden's self-hosted release repository

(terminated by SIGSEGV (core dumped); not expected) #79

Open jaredatron opened 1 year ago

jaredatron commented 1 year ago

I am trying to run a self-hosted Docker container. When my docker-compose setup boots bitwarden/self-host:beta using podman:

podman create --name=bitwarden_bitwarden_1 --label io.podman.compose.config-hash=123 --label io.podman.compose.project=bitwarden --label io.podman.compose.version=0.0.1 --label com.docker.compose.project=bitwarden --label com.docker.compose.project.working_dir=/REDACTED --label com.docker.compose.project.config_files=docker-compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=bitwarden --env-file /REDACTED/settings.env -v /REDACTED/data:/etc/bitwarden --net bitwarden_default --network-alias bitwarden -p 2000:8080 --restart always docker.io/bitwarden/self-host:beta
fd7a2181f92c523b68ca65cd9fcf6e04c31fa01cac52840784b24e9aa9bd8c6a

I get this output:

2023-01-28 22:15:18,347 INFO Included extra file "/etc/supervisor.d/admin.ini" during parsing
2023-01-28 22:15:18,348 INFO Included extra file "/etc/supervisor.d/api.ini" during parsing
2023-01-28 22:15:18,348 INFO Included extra file "/etc/supervisor.d/events.ini" during parsing
2023-01-28 22:15:18,348 INFO Included extra file "/etc/supervisor.d/icons.ini" during parsing
2023-01-28 22:15:18,348 INFO Included extra file "/etc/supervisor.d/identity.ini" during parsing
2023-01-28 22:15:18,348 INFO Included extra file "/etc/supervisor.d/nginx.ini" during parsing
2023-01-28 22:15:18,351 INFO Included extra file "/etc/supervisor.d/notifications.ini" during parsing
2023-01-28 22:15:18,353 INFO Included extra file "/etc/supervisor.d/scim.ini" during parsing
2023-01-28 22:15:18,354 INFO Included extra file "/etc/supervisor.d/sso.ini" during parsing
2023-01-28 22:15:18,359 INFO RPC interface 'supervisor' initialized
2023-01-28 22:15:18,359 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2023-01-28 22:15:18,360 INFO supervisord started with pid 40
2023-01-28 22:15:19,363 INFO spawned: 'identity' with pid 41
2023-01-28 22:15:19,367 INFO spawned: 'admin' with pid 42
2023-01-28 22:15:19,375 INFO spawned: 'api' with pid 43
2023-01-28 22:15:19,377 INFO spawned: 'icons' with pid 44
2023-01-28 22:15:19,381 INFO spawned: 'nginx' with pid 45
2023-01-28 22:15:19,406 INFO spawned: 'notifications' with pid 46
2023-01-28 22:15:20,913 INFO exited: icons (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:21,010 INFO exited: api (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:21,011 INFO exited: notifications (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:21,016 INFO exited: admin (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:21,026 INFO exited: identity (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:22,031 INFO spawned: 'identity' with pid 83
2023-01-28 22:15:22,034 INFO spawned: 'admin' with pid 84
2023-01-28 22:15:22,044 INFO spawned: 'api' with pid 85
2023-01-28 22:15:22,047 INFO spawned: 'icons' with pid 86
2023-01-28 22:15:22,054 INFO spawned: 'notifications' with pid 87
2023-01-28 22:15:23,460 INFO exited: icons (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:23,476 INFO exited: api (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:23,476 INFO exited: notifications (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:23,503 INFO exited: admin (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:23,513 INFO exited: identity (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:25,517 INFO spawned: 'identity' with pid 123
2023-01-28 22:15:25,520 INFO spawned: 'admin' with pid 124
2023-01-28 22:15:25,526 INFO spawned: 'api' with pid 125
2023-01-28 22:15:25,530 INFO spawned: 'icons' with pid 126
2023-01-28 22:15:25,534 INFO spawned: 'notifications' with pid 127
2023-01-28 22:15:27,016 INFO exited: icons (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:27,019 INFO exited: notifications (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:27,070 INFO exited: api (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:27,090 INFO exited: admin (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:27,090 INFO exited: identity (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:30,097 INFO spawned: 'identity' with pid 163
2023-01-28 22:15:30,102 INFO spawned: 'admin' with pid 164
2023-01-28 22:15:30,108 INFO spawned: 'api' with pid 165
2023-01-28 22:15:30,114 INFO spawned: 'icons' with pid 166
2023-01-28 22:15:30,130 INFO spawned: 'notifications' with pid 167
2023-01-28 22:15:31,490 INFO exited: icons (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:31,536 INFO gave up: icons entered FATAL state, too many start retries too quickly
2023-01-28 22:15:31,537 INFO exited: api (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:31,537 INFO exited: notifications (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:31,544 INFO gave up: api entered FATAL state, too many start retries too quickly
2023-01-28 22:15:31,545 INFO gave up: notifications entered FATAL state, too many start retries too quickly
2023-01-28 22:15:31,545 INFO exited: identity (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:31,549 INFO gave up: identity entered FATAL state, too many start retries too quickly
2023-01-28 22:15:31,549 INFO exited: admin (terminated by SIGSEGV (core dumped); not expected)
2023-01-28 22:15:32,551 INFO gave up: admin entered FATAL state, too many start retries too quickly
2023-01-28 22:15:34,554 INFO success: nginx entered RUNNING state, process has stayed up for > than 15 seconds (startsecs)

The container stays up and supervisord doesn't exit, but none of the subprocesses (icons, api, etc.) are running.

I discovered this because I could visit the login page and enter an email address, but upon submitting the form the /api request returns a 502 because the internal nginx proxy cannot reach its upstream:

2023/01/28 22:19:53 [error] 50#50: *5 connect() failed (111: Connection refused) while connecting to upstream, client: 10.89.1.19, server: bitwarden.redacted.com, request: "GET /api/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772 HTTP/1.1", upstream: "http://[::1]:5001/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772", host: "bitwarden.redacted.com", referrer: "https://bitwarden.redacted.com/"
2023/01/28 22:19:53 [warn] 50#50: *5 upstream server temporarily disabled while connecting to upstream, client: 10.89.1.19, server: bitwarden.redacted.com, request: "GET /api/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772 HTTP/1.1", upstream: "http://[::1]:5001/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772", host: "bitwarden.redacted.com", referrer: "https://bitwarden.redacted.com/"
2023/01/28 22:19:53 [error] 50#50: *5 connect() failed (111: Connection refused) while connecting to upstream, client: 10.89.1.19, server: bitwarden.redacted.com, request: "GET /api/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772 HTTP/1.1", upstream: "http://127.0.0.1:5001/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772", host: "bitwarden.redacted.com", referrer: "https://bitwarden.redacted.com/"
2023/01/28 22:19:53 [warn] 50#50: *5 upstream server temporarily disabled while connecting to upstream, client: 10.89.1.19, server: bitwarden.redacted.com, request: "GET /api/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772 HTTP/1.1", upstream: "http://127.0.0.1:5001/devices/knowndevice/test@example.com/5d9c19bb-14d0-4774-8cc0-f4759b65c772", host: "bitwarden.redacted.com", referrer: "https://bitwarden.redacted.com/"
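
The state of the individual backend services can also be inspected from inside the container; a minimal sketch, assuming the container name from the podman create command above:

# list the supervisord-managed services and their current states
podman exec -it bitwarden_bitwarden_1 supervisorctl status

# tail the log of one failing service, e.g. the api backend
podman exec -it bitwarden_bitwarden_1 supervisorctl tail -f api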

Can anyone recommend a next step for debugging this?

Thank you!

bpjobin commented 1 year ago

Did you find an answer to this?

deadlyicon commented 1 year ago

Yes but I don’t remember how I got around this problem :(

bpjobin commented 1 year ago

Would you mind sharing your docker-compose.yml?

cvele commented 1 year ago

@deadlyicon I'm also facing this issue. Would you mind sharing your docker-compose please?

mikensan commented 1 year ago

I was beating my head against this issue today. I'm using podman as well, and for god knows what reason mariadb/mysql did not like my bind-mounted folder (/home/podman/containers/mysql:/var/lib/mysql:Z), even though it could create files and the permissions were fine. My problem went away when I just let mysql use a named volume. Before that I kept getting other errors, such as "unauthenticated user" dropped connections, and logs/admin.log showed "unable to connect to db" errors and a looping "migrating database".
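
In compose terms the change looked roughly like this (a sketch; the image tag and the mysql-data volume name are just illustrative, not from my actual setup):

services:
  db:
    image: mysql:8.0          # or mariadb, whichever image you run
    volumes:
      # bind mount that misbehaved under podman (with the :Z SELinux relabel):
      # - /home/podman/containers/mysql:/var/lib/mysql:Z
      # named volume that made the problem go away:
      - mysql-data:/var/lib/mysql

volumes:
  mysql-data: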

DuckThom commented 9 months ago

I'm also seeing this behaviour on 2024.1.0-beta on an ARM64 platform.

Running supervisorctl tail admin inside the bitwarden container returns:

fail: Microsoft.EntityFrameworkCore.Database.Command[20102]
      Failed executing DbCommand (6ms) [Parameters=[], CommandType='Text', CommandTimeout='30']
      CALL POMELO_BEFORE_DROP_PRIMARY_KEY(NULL, 'Grant');
      ALTER TABLE `Grant` DROP PRIMARY KEY;
Unhandled exception. MySqlConnector.MySqlException (0x80004005): Can't DROP 'PRIMARY'; check that column/key exists
   at MySqlConnector.Core.ResultSet.ReadResultSetHeaderAsync(IOBehavior ioBehavior) in /_/src/MySqlConnector/Core/ResultSet.cs:line 43
   at MySqlConnector.MySqlDataReader.ActivateResultSet(CancellationToken cancellationToken) in /_/src/MySqlConnector/MySqlDataReader.cs:line 130
   at MySqlConnector.MySqlDataReader.CreateAsync(CommandListPosition commandListPosition, ICommandPayloadCreator payloadCreator, IDictionary`2 cachedProcedures, IMySqlCommand command, CommandBehavior behavior, Activity activity, IOBehavior ioBehavior, CancellationToken cancellationToken) in /_/src/MySqlConnector/MySqlDataReader.cs:line 468
   at MySqlConnector.Core.CommandExecutor.ExecuteReaderAsync(IReadOnlyList`1 commands, ICommandPayloadCreator payloadCreator, CommandBehavior behavior, Activity activity, IOBehavior ioBehavior, CancellationToken cancellationToken) in /_/src/MySqlConnector/Core/CommandExecutor.cs:line 56
   at MySqlConnector.MySqlCommand.ExecuteNonQueryAsync(IOBehavior ioBehavior, CancellationToken cancellationToken) in /_/src/MySqlConnector/MySqlCommand.cs:line 296
   at MySqlConnector.MySqlCommand.ExecuteNonQuery() in /_/src/MySqlConnector/MySqlCommand.cs:line 107
   at Microsoft.EntityFrameworkCore.Storage.RelationalCommand.ExecuteNonQuery(RelationalCommandParameterObject parameterObject)
   at Microsoft.EntityFrameworkCore.Migrations.MigrationCommand.ExecuteNonQuery(IRelationalConnection connection, IReadOnlyDictionary`2 parameterValues)
   at Microsoft.EntityFrameworkCore.Migrations.Internal.MigrationCommandExecutor.ExecuteNonQuery(IEnumerable`1 migrationCommands, IRelationalConnection connection)
   at Microsoft.EntityFrameworkCore.Migrations.Internal.Migrator.Migrate(String targetMigration)
   at Microsoft.EntityFrameworkCore.RelationalDatabaseFacadeExtensions.Migrate(DatabaseFacade databaseFacade)
   at Bit.MySqlMigrations.MySqlDbMigrator.MigrateDatabase(Boolean enableLogging, CancellationToken cancellationToken) in /source/util/MySqlMigrations/MySqlDbMigrator.cs:line 30
   at Bit.Admin.HostedServices.DatabaseMigrationHostedService.StartAsync(CancellationToken cancellationToken) in /source/src/Admin/HostedServices/DatabaseMigrationHostedService.cs:line 29
   at Microsoft.Extensions.Hosting.Internal.Host.StartAsync(CancellationToken cancellationToken)
   at Microsoft.Extensions.Hosting.HostingAbstractionsHostExtensions.RunAsync(IHost host, CancellationToken token)
   at Microsoft.Extensions.Hosting.HostingAbstractionsHostExtensions.RunAsync(IHost host, CancellationToken token)
   at Microsoft.Extensions.Hosting.HostingAbstractionsHostExtensions.Run(IHost host)
   at Bit.Admin.Program.Main(String[] args) in /source/src/Admin/Program.cs:line 9

After inspecting the database, there is indeed no primary key on the Grant table:

mysql> show create table `Grant`;

CREATE TABLE `Grant` (
  `Key` varchar(200) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci NOT NULL,
  `Type` varchar(50) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci NOT NULL,
  `SubjectId` varchar(200) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci DEFAULT NULL,
  `SessionId` varchar(100) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci DEFAULT NULL,
  `ClientId` varchar(200) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci NOT NULL,
  `Description` varchar(200) CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci DEFAULT NULL,
  `CreationDate` datetime(6) NOT NULL,
  `ExpirationDate` datetime(6) DEFAULT NULL,
  `ConsumedDate` datetime(6) DEFAULT NULL,
  `Data` longtext CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci NOT NULL,
  `Id` int NOT NULL DEFAULT '0'
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
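
To see where the upgrade got stuck, the constraint metadata and the EF Core migration history can be queried directly; a sketch, assuming the bitwarden_vault database name from the compose file below:

-- confirm that no PRIMARY KEY constraint exists on the Grant table
SELECT CONSTRAINT_NAME, CONSTRAINT_TYPE
FROM information_schema.TABLE_CONSTRAINTS
WHERE TABLE_SCHEMA = 'bitwarden_vault' AND TABLE_NAME = 'Grant';

-- list the EF Core migrations already recorded, to see how far the upgrade got
SELECT MigrationId, ProductVersion
FROM `__EFMigrationsHistory`
ORDER BY MigrationId;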

This is my docker-compose.yaml file:

version: "3.8"

services:
  bitwarden:
    image: bitwarden/self-host:beta
    restart: always
    depends_on:
      - db
    env_file:
      - settings.env
    ports:
      - "8000:8080"
    volumes:
      - bitwarden:/etc/bitwarden

  db:
    image: mysql:8.0
    restart: always
    environment:
      MYSQL_DATABASE: "bitwarden_vault"
      MYSQL_RANDOM_ROOT_PASSWORD: "true"
    volumes:
      - data:/var/lib/mysql

volumes:
  bitwarden:
  data:

Not sure if it's related, but I also noticed that the admin panel still reports "2023.12.0" on the 2024.1.0-beta tagged image, which is the version I'm trying to upgrade from.