letsencrypt / boulder

An ACME-based certificate authority, written in Go.
Mozilla Public License 2.0

Failed to run docker-compose #7374

Closed igolman closed 7 months ago

igolman commented 7 months ago

Summary: `docker-compose up` fails on Ubuntu 22.04 when following the documentation.

Steps to reproduce:

root@server:# uname -a
Linux nginx 5.15.0-92-generic #102-Ubuntu SMP Wed Jan 10 09:33:48 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux

root@server:# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 22.04.3 LTS
Release:    22.04
Codename:   jammy

root@server:# dpkg -l | grep docker
ii  docker-compose                  1.29.2-1                                all          define and run multi-container Docker applications with YAML
ii  docker.io                       24.0.5-0ubuntu1~22.04.1                 amd64        Linux container runtime
ii  python3-docker                  5.0.3-1                                 all          Python 3 wrapper to access docker.io's control socket
ii  python3-dockerpty               0.4.1-2                                 all          Pseudo-tty handler for docker Python client (Python 3.x)
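For context: docker-compose 1.29.2 is Compose v1 (the Python implementation), which drives the legacy `docker build` path by default, and the legacy builder does not populate the BuildKit-only `TARGETPLATFORM`/`BUILDPLATFORM` build args that the Dockerfile below relies on. A possible workaround sketch (untested here; `DOCKER_BUILDKIT` and `COMPOSE_DOCKER_CLI_BUILD` are the standard Docker/Compose v1 environment variables for opting into BuildKit):

```shell
# Ask Compose v1 to build through the docker CLI with BuildKit, which sets
# TARGETPLATFORM/BUILDPLATFORM automatically, instead of the legacy builder:
export DOCKER_BUILDKIT=1
export COMPOSE_DOCKER_CLI_BUILD=1
# then run the build as before:
#   docker-compose up -d
```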

root@server:/# git clone https://github.com/letsencrypt/boulder  /opt/boulder
Cloning into '/opt/boulder'...
remote: Enumerating objects: 68877, done.
remote: Counting objects: 100% (4975/4975), done.
remote: Compressing objects: 100% (1373/1373), done.
remote: Total 68877 (delta 4200), reused 3907 (delta 3590), pack-reused 63902
Receiving objects: 100% (68877/68877), 48.62 MiB | 636.00 KiB/s, done.
Resolving deltas: 100% (43968/43968), done.

root@server:# cd /opt/boulder
root@server:/opt/boulder# docker-compose up -d
WARNING: The GOEXPERIMENT variable is not set. Defaulting to a blank string.
Creating network "boulder_bouldernet" with driver "bridge"
Creating network "boulder_redisnet" with driver "bridge"
Creating network "boulder_consulnet" with driver "bridge"
Creating network "boulder_integrationtestnet" with driver "bridge"
Building boulder
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            Install the buildx component to build images with BuildKit:
            https://docs.docker.com/go/buildx/

Sending build context to Docker daemon  13.82kB
Step 1/36 : FROM buildpack-deps:focal-scm as godeps
 ---> f0845dc47c29
Step 2/36 : ARG GO_VERSION
 ---> Using cache
 ---> 53dfdd6ae929
Step 3/36 : ARG TARGETPLATFORM
 ---> Using cache
 ---> b36aaebf4107
Step 4/36 : ARG BUILDPLATFORM
 ---> Using cache
 ---> 931dc2537b54
Step 5/36 : ENV TARGETPLATFORM=${TARGETPLATFORM:-$BUILDPLATFORM}
 ---> Using cache
 ---> b207bca2141b
Step 6/36 : ENV GO_VERSION=$GO_VERSION
 ---> Using cache
 ---> add061103f7e
Step 7/36 : ENV PATH /usr/local/go/bin:/usr/local/protoc/bin:$PATH
 ---> Using cache
 ---> 3e125d036811
Step 8/36 : ENV GOBIN /usr/local/bin/
 ---> Using cache
 ---> debb097c2ac9
Step 9/36 : RUN curl "https://dl.google.com/go/go${GO_VERSION}.$(echo $TARGETPLATFORM | sed 's|\/|-|').tar.gz" |    tar -C /usr/local -xz
 ---> Running in 166e691a8850
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1449  100  1449    0     0   3191      0 --:--:-- --:--:-- --:--:--  3184

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
The command '/bin/sh -c curl "https://dl.google.com/go/go${GO_VERSION}.$(echo $TARGETPLATFORM | sed 's|\/|-|').tar.gz" |    tar -C /usr/local -xz' returned a non-zero code: 2
ERROR: Service 'boulder' failed to build : Build failed

Expected result: the Boulder server is running. By the way, the release-2023-11-07 tag still works:
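One way to read the failure above: if the build args arrive empty at step 9 (as they would under the legacy builder, which never sets `TARGETPLATFORM`), the download URL degenerates into a path that is not a real Go release, so the server returns a small error page instead of a tarball — consistent with curl receiving only 1449 bytes and tar reporting "not in gzip format". A sketch of the URL expansion with empty args (a hypothesis, not verified against the actual build environment; no network access needed):

```shell
# Expand the Dockerfile's download URL the way the legacy builder would,
# i.e. with both build args left empty:
GO_VERSION=""
TARGETPLATFORM=""
url="https://dl.google.com/go/go${GO_VERSION}.$(echo "$TARGETPLATFORM" | sed 's|/|-|').tar.gz"
echo "$url"   # the version and platform segments collapse to nothing
```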

root@nginx:/opt# git clone https://github.com/letsencrypt/boulder --branch  release-2023-11-07 /opt/boulder
Cloning into '/opt/boulder'...
remote: Enumerating objects: 68877, done.
remote: Counting objects: 100% (4975/4975), done.
remote: Compressing objects: 100% (1373/1373), done.
remote: Total 68877 (delta 4200), reused 3907 (delta 3590), pack-reused 63902
Receiving objects: 100% (68877/68877), 48.62 MiB | 728.00 KiB/s, done.
Resolving deltas: 100% (43968/43968), done.
Note: switching to 'd9b97c7863a7800b3c90b596517a7d4107f877a5'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

root@server:/opt# cd boulder/
root@server:/opt/boulder# docker-compose up
WARNING: The GOEXPERIMENT variable is not set. Defaulting to a blank string.
Creating network "boulder_bluenet" with driver "bridge"
Creating network "boulder_redisnet" with driver "bridge"
Creating network "boulder_consulnet" with driver "bridge"
Creating network "boulder_rednet" with driver "bridge"
Pulling boulder (letsencrypt/boulder-tools:go1.21.3_2023-10-12)...
go1.21.3_2023-10-12: Pulling from letsencrypt/boulder-tools
7007490126ef: Pull complete
0b760f979d62: Pull complete
0e934f6d128d: Pull complete
82184fef2fd7: Pull complete
9778718df5ea: Pull complete
f41f24bec5c7: Pull complete
7eaea3d23e03: Pull complete
eacaccd1c642: Pull complete
4ba4b789945f: Pull complete
b15ffca2df73: Pull complete
395fb0cd33cc: Pull complete
fea61c9087f1: Pull complete
Digest: sha256:5854a5858eaeef8f4ab4881efc29b03b3fb9c2be5e7b284983073147b5385d37
Status: Downloaded newer image for letsencrypt/boulder-tools:go1.21.3_2023-10-12
Creating boulder_bredis_1_1  ... done
Creating boulder_bredis_3_1  ... done
Creating boulder_bredis_2_1  ... done
Creating boulder_bconsul_1   ... done
Creating boulder_bmysql_1   ... done
Creating boulder_bredis_4_1 ... done
Creating boulder_bjaeger_1   ... done
Creating boulder_bproxysql_1 ... done
Creating boulder_boulder_1   ... done
Attaching to boulder_bredis_4_1, boulder_bredis_1_1, boulder_bredis_2_1, boulder_bconsul_1, boulder_bredis_3_1, boulder_bjaeger_1, boulder_bproxysql_1, boulder_boulder_1
bconsul_1    | ==> Starting Consul agent...
bconsul_1    |               Version: '1.15.4'
bconsul_1    |            Build Date: '2023-06-23 23:14:17 +0000 UTC'
bconsul_1    |               Node ID: '819ac409-ae22-d973-1c67-e51278930346'
bconsul_1    |             Node name: 'bd7a5e41cd29'
bconsul_1    |            Datacenter: 'dc1' (Segment: '<all>')
bconsul_1    |                Server: true (Bootstrap: false)
bconsul_1    |           Client Addr: [0.0.0.0] (HTTP: 8500, HTTPS: -1, gRPC: 8502, gRPC-TLS: 8503, DNS: 53)
bconsul_1    |          Cluster Addr: 10.55.55.10 (LAN: 8301, WAN: 8302)
bconsul_1    |     Gossip Encryption: false
bconsul_1    |      Auto-Encrypt-TLS: false
bconsul_1    |      Reporting Enabled: false
bconsul_1    |             HTTPS TLS: Verify Incoming: false, Verify Outgoing: false, Min Version: TLSv1_2
bconsul_1    |              gRPC TLS: Verify Incoming: false, Min Version: TLSv1_2
bconsul_1    |      Internal RPC TLS: Verify Incoming: false, Verify Outgoing: false (Verify Hostname: false), Min Version: TLSv1_2
bconsul_1    |
bconsul_1    | ==> Log data will now stream in as it occurs:
bconsul_1    |
bjaeger_1    | 2024/03/08 17:45:39 maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
bjaeger_1    | 2024/03/08 17:45:39 application version: git-commit=d482f3f7bf780f72a20ddfedbe3aa6ec5c5e4613, git-version=v1.50.0, build-date=2023-10-08T00:14:18Z
bjaeger_1    | {"level":"info","ts":1709919939.8869815,"caller":"flags/service.go:119","msg":"Mounting metrics handler on admin server","route":"/metrics"}
bjaeger_1    | {"level":"info","ts":1709919939.8880663,"caller":"flags/service.go:125","msg":"Mounting expvar handler on admin server","route":"/debug/vars"}
bjaeger_1    | {"level":"info","ts":1709919939.8934705,"caller":"flags/admin.go:129","msg":"Mounting health check on admin server","route":"/"}
bjaeger_1    | {"level":"info","ts":1709919939.8938687,"caller":"flags/admin.go:143","msg":"Starting admin HTTP server","http-addr":":14269"}
bjaeger_1    | {"level":"info","ts":1709919939.8940198,"caller":"flags/admin.go:121","msg":"Admin server started","http.host-port":"[::]:14269","health-status":"unavailable"}
bjaeger_1    | {"level":"info","ts":1709919939.894188,"caller":"grpc@v1.58.2/clientconn.go:489","msg":"[core][Channel #1] Channel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.8993654,"caller":"grpc@v1.58.2/clientconn.go:1839","msg":"[core][Channel #1] original dial target is: \"localhost:4317\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.8996224,"caller":"grpc@v1.58.2/clientconn.go:1846","msg":"[core][Channel #1] parsed dial target is: {URL:{Scheme:localhost Opaque:4317 User: Host: Path: RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.900256,"caller":"grpc@v1.58.2/clientconn.go:1860","msg":"[core][Channel #1] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.900472,"caller":"grpc@v1.58.2/clientconn.go:1868","msg":"[core][Channel #1] parsed dial target is: {URL:{Scheme:passthrough Opaque: User: Host: Path:/localhost:4317 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9010904,"caller":"grpc@v1.58.2/clientconn.go:2001","msg":"[core][Channel #1] Channel authority set to \"localhost:4317\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9055085,"caller":"grpc@v1.58.2/resolver_conn_wrapper.go:246","msg":"[core][Channel #1] Resolver state updated: {\n  \"Addresses\": [\n    {\n      \"Addr\": \"localhost:4317\",\n      \"ServerName\": \"\",\n      \"Attributes\": null,\n      \"BalancerAttributes\": null,\n      \"Metadata\": null\n    }\n  ],\n  \"Endpoints\": [\n    {\n      \"Addresses\": [\n        {\n          \"Addr\": \"localhost:4317\",\n          \"ServerName\": \"\",\n          \"Attributes\": null,\n          \"BalancerAttributes\": null,\n          \"Metadata\": null\n        }\n      ],\n      \"Attributes\": null\n    }\n  ],\n  \"ServiceConfig\": null,\n  \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9056714,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:180","msg":"[core][Channel #1] Channel switches to new LB policy \"pick_first\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9057198,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:298","msg":"[core][Channel #1 SubChannel #2] Subchannel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.915488,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #1] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.915822,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #1 SubChannel #2] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9159293,"caller":"grpc@v1.58.2/clientconn.go:1453","msg":"[core][Channel #1 SubChannel #2] Subchannel picks a new address \"localhost:4317\" to connect","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"warn","ts":1709919939.918278,"caller":"grpc@v1.58.2/clientconn.go:1515","msg":"[core][Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {Addr: \"localhost:4317\", ServerName: \"localhost:4317\", }. Err: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:4317: connect: connection refused\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9183462,"caller":"grpc@v1.58.2/clientconn.go:1340","msg":"[core][Channel #1 SubChannel #2] Subchannel Connectivity change to TRANSIENT_FAILURE, last error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:4317: connect: connection refused\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9183652,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #1] Channel Connectivity change to TRANSIENT_FAILURE","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919939.9189637,"caller":"memory/factory.go:79","msg":"Memory storage initialized","configuration":{"MaxTraces":0}}
bjaeger_1    | {"level":"info","ts":1709919939.920903,"caller":"static/strategy_store.go:138","msg":"Loading sampling strategies","filename":"/etc/jaeger/sampling_strategies.json"}
bjaeger_1    | {"level":"info","ts":1709919940.1068723,"caller":"grpc@v1.58.2/server.go:667","msg":"[core][Server #3] Server created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.107037,"caller":"server/grpc.go:104","msg":"Starting jaeger-collector gRPC server","grpc.host-port":"[::]:14250"}
bjaeger_1    | {"level":"info","ts":1709919940.1070702,"caller":"server/http.go:56","msg":"Starting jaeger-collector HTTP server","http host-port":":14268"}
bjaeger_1    | {"level":"info","ts":1709919940.1071954,"caller":"server/zipkin.go:52","msg":"Not listening for Zipkin HTTP traffic, port not configured"}
bjaeger_1    | {"level":"warn","ts":1709919940.1072354,"caller":"internal@v0.86.0/warning.go:40","msg":"Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks","documentation":"https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"}
bjaeger_1    | {"level":"info","ts":1709919940.1072662,"caller":"grpc@v1.58.2/server.go:667","msg":"[core][Server #4] Server created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1072848,"caller":"otlpreceiver@v0.86.0/otlp.go:83","msg":"Starting GRPC server","endpoint":"0.0.0.0:4317"}
bjaeger_1    | {"level":"warn","ts":1709919940.1073818,"caller":"internal@v0.86.0/warning.go:40","msg":"Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks","documentation":"https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"}
bjaeger_1    | {"level":"info","ts":1709919940.1117668,"caller":"grpc@v1.58.2/server.go:855","msg":"[core][Server #3 ListenSocket #5] ListenSocket created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1132228,"caller":"otlpreceiver@v0.86.0/otlp.go:101","msg":"Starting HTTP server","endpoint":"0.0.0.0:4318"}
bjaeger_1    | {"level":"info","ts":1709919940.1133652,"caller":"grpc/builder.go:74","msg":"Agent requested insecure grpc connection to collector(s)"}
bjaeger_1    | {"level":"info","ts":1709919940.113451,"caller":"grpc@v1.58.2/clientconn.go:489","msg":"[core][Channel #6] Channel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1135283,"caller":"grpc@v1.58.2/clientconn.go:1839","msg":"[core][Channel #6] original dial target is: \"localhost:14250\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1135972,"caller":"grpc@v1.58.2/clientconn.go:1846","msg":"[core][Channel #6] parsed dial target is: {URL:{Scheme:localhost Opaque:14250 User: Host: Path: RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1136556,"caller":"grpc@v1.58.2/clientconn.go:1860","msg":"[core][Channel #6] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1137226,"caller":"grpc@v1.58.2/clientconn.go:1868","msg":"[core][Channel #6] parsed dial target is: {URL:{Scheme:passthrough Opaque: User: Host: Path:/localhost:14250 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1137815,"caller":"grpc@v1.58.2/clientconn.go:2001","msg":"[core][Channel #6] Channel authority set to \"localhost:14250\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.113889,"caller":"grpc@v1.58.2/resolver_conn_wrapper.go:246","msg":"[core][Channel #6] Resolver state updated: {\n  \"Addresses\": [\n    {\n      \"Addr\": \"localhost:14250\",\n      \"ServerName\": \"\",\n      \"Attributes\": null,\n      \"BalancerAttributes\": null,\n      \"Metadata\": null\n    }\n  ],\n  \"Endpoints\": [\n    {\n      \"Addresses\": [\n        {\n          \"Addr\": \"localhost:14250\",\n          \"ServerName\": \"\",\n          \"Attributes\": null,\n          \"BalancerAttributes\": null,\n          \"Metadata\": null\n        }\n      ],\n      \"Attributes\": null\n    }\n  ],\n  \"ServiceConfig\": null,\n  \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1139674,"caller":"grpc@v1.58.2/server.go:855","msg":"[core][Server #4 ListenSocket #7] ListenSocket created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1140597,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:180","msg":"[core][Channel #6] Channel switches to new LB policy \"round_robin\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1141448,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:298","msg":"[core][Channel #6 SubChannel #8] Subchannel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.114219,"caller":"base/balancer.go:182","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[]}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.114283,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #6] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.114657,"caller":"grpc/builder.go:115","msg":"Checking connection to collector"}
bjaeger_1    | {"level":"info","ts":1709919940.1146975,"caller":"grpc/builder.go:126","msg":"Agent collector connection state change","dialTarget":"localhost:14250","status":"CONNECTING"}
bjaeger_1    | {"level":"info","ts":1709919940.1147327,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #6 SubChannel #8] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.114866,"caller":"grpc@v1.58.2/clientconn.go:1453","msg":"[core][Channel #6 SubChannel #8] Subchannel picks a new address \"localhost:14250\" to connect","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1155682,"caller":"./main.go:259","msg":"Starting agent"}
bjaeger_1    | {"level":"info","ts":1709919940.1337605,"caller":"querysvc/query_service.go:134","msg":"Archive storage not created","reason":"archive storage not supported"}
bjaeger_1    | {"level":"info","ts":1709919940.1338425,"caller":"app/flags.go:144","msg":"Archive storage not initialized"}
bjaeger_1    | {"level":"info","ts":1709919940.1341832,"caller":"grpc@v1.58.2/server.go:667","msg":"[core][Server #9] Server created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1357527,"caller":"grpc@v1.58.2/clientconn.go:489","msg":"[core][Channel #10] Channel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1359253,"caller":"grpc@v1.58.2/clientconn.go:1839","msg":"[core][Channel #10] original dial target is: \"localhost:16685\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1360023,"caller":"grpc@v1.58.2/clientconn.go:1846","msg":"[core][Channel #10] parsed dial target is: {URL:{Scheme:localhost Opaque:16685 User: Host: Path: RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1361156,"caller":"grpc@v1.58.2/clientconn.go:1860","msg":"[core][Channel #10] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1367276,"caller":"grpc@v1.58.2/clientconn.go:1868","msg":"[core][Channel #10] parsed dial target is: {URL:{Scheme:passthrough Opaque: User: Host: Path:/localhost:16685 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1368327,"caller":"grpc@v1.58.2/clientconn.go:2001","msg":"[core][Channel #10] Channel authority set to \"localhost:16685\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1369681,"caller":"grpc@v1.58.2/resolver_conn_wrapper.go:246","msg":"[core][Channel #10] Resolver state updated: {\n  \"Addresses\": [\n    {\n      \"Addr\": \"localhost:16685\",\n      \"ServerName\": \"\",\n      \"Attributes\": null,\n      \"BalancerAttributes\": null,\n      \"Metadata\": null\n    }\n  ],\n  \"Endpoints\": [\n    {\n      \"Addresses\": [\n        {\n          \"Addr\": \"localhost:16685\",\n          \"ServerName\": \"\",\n          \"Attributes\": null,\n          \"BalancerAttributes\": null,\n          \"Metadata\": null\n        }\n      ],\n      \"Attributes\": null\n    }\n  ],\n  \"ServiceConfig\": null,\n  \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.138354,"caller":"app/agent.go:69","msg":"Starting jaeger-agent HTTP server","http-port":5778}
bjaeger_1    | {"level":"info","ts":1709919940.1385357,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:180","msg":"[core][Channel #10] Channel switches to new LB policy \"pick_first\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1386118,"caller":"grpc@v1.58.2/balancer_conn_wrappers.go:298","msg":"[core][Channel #10 SubChannel #11] Subchannel created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1386912,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #10] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.141504,"caller":"app/server.go:216","msg":"Query server started","http_addr":"[::]:16686","grpc_addr":"[::]:16685"}
bjaeger_1    | {"level":"info","ts":1709919940.141587,"caller":"healthcheck/handler.go:129","msg":"Health Check state change","status":"ready"}
bjaeger_1    | {"level":"info","ts":1709919940.14164,"caller":"app/server.go:299","msg":"Starting GRPC server","port":16685,"addr":":16685"}
bjaeger_1    | {"level":"info","ts":1709919940.1416924,"caller":"grpc@v1.58.2/server.go:855","msg":"[core][Server #9 ListenSocket #12] ListenSocket created","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1417608,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #10 SubChannel #11] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1419384,"caller":"grpc@v1.58.2/clientconn.go:1453","msg":"[core][Channel #10 SubChannel #11] Subchannel picks a new address \"localhost:16685\" to connect","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1457338,"caller":"app/server.go:280","msg":"Starting HTTP server","port":16686,"addr":":16686"}
bjaeger_1    | {"level":"info","ts":1709919940.1477973,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #6 SubChannel #8] Subchannel Connectivity change to READY","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1493998,"caller":"base/balancer.go:182","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[SubConn(id:8):{{Addr: \"localhost:14250\", ServerName: \"\", }}]}","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.149479,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #6] Channel Connectivity change to READY","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.1495337,"caller":"grpc/builder.go:126","msg":"Agent collector connection state change","dialTarget":"localhost:14250","status":"READY"}
bjaeger_1    | {"level":"info","ts":1709919940.1498861,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #10 SubChannel #11] Subchannel Connectivity change to READY","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.149905,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #10] Channel Connectivity change to READY","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.9206717,"caller":"grpc@v1.58.2/clientconn.go:1340","msg":"[core][Channel #1 SubChannel #2] Subchannel Connectivity change to IDLE, last error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:4317: connect: connection refused\"","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.9207869,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #1 SubChannel #2] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.9208593,"caller":"grpc@v1.58.2/clientconn.go:1453","msg":"[core][Channel #1 SubChannel #2] Subchannel picks a new address \"localhost:4317\" to connect","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.9232755,"caller":"grpc@v1.58.2/clientconn.go:1338","msg":"[core][Channel #1 SubChannel #2] Subchannel Connectivity change to READY","system":"grpc","grpc_log":true}
bjaeger_1    | {"level":"info","ts":1709919940.923309,"caller":"grpc@v1.58.2/clientconn.go:592","msg":"[core][Channel #1] Channel Connectivity change to READY","system":"grpc","grpc_log":true}
bproxysql_1  | 2024-03-08 17:45:40 [INFO] Using config file /test/proxysql/proxysql.cnf
bproxysql_1  | Renaming database file /var/lib/proxysql/proxysql.db
bproxysql_1  | 2024-03-08 17:45:40 [INFO] Current RLIMIT_NOFILE: 1048576
bproxysql_1  | 2024-03-08 17:45:40 [INFO] Using OpenSSL version: OpenSSL 3.1.0 14 Mar 2023
bproxysql_1  | 2024-03-08 17:45:40 [INFO] No SSL keys/certificates found in datadir (/var/lib/proxysql). Generating new keys/certificates.
bredis_1_1   | 1:C 08 Mar 2024 17:45:39.058 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
bredis_1_1   | 1:C 08 Mar 2024 17:45:39.058 # Redis version=6.2.7, bits=64, commit=00000000, modified=0, pid=1, just started
bredis_1_1   | 1:C 08 Mar 2024 17:45:39.058 # Configuration loaded
bredis_1_1   | 1:M 08 Mar 2024 17:45:39.063 # A key '__redis__compare_helper' was added to Lua globals which is not on the globals allow list nor listed on the deny list.
bredis_1_1   | 1:M 08 Mar 2024 17:45:39.065 # Server initialized
bredis_1_1   | 1:M 08 Mar 2024 17:45:39.065 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
bredis_2_1   | 1:C 08 Mar 2024 17:45:39.059 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
bredis_2_1   | 1:C 08 Mar 2024 17:45:39.059 # Redis version=6.2.7, bits=64, commit=00000000, modified=0, pid=1, just started
bredis_2_1   | 1:C 08 Mar 2024 17:45:39.059 # Configuration loaded
bredis_2_1   | 1:M 08 Mar 2024 17:45:39.069 # A key '__redis__compare_helper' was added to Lua globals which is not on the globals allow list nor listed on the deny list.
bredis_2_1   | 1:M 08 Mar 2024 17:45:39.069 # Server initialized
bredis_2_1   | 1:M 08 Mar 2024 17:45:39.069 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
bredis_3_1   | 1:C 08 Mar 2024 17:45:39.481 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
bredis_3_1   | 1:C 08 Mar 2024 17:45:39.482 # Redis version=6.2.7, bits=64, commit=00000000, modified=0, pid=1, just started
bredis_3_1   | 1:C 08 Mar 2024 17:45:39.482 # Configuration loaded
bredis_3_1   | 1:M 08 Mar 2024 17:45:39.491 # A key '__redis__compare_helper' was added to Lua globals which is not on the globals allow list nor listed on the deny list.
bredis_3_1   | 1:M 08 Mar 2024 17:45:39.491 # Server initialized
bredis_3_1   | 1:M 08 Mar 2024 17:45:39.491 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
bredis_4_1   | 1:C 08 Mar 2024 17:45:38.914 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
bredis_4_1   | 1:C 08 Mar 2024 17:45:38.914 # Redis version=6.2.7, bits=64, commit=00000000, modified=0, pid=1, just started
bredis_4_1   | 1:C 08 Mar 2024 17:45:38.914 # Configuration loaded
bredis_4_1   | 1:M 08 Mar 2024 17:45:38.924 # A key '__redis__compare_helper' was added to Lua globals which is not on the globals allow list nor listed on the deny list.
bredis_4_1   | 1:M 08 Mar 2024 17:45:38.924 # Server initialized
bredis_4_1   | 1:M 08 Mar 2024 17:45:38.924 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
boulder_1    |  * Starting enhanced syslogd rsyslogd
bproxysql_1  | 2024-03-08 17:45:41 [INFO] ProxySQL version 2.5.4-58-gd15b40a
bproxysql_1  | 2024-03-08 17:45:41 [INFO] Detected OS: Linux ddc20fc64330 5.15.0-92-generic #102-Ubuntu SMP Wed Jan 10 09:33:48 UTC 2024 x86_64
bproxysql_1  | 2024-03-08 17:45:41 [INFO] ProxySQL SHA1 checksum: 28fe7e3c3ed2652fa992a1646052a77cd4f317d2
bproxysql_1  | 2024-03-08 17:45:41 [INFO] SSL keys/certificates found in datadir (/var/lib/proxysql): loading them.
bproxysql_1  | 2024-03-08 17:45:41 [INFO] Loaded built-in SQLite3
bproxysql_1  | Standard ProxySQL MySQL Logger rev. 2.5.0421 -- MySQL_Logger.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | Standard ProxySQL Cluster rev. 0.4.0906 -- ProxySQL_Cluster.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | Standard ProxySQL Statistics rev. 1.4.1027 -- ProxySQL_Statistics.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | Standard ProxySQL HTTP Server Handler rev. 1.4.1031 -- ProxySQL_HTTP_Server.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | 2024-03-08 17:45:41 [INFO] Using UUID: 650c42be-7587-4332-b065-a967bb76f952 , randomly generated. Writing it to database
bproxysql_1  | 2024-03-08 17:45:41 [INFO] Computed checksum for 'LOAD ADMIN VARIABLES TO RUNTIME' was '0xD9A70D11108305FC', with epoch '1709919941'
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Computed checksum for 'LOAD MYSQL VARIABLES TO RUNTIME' was '0x501B482BBF74CB1C', with epoch '1709919942'
bproxysql_1  | Standard ProxySQL Admin rev. 2.0.6.0805 -- ProxySQL_Admin.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | 2024-03-08 17:45:42 [INFO] ProxySQL SHA1 checksum: 28fe7e3c3ed2652fa992a1646052a77cd4f317d2
bproxysql_1  | Standard MySQL Threads Handler rev. 0.2.0902 -- MySQL_Thread.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | Standard MySQL Authentication rev. 0.2.0902 -- MySQL_Authentication.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Computed checksum for 'LOAD MYSQL USERS TO RUNTIME' was '0xA0BE9614D9C9445B', with epoch '1709919942'
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Dumping mysql_servers_incoming
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | | hostgroup_id | hostname      | port | gtid_port | weight | status | compression | max_connections | max_replication_lag | use_ssl | max_latency_ms | comment |
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | | 0            | boulder-mysql | 3306 | 0         | 1      | 0      | 0           | 100             | 0                   | 0       | 200            |         |
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Dumping mysql_servers LEFT JOIN mysql_servers_incoming
bproxysql_1  | +-------------+--------------+----------+------+
bproxysql_1  | | mem_pointer | hostgroup_id | hostname | port |
bproxysql_1  | +-------------+--------------+----------+------+
bproxysql_1  | +-------------+--------------+----------+------+
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Dumping mysql_servers JOIN mysql_servers_incoming
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+-------------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | | hostgroup_id | hostname      | port | gtid_port | weight | status | compression | max_connections | max_replication_lag | use_ssl | max_latency_ms | comment | mem_pointer | gtid_port | weight | status | compression | max_connections | max_replication_lag | use_ssl | max_latency_ms | comment |
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+-------------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | | 0            | boulder-mysql | 3306 | 0         | 1      | 0      | 0           | 100             | 0                   | 0       | 200            |         | 0           | 0         | 1      | 0      | 0           | 100             | 0                   | 0       | 200            |         |
bproxysql_1  | +--------------+---------------+------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+-------------+-----------+--------+--------+-------------+-----------------+---------------------+---------+----------------+---------+
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Creating new server in HG 0 : boulder-mysql:3306 , gtid_port=0, weight=1, status=0
bproxysql_1  | 2024-03-08 17:45:42 [INFO] New mysql_group_replication_hostgroups table
bproxysql_1  | 2024-03-08 17:45:42 [INFO] New mysql_galera_hostgroups table
bproxysql_1  | 2024-03-08 17:45:42 [INFO] New mysql_aws_aurora_hostgroups table
bproxysql_1  | 2024-03-08 17:45:42 [INFO] New mysql_hostgroup_attributes table
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Checksum for table mysql_servers is 0xED00A16C7BF48C6A
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Rebuilding 'Hostgroup_Manager_Mapping' due to checksums change - mysql_servers { old: 0x0, new: 0xED00A16C7BF48C6A }, mysql_replication_hostgroups { old:0x0, new:0x0 }
bproxysql_1  | 2024-03-08 17:45:42 [INFO] MySQL_HostGroups_Manager::commit() locked for 43ms
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Computed checksum for 'LOAD PROXYSQL SERVERS TO RUNTIME' was '0x0000000000000000', with epoch '1709919942'
bproxysql_1  | Standard Query Processor rev. 2.0.6.0805 -- Query_Processor.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | 2024-03-08 17:45:42 [INFO] Computed checksum for 'LOAD MYSQL QUERY RULES TO RUNTIME' was '0xA910065FD7E28CBF', with epoch '1709919942'
bproxysql_1  | In memory Standard Query Cache (SQC) rev. 1.2.0905 -- Query_Cache.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | Standard MySQL Monitor (StdMyMon) rev. 2.0.1226 -- MySQL_Monitor.cpp -- Wed Jul 19 08:08:07 2023
bproxysql_1  | 2024-03-08 17:45:42 [INFO] For information about products and services visit: https://proxysql.com/
bproxysql_1  | 2024-03-08 17:45:42 [INFO] For online documentation visit: https://proxysql.com/documentation/
bproxysql_1  | 2024-03-08 17:45:42 [INFO] For support visit: https://proxysql.com/services/support/
bproxysql_1  | 2024-03-08 17:45:42 [INFO] For consultancy visit: https://proxysql.com/services/consulting/
boulder_1    |    ...done.
boulder_1    | Fri Mar  8 17:45:42 UTC 2024 - still trying to connect to boulder-mysql:3306
boulder_1    | Fri Mar  8 17:45:43 UTC 2024 - still trying to connect to boulder-mysql:3306
boulder_1    | Fri Mar  8 17:45:44 UTC 2024 - still trying to connect to boulder-mysql:3306
boulder_1    | Fri Mar  8 17:45:45 UTC 2024 - still trying to connect to boulder-mysql:3306
boulder_1    | Connected to boulder-mysql:3306
boulder_1    | Connected to bproxysql:6032
boulder_1    |
boulder_1    | boulder_sa_test
boulder_1    | Doesn't exist - creating
boulder_1    | Applied 2 migrations
boulder_1    | ERROR 1146 (42S02) at line 36: Table 'boulder_sa_test.revokedCertificates' doesn't exist
boulder_1    | ERROR 1146 (42S02) at line 56: Table 'boulder_sa_test.revokedCertificates' doesn't exist
boulder_1    | Added users from ../db-users/boulder_sa.sql
boulder_1    |
boulder_1    | boulder_sa_integration
boulder_1    | Doesn't exist - creating
boulder_1    | Applied 2 migrations
boulder_1    | ERROR 1146 (42S02) at line 36: Table 'boulder_sa_integration.revokedCertificates' doesn't exist
boulder_1    | ERROR 1146 (42S02) at line 56: Table 'boulder_sa_integration.revokedCertificates' doesn't exist
boulder_1    | Added users from ../db-users/boulder_sa.sql
boulder_1    |
boulder_1    | incidents_sa_test
boulder_1    | Doesn't exist - creating
boulder_1    | Applied 1 migration
boulder_1    | Added users from ../db-users/incidents_sa.sql
boulder_1    |
boulder_1    | incidents_sa_integration
boulder_1    | Doesn't exist - creating
boulder_1    | Applied 1 migration
boulder_1    | Added users from ../db-users/incidents_sa.sql
boulder_1    |
boulder_1    | database setup complete
...
...
jsha commented 7 months ago

FWIW, here are the versions of the docker packages I have installed:

$ dpkg -l | grep docker
ii  docker-buildx-plugin                           0.12.1-1~ubuntu.23.10~mantic               amd64        Docker Buildx cli plugin.
ii  docker-ce                                      5:25.0.3-1~ubuntu.23.10~mantic             amd64        Docker: the open-source application container engine
ii  docker-ce-cli                                  5:25.0.3-1~ubuntu.23.10~mantic             amd64        Docker CLI: the open-source application container engine
ii  docker-ce-rootless-extras                      5:25.0.3-1~ubuntu.23.10~mantic             amd64        Rootless support for Docker.
ii  docker-compose-plugin                          2.24.6-1~ubuntu.23.10~mantic               amd64        Docker Compose (V2) plugin for the Docker CLI.

Those are from the Docker specific APT repository, which I installed using instructions from https://docs.docker.com/desktop/install/linux-install/.
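For reference, setting up that repository and installing the packages boils down to roughly the following (a sketch based on Docker's Ubuntu instructions; check the linked docs for the current canonical steps):

```shell
# Add Docker's official GPG key and APT repository (sketch; see
# the Docker docs for the up-to-date version of these steps)
sudo apt-get update
sudo apt-get install -y ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \
  https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# Install docker-ce and the Compose V2 plugin
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io \
  docker-buildx-plugin docker-compose-plugin
```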

Also, we've moved to the docker compose subcommand that is now part of Docker itself, instead of the standalone docker-compose Python script.
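For anyone following along, the V2 invocation differs only in the command name (a sketch; assumes docker-compose-plugin from the packages above is installed):

```shell
# Old: standalone Python docker-compose (V1), no longer used by Boulder
docker-compose up -d

# New: Compose V2, a subcommand of the docker CLI itself
docker compose up -d

# Confirm which Compose you are actually running
docker compose version
```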

igolman commented 7 months ago

@jsha Thanks for your feedback!

I confirm that switching from the docker.io packages provided by Ubuntu to the official docker-ce packages fixed the issue.

From my side this can be closed. It may be worth emphasizing in README.md that the docker-ce package is an explicit requirement.