hashicorp / nomad

Nomad is an easy-to-use, flexible, and performant workload orchestrator that can deploy a mix of microservice, batch, containerized, and non-containerized applications. Nomad is easy to operate and scale and has native Consul and Vault integrations.
https://www.nomadproject.io/

test flake in TestAutopilot_RollingUpdate #13931

Closed · tgross closed this issue 2 years ago

tgross commented 2 years ago

Example run: https://app.circleci.com/pipelines/github/hashicorp/nomad/28734/workflows/9c27bd15-423e-437e-be7e-eaf456d23624/jobs/317342. CircleCI has identified this test as flaky, with multiple failures on `main` in the last 30 days.

test logs > Failed (excerpt)

=== RUN   TestAutopilot_RollingUpdate
[... plugin_loader shutdown, scheduling-worker, and keyring replicator TRACE/DEBUG lines from nodes 063-067 elided ...]
node-068 2022-07-27T11:17:24.637Z [INFO] raft@v1.3.5/api.go:570: nomad.raft: initial configuration: index=0 servers=[]
node-068 2022-07-27T11:17:24.637Z [INFO] raft@v1.3.5/raft.go:152: nomad.raft: entering follower state: follower="Node at 127.0.0.1:9136 [Follower]" leader=
node-069 2022-07-27T11:17:24.640Z [INFO] raft@v1.3.5/raft.go:152: nomad.raft: entering follower state: follower="Node at 127.0.0.1:9138 [Follower]" leader=
node-070 2022-07-27T11:17:24.643Z [INFO] raft@v1.3.5/raft.go:152: nomad.raft: entering follower state: follower="Node at 127.0.0.1:9140 [Follower]" leader=
node-070 2022-07-27T11:17:24.648Z [INFO] nomad/serf.go:225: nomad: found expected number of peers, attempting to bootstrap cluster...: peers="127.0.0.1:9140,127.0.0.1:9138,127.0.0.1:9136"
node-070 2022-07-27T11:17:24.711Z [WARN] raft@v1.3.5/raft.go:217: nomad.raft: heartbeat timeout reached, starting election: last-leader=
node-070 2022-07-27T11:17:24.711Z [INFO] raft@v1.3.5/raft.go:255: nomad.raft: entering candidate state: node="Node at 127.0.0.1:9140 [Candidate]" term=2
node-070 2022-07-27T11:17:24.711Z [INFO] raft@v1.3.5/raft.go:297: nomad.raft: election won: tally=2
node-070 2022-07-27T11:17:24.711Z [INFO] raft@v1.3.5/raft.go:369: nomad.raft: entering leader state: leader="Node at 127.0.0.1:9140 [Leader]"
node-070 2022-07-27T11:17:24.712Z [INFO] nomad/leader.go:73: nomad: cluster leadership acquired
    autopilot_test.go:224: adding server s4
node-071 2022-07-27T11:17:24.725Z [INFO] raft@v1.3.5/raft.go:152: nomad.raft: entering follower state: follower="Node at 127.0.0.1:9142 [Follower]" leader=
node-071 2022-07-27T11:17:24.727Z [INFO] nomad/serf.go:189: nomad: disabling bootstrap mode because existing Raft peers being reported by peer: peer_name=nomad-068.global peer_address=127.0.0.1:9136
node-071 2022-07-27T11:17:24.810Z [WARN] raft@v1.3.5/raft.go:205: nomad.raft: no known peers, aborting election
[... serf membership, keyring replication, and worker status TRACE/DEBUG lines elided ...]
    autopilot_test.go:230:
    autopilot_test.go:231: didn't find map[89cdb0a1-dab4-c1d2-09d2-7ce7e2c111f9:true] in []raft.ServerID{"c61f03dc-5505-9fbc-7d27-c7cf83915347", "d7f46dd1-5d25-3c67-8a91-ac25c836c2fe", "22e1a9ff-a48b-4a15-eac6-83c02a5e92ae"}
node-071 2022-07-27T11:17:31.742Z [INFO] nomad/server.go:665: nomad: shutting down server
node-070 2022-07-27T11:17:31.743Z [INFO] nomad/server.go:665: nomad: shutting down server
node-070 2022-07-27T11:17:31.743Z [INFO] nomad/leader.go:86: nomad: cluster leadership lost
node-069 2022-07-27T11:17:31.743Z [INFO] nomad/server.go:665: nomad: shutting down server
node-068 2022-07-27T11:17:31.743Z [INFO] nomad/server.go:665: nomad: shutting down server
--- FAIL: TestAutopilot_RollingUpdate (7.11s)
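In the excerpt, the newly added server s4 (apparently node-071) joins the gossip pool, but its Raft ID (89cdb0a1-...) never shows up in the Raft configuration before the assertion at autopilot_test.go:231 fires, so the check only sees the original three voters. The sketch below is not Nomad's test code; it just illustrates, with a hypothetical `fetchRaftServerIDs` helper, the wait-and-retry pattern typically used to make this kind of membership assertion tolerant of Raft replication timing.

```go
package main

import (
	"fmt"
	"time"
)

// fetchRaftServerIDs is a hypothetical stand-in for however the test reads the
// current Raft peer set (for example via (*raft.Raft).GetConfiguration in
// hashicorp/raft). Here it always returns the three original voters.
func fetchRaftServerIDs() []string {
	return []string{
		"c61f03dc-5505-9fbc-7d27-c7cf83915347",
		"d7f46dd1-5d25-3c67-8a91-ac25c836c2fe",
		"22e1a9ff-a48b-4a15-eac6-83c02a5e92ae",
	}
}

// waitForRaftPeer polls the peer set until wantID appears or the timeout
// elapses, rather than asserting on a single snapshot that can race with a
// configuration change still being replicated.
func waitForRaftPeer(wantID string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		for _, id := range fetchRaftServerIDs() {
			if id == wantID {
				return nil
			}
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("server %s never appeared in the Raft configuration", wantID)
		}
		time.Sleep(100 * time.Millisecond)
	}
}

func main() {
	// With the placeholder above this times out, mirroring the failure mode
	// seen in the CI run.
	if err := waitForRaftPeer("89cdb0a1-dab4-c1d2-09d2-7ce7e2c111f9", 2*time.Second); err != nil {
		fmt.Println(err)
	}
}
```

In the actual test the same idea would presumably be expressed with the suite's existing retry helpers rather than a bare sleep loop; the point is only that the membership check needs to be retried until the rolling update settles.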

This test exercises similar code paths to https://github.com/hashicorp/nomad/issues/13930

github-actions[bot] commented 1 year ago

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.