hashicorp / nomad

Nomad is an easy-to-use, flexible, and performant workload orchestrator that can deploy a mix of microservice, batch, containerized, and non-containerized applications. Nomad is easy to operate and scale and has native Consul and Vault integrations.
https://www.nomadproject.io/

Template failed to send signals [user defined signal 1]: 1 error(s) occurred: * Task not running #5459

Open prologic opened 5 years ago

prologic commented 5 years ago

Using Nomad 0.9.0-beta3 here.

I have a haproxy task that I'm trying to restart on template changes (from Consul Template) but the entire task/group fails with:

Template failed to send signals [user defined signal 1]: 1 error(s) occurred: * Task not running

And the job remains in a failed state.

Job spec looks something like this:

job "haproxy" {
  datacenters = ["dc1"]
  type = "system"

  update {
    max_parallel     = 1
    min_healthy_time = "30s"
    auto_revert      = true
  }

  group "haproxy" {
    restart {
      interval = "6m"
      attempts = 10
      delay    = "30s"
      mode     = "delay"
    }

    task "haproxy" {
      driver = "exec"

      artifact {
        source      = "http://packages.local:8000/haproxy-v1.9.4.tar.gz"
        destination = "/"
        options {
          archive = "tar.gz"
        }
      }

      template {
        source = "/nomad/conf/configs/haproxy/haproxy.cfg.tpl"
        destination = "/etc/haproxy/haproxy.cfg"
        change_mode   = "signal"
        change_signal = "SIGUSR1"
      }

      config {
        command = "usr/local/sbin/haproxy"
        args    = ["-f", "/etc/haproxy/haproxy.cfg"]
      }

      service {
        name = "haproxy"
        port = "xxx"
        check {
          name = "alive"
          type = "tcp"
          interval = "10s"
          timeout = "2s"
        }
      }

      resources {
        cpu = 200
        memory = 128
        network {
          mbits = 20
          port "xxx" {
            static = 4443
          }
        }
      }
    }
  }
}

Couple of questions:

prologic commented 5 years ago

Full nomad alloc status logs:

Recent Events:
Time                  Type                   Description
2019-03-22T01:13:27Z  Killing                Template failed to send signals [user defined signal 1]: 1 error(s) occurred:

* Task not running
2019-03-22T01:13:23Z  Restarting             Task restarting in 31.906118053s
2019-03-22T01:13:23Z  Terminated             Exit Code: 0
2019-03-22T01:13:23Z  Signaling              Template re-rendered
2019-03-22T01:12:32Z  Started                Task started by client
2019-03-22T01:12:31Z  Downloading Artifacts  Client is downloading artifacts
2019-03-22T01:12:31Z  Task Setup             Building Task Directory
2019-03-22T01:12:31Z  Received               Task received by client
prologic commented 5 years ago

The overall state also seems wrong; the UI is showing "Running" but all task groups are in a "failed" state and not going anywhere.

prologic commented 5 years ago

In fact the Job and its tasks just continue to fail even after a restart:

ID                  = 2dd17b65
Eval ID             = 61ea2f42
Name                = haproxy.haproxy[0]
Node ID             = 37cb9dd1
Job ID              = haproxy
Job Version         = 2
Client Status       = failed
Client Description  = Failed tasks
Desired Status      = run
Desired Description = <none>
Created             = 3m33s ago
Modified            = 3m4s ago

Task "haproxy" is "dead"
Task Resources
CPU        Memory           Disk     Addresses
0/200 MHz  6.9 MiB/128 MiB  300 MiB  haproxy_api: 10.0.64.160:4443
                                     haproxy_cloud_admin: 10.0.64.160:7559
                                     haproxy_mcfe: 10.0.64.160:443
                                     haproxy_reporter: 10.0.64.160:5555

Task Events:
Started At     = 2019-03-22T01:48:15Z
Finished At    = 2019-03-22T01:48:44Z
Total Restarts = 1
Last Restart   = 2019-03-22T01:48:43Z

Recent Events:
Time                  Type                   Description
2019-03-22T01:48:44Z  Killing                Template failed to send signals [user defined signal 1]: 1 error(s) occurred:

* Task not running
2019-03-22T01:48:43Z  Restarting             Task restarting in 34.83377889s
2019-03-22T01:48:43Z  Terminated             Exit Code: 0
2019-03-22T01:48:43Z  Signaling              Template re-rendered
2019-03-22T01:48:15Z  Started                Task started by client
2019-03-22T01:48:14Z  Downloading Artifacts  Client is downloading artifacts
2019-03-22T01:48:14Z  Task Setup             Building Task Directory
2019-03-22T01:48:14Z  Received               Task received by client

I don't actually expect the template to have changed in this case (the cluster is stable) -- but it re-renders anyway?

prologic commented 5 years ago

Changing to change_mode = "restart" seems to work for me as a workaround. I'd still love to dig into why change_mode = "signal" is so brittle here, but alas I don't have the bandwidth today, so filing here for visibility.
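For anyone else hitting this, the workaround amounts to this change in the template stanza (a sketch based on the job spec above; same paths assumed):

```hcl
template {
  source      = "/nomad/conf/configs/haproxy/haproxy.cfg.tpl"
  destination = "/etc/haproxy/haproxy.cfg"

  # Restart the task on re-render instead of signaling it,
  # which avoids the fatal "Task not running" signal failure.
  change_mode = "restart"
}
```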

notnoop commented 5 years ago

That's quite intriguing - I'm investigating this now. Mind posting the haproxy config file as well? I haven't been able to reproduce the "Task not running" part. Also, I assume this is running on CentOS 7 as well?

It's very peculiar that the task was terminated immediately when the template was re-rendered but the event about signaling failure occurred 4 seconds later:

2019-03-22T01:13:27Z  Killing                Template failed to send signals [user defined signal 1]: 1 error(s) occurred:

* Task not running
2019-03-22T01:13:23Z  Restarting             Task restarting in 31.906118053s
2019-03-22T01:13:23Z  Terminated             Exit Code: 0
2019-03-22T01:13:23Z  Signaling              Template re-rendered

The overall state also seems wrong; the UI is showing "Running" but all task groups are in a "failed" state and not going anywhere.

This is the confusing UX problem raised in https://github.com/hashicorp/nomad/issues/5408#issuecomment-475083537 . When an alloc fails and is expected to be restarted (note the "Task restarting in..." event), the overall job is marked as running.

stale[bot] commented 5 years ago

Hey there

Since this issue hasn't had any activity in a while - we're going to automatically close it in 30 days. If you're still seeing this issue with the latest version of Nomad, please respond here and we'll keep this open and take another look at this.

Thanks!

stale[bot] commented 5 years ago

This issue will be auto-closed because there hasn't been any activity for a few months. Feel free to open a new one if you still experience this problem :+1:

pashinin commented 5 years ago

Nomad 0.9.3 - same problem:

Killing | Template failed to send signals [hangup]: 1 error(s) occurred: * Task not running

After a few failures the task can start, but it's annoying that it fails with this error several times first.

peimanja commented 5 years ago

Same issue here using Nomad 0.9.3:

Template failed to send signals [hangup]: 1 error(s) occurred: * Task not running

Two out of three allocations fail with this error; one runs.

habnabit commented 5 years ago

It's here: https://github.com/hashicorp/nomad/blob/ffb83e1ef182e04b8f625112cfe5cbaf1f314e08/client/allocrunner/taskrunner/template/template.go#L462-L465

If the signal fails to send for any reason (including because the task isn't running yet) then the task is marked as failed.

Maybe a new issue should be opened for this.

fffonion commented 4 years ago

We are using nginx and consul-template for service discovery and ran into the same problem: while the nginx Nomad job is starting, the service index in Consul updates, resulting in several signals being sent. This makes the nginx Nomad job impossible to start.

luckymike commented 4 years ago

Ran into this with many consul templates (nginx configs) on 0.11.3.

jorgemarey commented 3 years ago

Hi, we encountered this same problem. It seems that if the template makes the task fail (a formatting problem or something similar that kills the process), fixing the template doesn't make the allocation start again, because Nomad can't apply the configuration change while the allocation is not running. Could this be a collision between the template signaling and the restart behaviour?

When the template render kicks in, the task is set to be killed (it already is), and Nomad doesn't try to restart it again.

r3knit commented 3 years ago

Hi, we encountered the same problem on 1.0.3. When we start our cluster from scratch, all our nginx jobs with consul-templated configs have errors, are marked failed, and show the same event list.

We found that the problem occurs only when several dependent tasks start at the same time, so we put some delay between them to avoid this.

That's a bit annoying, and I don't like reds on the job's dashboard :)

We hope you will manage this somehow. For example, why not make Nomad ignore this kind of error, or not let templates send signals to tasks that aren't running yet?

tgross commented 3 years ago

Dropping a note here that came from an observation in a support case; the problem appears to be associated with the template rendering happening when a task is stopped waiting for the restart.delay window.

josegonzalez commented 3 years ago

Just dropping a note that I'm seeing this with the reference Nginx Tutorial here against the latest stable release of Nomad and Consul in my lab environment (amd64 instances): https://learn.hashicorp.com/tutorials/nomad/load-balancing-nginx

manicminer commented 2 years ago

Just another note that I'm encountering this with Nomad v1.1.6 (b83d623fb5ff475d5e40df21e9e7a61834071078)

SunSparc commented 2 years ago

Still happening in Nomad 1.3.0-rc.1. Different signal, same problem. (Coincidentally, also with HAProxy.) At the very least, I think that the template rendering failure should not be fatal for the task/group/job.

Here are the "Recent Events" from a task that suffered from this today:

Time    Type    Description
May 18, '22 16:48:28 -0600  Killing Sent interrupt. Waiting 5s before force killing
May 18, '22 16:48:28 -0600  Not Restarting  Error was unrecoverable
May 18, '22 16:48:28 -0600  Killing Template failed to send signals [user defined signal 2]: 1 error occurred: * Task not running
May 18, '22 16:46:16 -0600  Restarting  Task restarting in 0s
May 18, '22 16:46:16 -0600  Terminated  Exit Code: 137, Signal: 9
May 18, '22 16:46:11 -0600  Restart Signaled    Template with change_mode restart re-rendered
May 18, '22 16:41:15 -0600  Signaling   Template re-rendered
May 18, '22 16:36:10 -0600  Started Task started by client
May 18, '22 16:36:09 -0600  Task Setup  Building Task Directory
May 18, '22 16:36:03 -0600  Received    Task received by client

I have not yet found a way to reproduce the problem consistently. If I do, I will post my findings.

axsuul commented 2 years ago

Also seeing this with 1.3.1: if this happens to a job, even with

restart {
  attempts = 5
  delay = "5s"
  mode = "delay"
  interval = "30s"
}

Nomad will actually kill the task and not attempt to restart it again. I see this as a bug and unexpected behavior.

$ nomad alloc status 1854d2d9-f0fb-9c78-15c6-7c7aedf390ba
Recent Events:
Time                       Type        Description
2022-08-06T15:49:17-07:00  Killing     Template failed to send signals [hangup]: 1 error occurred:
        * Task not running
2022-08-06T15:49:11-07:00  Restarting  Task restarting in 5.669470174s
2022-08-06T15:49:11-07:00  Terminated  Exit Code: 1, Exit Message: "Docker container exited with non-zero exit code: 1"
2022-08-06T15:49:11-07:00  Started     Task started by client
2022-08-06T15:49:10-07:00  Restarting  Task restarting in 5.422057036s
2022-08-06T15:49:10-07:00  Terminated  Exit Code: 1, Exit Message: "Docker container exited with non-zero exit code: 1"
2022-08-06T15:49:10-07:00  Started     Task started by client
2022-08-06T15:49:05-07:00  Driver      Downloading image
2022-08-06T15:49:04-07:00  Task Setup  Building Task Directory
2022-08-06T15:49:04-07:00  Received    Task received by client
mr-karan commented 1 year ago

If the task isn't running, then shouldn't the task be allowed to be retried as defined in the task's restart stanza? In that case, I feel https://github.com/hashicorp/nomad/blob/main/client/allocrunner/taskrunner/template/template.go#L517-L520 is an unnecessary action: it kills the task even when the task is defined with a restart.delay param.

Blefish commented 1 year ago

Had the same problem today: when the virtual machines were starting, Nomad attempted to send SIGHUP for the template because it thought the process was still running, but it actually was not. The task then went into "failed" and Nomad never tried to start the job at all.

fsiler commented 1 year ago

I'm having this problem (nginx with a dynamic template). It sure would be nice if there were some pattern or code fix to work around this.

Blefish commented 1 year ago

Has anyone found any workarounds to this problem? I'd love to be able to use signals for configuration updates without downtime, but Nomad marking the job as failed when rendering/signalling fails is blocking this.

Blefish commented 10 months ago

I'm suspecting this has something to do with the fact that the jobs are of type 'system', so they are not configurable via the usual reschedule {} stanza. On node boot-up, if the signal is sent before Nomad registers itself with Consul, the job is properly restarted after the logs print "client node registered". I think the node is re-evaluated and so the system jobs get placed again.

Blefish commented 10 months ago

A potential workaround that I am exploring at the moment:

template {
  data = <<EOF
#!/bin/sh
echo "Reloading pid 1"
kill -HUP 1
EOF
  destination  = "/local/reload.sh"
  perms        = "777"
}

template {
  data = <<EOF
{{ key "config" }}
EOF
  destination = "/local/config.cfg"
  change_mode = "script"
  change_script {
    command = "/local/reload.sh"
  }
}

According to the documentation, it should not kill the task if script fails: https://developer.hashicorp.com/nomad/docs/job-specification/change_script#fail_on_error

Edit: I've been running this for close to a month and it works fine; allocations always start up when the VM boots.

Konubinix commented 9 months ago

Same here, the workaround with change_script appears to have been working ok for two weeks, much longer than I could experience with signal.

Note that you cannot use variable interpolation here, the change_script must be "/local/reload.sh" and not "${NOMAD_TASK_DIR}/reload.sh"
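In other words, the difference is (a sketch using the paths from the snippet above):

```hcl
change_script {
  # Works: literal path
  command = "/local/reload.sh"

  # Does not work: variable interpolation is not resolved here
  # command = "${NOMAD_TASK_DIR}/reload.sh"
}
```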

EugenKon commented 2 months ago

I have the exact issue mentioned in https://github.com/hashicorp/nomad/issues/5459#issuecomment-475767591 and I seem to have exact steps to reproduce it:

  1. Create a job for your application and nginx.
  2. The nginx task depends on the app, because the nginx template depends on the application job.

    job "portal" {
      type = "service"

      group "nginx" {
        count = 1

        restart {
          attempts = 5
          delay    = "15s"
          interval = "3m"
          mode     = "delay"
        }

        task "nginx-task" {
          driver = "docker"
          config {
            volumes = [
              "custom/nginx.conf:/etc/nginx/sites-enabled/internal",
            ]
          }
          template {
            source        = "/xxx/internal.nginx.conf.tpl"
            destination   = "custom/nginx.conf"
            change_mode   = "signal"
            change_signal = "SIGHUP"
          }
        }
      }

      group "api" {
        count = 1

        restart {
          attempts = 5
          delay    = "15s"
          interval = "3m"
          mode     = "delay"
        }

        task "api" { ... }
      }
    }

    The nginx.conf template contains:

    {{- $has_api := service "api" -}}
    {{- if $has_api }}
    upstream api {
    {{- range $has_api }}
    server {{ .Address }}:{{ .Port }};
    {{- end }}
    
    keepalive 32;
    }
    {{- end -}}
  3. Kill the api task with docker stop api-task-XXXX
  4. I expect that nginx.conf will be regenerated and SIGHUP issued to the nginx task
  5. Result:
    Jul 17, '24 15:37:58 -0400    Killing     Template failed to send signals [hangup]: 1 error occurred: * Task not running
    Jul 17, '24 15:37:41 -0400    Restarting  Task restarting in 18s
    Jul 17, '24 15:37:41 -0400    Terminated  Exit Code: 129, Exit Message: "Docker container exited with non-zero exit code: 129"
    Jul 17, '24 15:37:40 -0400    Signaling   Template re-rendered

Here we see the expected 'hangup' signal, but the later signal attempt fails because it arrives while the task is not running. And we see an unexpected 'Terminated' event; nginx should not have been terminated.

Let's see what happens during docker stop api-task-xxx. nginx is stopped unexpectedly:

Jul 17 19:30:52 ip-172-31-6-94 dockerd[1445]: time="2024-07-17T19:30:52.762755889Z" level=info msg="ignoring event" container=9c4de1228188b214db797f107c1eeda54b35c51741f7d3aa4771664d05d7ba76 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 17 19:31:02 ip-172-31-6-94 dockerd[1445]: time="2024-07-17T19:31:02.555128397Z" level=info msg="ignoring event" container=feb8828be81b6aa1e07ee4b0555490fb1e5f320863d914851100dda4419e15ca module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"

Here is the docker ps output:

docker ps
CONTAINER ID   IMAGE                        COMMAND                  CREATED              STATUS              PORTS                                                                                                                                                                            NAMES
feb8828be81b   xxx   nginx-task-541cadde-da2e-7542-cf81-0cadf2520c20
9c4de1228188   xxx   api-task-7d84118c-cff5-16fe-78f9-855342828b86

When the system comes back to a normal state, we see that the nginx-task container ID has changed, but api-task's stays the same.

docker ps
CONTAINER ID   IMAGE                        COMMAND                  CREATED              STATUS              PORTS                                                                                                                                                                            NAMES
e1897895f288   xxx nginx-task-564f7092-755c-1e5c-5fa3-e44b3036eb99
4b81a28522f6   xxx api-task-7d84118c-cff5-16fe-78f9-855342828b86
Nomad logs:

```
Jul 17 19:37:28 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:28.558Z [DEBUG] http: request complete: method=GET path=/v1/agent/health?type=client duration="183.696µs"
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.350Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Terminated msg="Exit Code: 0" failed=false
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.351Z [DEBUG] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.353Z [INFO] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker plugin=/usr/bin/nomad id=599855
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.353Z [DEBUG] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.450Z [DEBUG] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:04b0c06643fcd5531f16c105900310df3e7bc32e4ab34241bb7a244b07d8704b references=1
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.450Z [INFO] client.alloc_runner.task_runner: restarting task: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task reason="Restart within policy" delay=18.306271133s
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.450Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Restarting msg="Task restarting in 18.306271133s" failed=false
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.456Z [DEBUG] agent: (runner) receiving dependency health.service(api|passing)
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.457Z [DEBUG] agent: (runner) initiating run
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.457Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.457Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.458Z [DEBUG] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.458Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.459Z [DEBUG] agent: (runner) diffing and updating dependencies
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.459Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.459Z [DEBUG] agent: (runner) health.service(api|passing) is still needed
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.459Z [DEBUG] agent: (runner) watching 2 dependencies
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.459Z [DEBUG] agent: (runner) all templates rendered
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.580Z [DEBUG] client: updated allocations: index=12820 total=110 pulled=31 filtered=79
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.580Z [DEBUG] client: allocation updates: added=0 removed=0 updated=31 ignored=79
Jul 17 19:37:32 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:32.660Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.458Z [DEBUG] agent: (runner) received template "7376f9264838c77bd997077ede642b82" from quiescence
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.458Z [DEBUG] agent: (runner) initiating run
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.458Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.458Z [DEBUG] agent: (runner) rendering "/xxx.nginx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/internal.nginx.conf"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.458Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) diffing and updating dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) health.service(api|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) watching 2 dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) all templates rendered
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) enabling global quiescence for "7376f9264838c77bd997077ede642b82"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) received template "78729eba1e2b937878cf101798d434d8" from quiescence
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.459Z [DEBUG] agent: (runner) initiating run
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.460Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.460Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.461Z [DEBUG] agent: (runner) rendering "xxx.com.nginx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/walk-inside.com.nginx.conf"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.465Z [INFO] agent: (runner) rendered "com.nginx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/walk-inside.com.nginx.conf"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.465Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) diffing and updating dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) health.service(api|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) watching 2 dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) all templates rendered
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) enabling global quiescence for "78729eba1e2b937878cf101798d434d8"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) received template "bc46e1048e8296b5c2aa24476a8c36af" from quiescence
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) initiating run
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.466Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.467Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.467Z [DEBUG] agent: (runner) rendering "(dynamic)" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/local/env"
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.467Z [DEBUG] agent: (runner) diffing and updating dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.467Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.467Z [DEBUG] agent: (runner) health.service(api|passing) is still needed
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.468Z [DEBUG] agent: (runner) watching 2 dependencies
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.468Z [DEBUG] agent: (runner) all templates rendered
Jul 17 19:37:37 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:37.468Z [DEBUG] agent: (runner) enabling global quiescence for "bc46e1048e8296b5c2aa24476a8c36af"
Jul 17 19:37:38 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:38.559Z [DEBUG] http: request complete: method=GET path=/v1/agent/health?type=client duration="193.114µs"
Jul 17 19:37:40 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:40.951Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task type=Signaling msg="Template re-rendered" failed=false
Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.299Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task type=Terminated msg="Exit Code: 129, Exit Message: \"Docker container exited with non-zero exit code: 129\"" failed=false
Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.301Z [DEBUG] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
```
err="rpc error: code = Unavailable desc = error reading from server: EOF" Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.304Z [INFO] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker plugin=/usr/bin/nomad id=600295 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker plugin=/usr/bin/nomad id=600295 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.305Z [DEBUG] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: plugin exited: driver=docker Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.353Z [DEBUG] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:10088cc6f54c33a3271ae011b14ebb4fa1b945679eafcabbcf8810e8b2213921 references=0 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:10088cc6f54c33a3271ae011b14ebb4fa1b945679eafcabbcf8810e8b2213921 references=0 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner: restarting task: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task reason="Restart within policy" delay=17.506325836s Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.353Z [INFO] client.alloc_runner.task_runner: restarting task: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task reason="Restart within policy" delay=17.506325836s Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.353Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task type=Restarting msg="Task restarting in 17.506325836s" failed=false Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner: Task event: alloc_id=564f7092-755c-1e5c-5fa3-e44b3036eb99 task=nginx-task type=Restarting 
msg="Task restarting in 17.506325836s" failed=false Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.365Z [DEBUG] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.432Z [DEBUG] client: updated allocations: index=12821 total=110 pulled=31 filtered=79 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client: updated allocations: index=12821 total=110 pulled=31 filtered=79 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.432Z [DEBUG] client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:41.508Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0 Jul 17 19:37:41 ip-172-31-6-94 nomad[1757]: client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.467Z [DEBUG] agent: (runner) received template "7376f9264838c77bd997077ede642b82" from quiescence Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.467Z [DEBUG] agent: (runner) initiating run Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.467Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) received template "7376f9264838c77bd997077ede642b82" from quiescence Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) initiating run Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.467Z [DEBUG] agent: (runner) rendering "inx.conf.tpl" => 
"/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/internal.nginx.conf" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.468Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) rendering "inx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/internal.nginx.conf" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.468Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) diffing and updating dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) diffing and updating dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) watching 2 dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) all templates rendered Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) enabling global quiescence for "7376f9264838c77bd997077ede642b82" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) received template "78729eba1e2b937878cf101798d434d8" from quiescence Jul 17 19:37:42 ip-172-31-6-94 
nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) initiating run Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) watching 2 dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) all templates rendered Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) enabling global quiescence for "7376f9264838c77bd997077ede642b82" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) received template "78729eba1e2b937878cf101798d434d8" from quiescence Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) initiating run Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.469Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) rendering "com.nginx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/walk-inside.com.nginx.conf" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) rendering "com.nginx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/walk-inside.com.nginx.conf" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:42 
ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) diffing and updating dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) diffing and updating dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) watching 2 dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) all templates rendered Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:42.470Z [DEBUG] agent: (runner) enabling global quiescence for "78729eba1e2b937878cf101798d434d8" Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) watching 2 dependencies Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) all templates rendered Jul 17 19:37:42 ip-172-31-6-94 nomad[1757]: agent: (runner) enabling global quiescence for "78729eba1e2b937878cf101798d434d8" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.470Z [DEBUG] agent: (runner) received template "7376f9264838c77bd997077ede642b82" from quiescence Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.470Z [DEBUG] agent: (runner) initiating run Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.470Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) received template "7376f9264838c77bd997077ede642b82" from quiescence Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) initiating run 
Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.471Z [DEBUG] agent: (runner) rendering "inx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/internal.nginx.conf" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.471Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) rendering "inx.conf.tpl" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/custom/internal.nginx.conf" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.471Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) diffing and updating dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) diffing and updating dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) watching 2 dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) all templates rendered Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) enabling global quiescence for "7376f9264838c77bd997077ede642b82" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] 
agent: (runner) received template "bc46e1048e8296b5c2aa24476a8c36af" from quiescence Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) initiating run Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) watching 2 dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) all templates rendered Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) enabling global quiescence for "7376f9264838c77bd997077ede642b82" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.472Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) received template "bc46e1048e8296b5c2aa24476a8c36af" from quiescence Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) initiating run Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) rendering "(dynamic)" => "/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/local/env" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) rendering "(dynamic)" => 
"/opt/nomad/data/alloc/564f7092-755c-1e5c-5fa3-e44b3036eb99/nginx-task/local/env" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) diffing and updating dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) diffing and updating dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) watching 2 dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) all templates rendered Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:47.473Z [DEBUG] agent: (runner) enabling global quiescence for "bc46e1048e8296b5c2aa24476a8c36af" Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) watching 2 dependencies Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) all templates rendered Jul 17 19:37:47 ip-172-31-6-94 nomad[1757]: agent: (runner) enabling global quiescence for "bc46e1048e8296b5c2aa24476a8c36af" Jul 17 19:37:48 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:48.561Z [DEBUG] http: request complete: method=GET path=/v1/agent/health?type=client duration="180.332µs" Jul 17 19:37:48 ip-172-31-6-94 nomad[1757]: http: request complete: method=GET path=/v1/agent/health?type=client duration="180.332µs" Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:50.759Z [DEBUG] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 
task=api-task Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:50.760Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task @module=logmon path=/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc/logs/.api-task.stdout.fifo timestamp=2024-07-17T19:37:50.760Z Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:50.761Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task path=/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc/logs/.api-task.stderr.fifo @module=logmon timestamp=2024-07-17T19:37:50.760Z Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task @module=logmon path=/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc/logs/.api-task.stdout.fifo timestamp=2024-07-17T19:37:50.760Z Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task path=/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc/logs/.api-task.stderr.fifo @module=logmon timestamp=2024-07-17T19:37:50.760Z Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:50.768Z [DEBUG] client.driver_mgr.docker: force pulling image instead of inspecting local: driver=docker image_ref=planitar/api:nomad Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: force pulling image instead of inspecting local: driver=docker image_ref=planitar/api:nomad Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:50.768Z [INFO] 
client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Driver msg="Downloading image" failed=false Jul 17 19:37:50 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Driver msg="Downloading image" failed=false Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.494Z [DEBUG] client.driver_mgr.docker: docker pull succeeded: driver=docker image_ref=planitar/api:nomad Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: docker pull succeeded: driver=docker image_ref=planitar/api:nomad Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.495Z [DEBUG] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=planitar/api:nomad image_id=sha256:04b0c06643fcd5531f16c105900310df3e7bc32e4ab34241bb7a244b07d8704b references=2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.495Z [DEBUG] client.driver_mgr.docker: configured resources: driver=docker task_name=api-task memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: image reference count incremented: driver=docker image_name=planitar/api:nomad image_id=sha256:04b0c06643fcd5531f16c105900310df3e7bc32e4ab34241bb7a244b07d8704b references=2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.495Z [DEBUG] client.driver_mgr.docker: binding directories: driver=docker task_name=api-task binds="[]string{\"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc:/alloc\", \"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/api-task/local:/local\", \"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/api-task/secrets:/secrets\"}" Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.496Z [DEBUG] client.driver_mgr.docker: networking mode not specified; using default: 
driver=docker task_name=api-task Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.496Z [DEBUG] client.driver_mgr.docker: allocated static port: driver=docker task_name=api-task ip=172.31.6.94 port=30553 label=api-srv Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.496Z [DEBUG] client.driver_mgr.docker: exposed port: driver=docker task_name=api-task port=30553 label=api-srv Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: configured resources: driver=docker task_name=api-task memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.496Z [DEBUG] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=api-task labels=map[com.hashicorp.nomad.alloc_id:7d84118c-cff5-16fe-78f9-855342828b86] Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.496Z [DEBUG] client.driver_mgr.docker: setting container name: driver=docker task_name=api-task container_name=api-task-7d84118c-cff5-16fe-78f9-855342828b86 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: binding directories: driver=docker task_name=api-task binds="[]string{\"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/alloc:/alloc\", \"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/api-task/local:/local\", \"/opt/nomad/data/alloc/7d84118c-cff5-16fe-78f9-855342828b86/api-task/secrets:/secrets\"}" Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: networking mode not specified; using default: driver=docker task_name=api-task Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: allocated static port: driver=docker task_name=api-task ip=172.31.6.94 port=30553 label=api-srv Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: exposed port: driver=docker task_name=api-task port=30553 label=api-srv Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: 
applied labels on the container: driver=docker task_name=api-task labels=map[com.hashicorp.nomad.alloc_id:7d84118c-cff5-16fe-78f9-855342828b86] Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: setting container name: driver=docker task_name=api-task container_name=api-task-7d84118c-cff5-16fe-78f9-855342828b86 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.521Z [DEBUG] client: updated allocations: index=12822 total=110 pulled=31 filtered=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client: updated allocations: index=12822 total=110 pulled=31 filtered=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.521Z [DEBUG] client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.600Z [INFO] client.driver_mgr.docker: created container: driver=docker container_id=03a7ecd1f8a3facad3cb4907afe96ebe44c202fcfa44e527204b023949b4d9b2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker: created container: driver=docker container_id=03a7ecd1f8a3facad3cb4907afe96ebe44c202fcfa44e527204b023949b4d9b2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.603Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.836Z [INFO] client.driver_mgr.docker: started container: driver=docker container_id=03a7ecd1f8a3facad3cb4907afe96ebe44c202fcfa44e527204b023949b4d9b2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.836Z [DEBUG] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/bin/nomad args=["/usr/bin/nomad", "docker_logger"] Jul 17 19:37:51 ip-172-31-6-94 
nomad[1757]: client.driver_mgr.docker: started container: driver=docker container_id=03a7ecd1f8a3facad3cb4907afe96ebe44c202fcfa44e527204b023949b4d9b2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/bin/nomad args=["/usr/bin/nomad", "docker_logger"] Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.837Z [DEBUG] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/bin/nomad pid=601087 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.837Z [DEBUG] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker plugin=/usr/bin/nomad Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/bin/nomad pid=601087 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker plugin=/usr/bin/nomad Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.866Z [DEBUG] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2585828868 network=unix timestamp=2024-07-17T19:37:51.866Z Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2585828868 network=unix timestamp=2024-07-17T19:37:51.866Z Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.866Z [DEBUG] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.867Z [DEBUG] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2024-07-17T19:37:51.867Z Jul 17 19:37:51 
ip-172-31-6-94 nomad[1757]: client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2024-07-17T19:37:51.867Z Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.871Z [INFO] client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Started msg="Task started by client" failed=false Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client.alloc_runner.task_runner: Task event: alloc_id=7d84118c-cff5-16fe-78f9-855342828b86 task=api-task type=Started msg="Task started by client" failed=false Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.885Z [DEBUG] agent: (runner) receiving dependency health.service(api|passing) Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.885Z [DEBUG] agent: (runner) initiating run Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.885Z [DEBUG] agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) receiving dependency health.service(api|passing) Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) initiating run Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.886Z [DEBUG] agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 7376f9264838c77bd997077ede642b82 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template 78729eba1e2b937878cf101798d434d8 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.886Z [DEBUG] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 
2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) checking template bc46e1048e8296b5c2aa24476a8c36af Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) diffing and updating dependencies Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) watching 2 dependencies Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.887Z [DEBUG] agent: (runner) all templates rendered Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) diffing and updating dependencies Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(webshot|passing) is still needed Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) health.service(api|passing) is still needed Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) watching 2 dependencies Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: agent: (runner) all templates rendered Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.979Z [DEBUG] client: updated allocations: index=12823 total=110 pulled=31 filtered=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:51.979Z [DEBUG] client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client: updated allocations: index=12823 total=110 pulled=31 filtered=79 Jul 17 19:37:51 ip-172-31-6-94 nomad[1757]: client: allocation updates: added=0 removed=0 updated=31 ignored=79 Jul 17 19:37:52 ip-172-31-6-94 nomad[1757]: 2024-07-17T19:37:52.055Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 
errors=0 Jul 17 19:37:52 ip-172-31-6-94 nomad[1757]: client: allocation updates applied: added=0 removed=0 updated=31 ignored=79 errors=0 ```

Consul reports that nginx was deregistered: `agent: Deregistered service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http`

Consul logs ``` Jul 17 19:37:39 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=GET url=/v1/agent/checks from=127.0.0.1:48120 latency="773.004µs" Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:40.447Z [DEBUG] agent.http: Request finished:method=GET url=/v1/agent/self from=127.0.0.1:48120 latency=2.382945ms Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=GET url=/v1/agent/self from=127.0.0.1:48120 latency=2.382945ms Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:40.449Z [DEBUG] agent.http: Request finished:method=GET url=/v1/agent/checks from=127.0.0.1:48120 latency="536.341µs" Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=GET url=/v1/agent/checks from=127.0.0.1:48120 latency="536.341µs" Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:40.473Z [DEBUG] agent.dns: labels: querySuffixes=[] Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:40.473Z [DEBUG] agent.dns: labels: querySuffixes=[] Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: agent.dns: labels: querySuffixes=[] Jul 17 19:37:40 ip-172-31-6-94 consul[1696]: agent.dns: labels: querySuffixes=[] Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.355Z [DEBUG] agent.http: Request finished:method=GET url=/v1/agent/services from=127.0.0.1:48120 latency=1.057941ms Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=GET url=/v1/agent/services from=127.0.0.1:48120 latency=1.057941ms Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.356Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.356Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.356Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 
ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.356Z [TRACE] agent.proxycfg.agent-state: syncing proxy services from local state Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent.proxycfg.agent-state: syncing proxy services from local state Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [INFO] agent: Deregistered service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-https Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Deregistered service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-https Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: 
check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 
consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: removed check: check=_nomad-check-310ec0ffd53d485654b80c77265034dba0ca0dfb Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: removed service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-https Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: 
check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.359Z [DEBUG] agent.http: Request finished:method=PUT url=/v1/agent/service/deregister/_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-wi-nginx-https from=127.0.0.1:48120 latency=4.057631ms Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.360Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.360Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.360Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.360Z [TRACE] agent.proxycfg.agent-state: syncing proxy services from local state Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync 
Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: removed check: check=_nomad-check-310ec0ffd53d485654b80c77265034dba0ca0dfb Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: removed service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-https Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: 
Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=PUT url=/v1/agent/service/deregister/_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-https from=127.0.0.1:48120 latency=4.057631ms Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent.proxycfg.agent-state: syncing proxy services from local state Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [INFO] agent: Deregistered service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Deregistered service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 
17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.363Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 
consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: removed check: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: removed service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 
2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.364Z [DEBUG] agent.http: Request finished:method=PUT url=/v1/agent/service/deregister/_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-wi-nginx-http-http from=127.0.0.1:48120 latency=4.198463ms Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:41.365Z [DEBUG] agent.http: Request finished:method=GET url=/v1/agent/checks from=127.0.0.1:48120 latency="717.521µs" Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: 
check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: removed check: check=_nomad-check-77ee2ea0e0ede4984f62753377fd07e4f5efd098 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: removed service: service=_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Node info in sync Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-client-6fjxfhyz2jiim4hnvmvtsxcbwfm2h5dr Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-945798d49e323d77f992e7e446dbc387985002c1 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b04133fbc7d8783e1fd11c7e15e6eb2547744879 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-b9dda012195996edc2d9c087a8a1c2c770148269 Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=dns Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-7ac7c0dfa70f7ae6e4581f99b9acee5c0116a66f Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent: Check in sync: check=_nomad-check-60d1d99b67cc4f493d3f3ba938c93e5315999ebe Jul 17 19:37:41 ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=PUT url=/v1/agent/service/deregister/_nomad-task-564f7092-755c-1e5c-5fa3-e44b3036eb99-nginx-task-nginx-http-http from=127.0.0.1:48120 latency=4.198463ms Jul 17 19:37:41 
ip-172-31-6-94 consul[1696]: agent.http: Request finished: method=GET url=/v1/agent/checks from=127.0.0.1:48120 latency="717.521µs" Jul 17 19:37:42 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:42.375Z [DEBUG] agent.dns: labels: querySuffixes=[] Jul 17 19:37:42 ip-172-31-6-94 consul[1696]: 2024-07-17T19:37:42.375Z [DEBUG] agent.dns: labels: querySuffixes=[] Jul 17 19:37:42 ip-172-31-6-94 consul[1696]: agent.dns: labels: querySuffixes=[] Jul 17 19:37:42 ip-172-31-6-94 consul[1696]: agent.dns: labels: querySuffixes=[] ```

Day 2

Logs from nginx when the api container stopped:

```
2024/07/18 14:51:48 [notice] 23#23: http file cache: /tmp/viewercache 0.000M, bsize: 4096
2024/07/18 14:51:48 [notice] 23#23: http file cache: /tmp/portalcache 0.000M, bsize: 4096
2024/07/18 14:51:48 [notice] 17#17: signal 17 (SIGCHLD) received from 23
2024/07/18 14:51:48 [notice] 17#17: cache loader process 23 exited with code 0
2024/07/18 14:51:48 [notice] 17#17: signal 29 (SIGIO) received
2024/07/18 14:53:05 [notice] 17#17: using the "epoll" event method
2024/07/18 14:53:05 [notice] 17#17: openresty/1.19.3.1
2024/07/18 14:53:05 [notice] 17#17: built by gcc 13.2.1 20231014 (Alpine 13.2.1_git20231014)
2024/07/18 14:53:05 [notice] 17#17: OS: Linux 6.8.0-1010-aws
2024/07/18 14:53:05 [notice] 17#17: getrlimit(RLIMIT_NOFILE): 1048576:1048576
```

Another try:

```
2024/07/18 15:09:47 [notice] 17#17: using the "epoll" event method
2024/07/18 15:09:47 [notice] 17#17: openresty/1.19.3.1
2024/07/18 15:09:47 [notice] 17#17: built by gcc 13.2.1 20231014 (Alpine 13.2.1_git20231014)
2024/07/18 15:09:47 [notice] 17#17: OS: Linux 6.8.0-1010-aws
2024/07/18 15:09:47 [notice] 17#17: getrlimit(RLIMIT_NOFILE): 1048576:1048576
2024/07/18 15:09:47 [notice] 17#17: start worker processes
2024/07/18 15:09:47 [notice] 17#17: start worker process 18
2024/07/18 15:09:47 [notice] 17#17: start worker process 19
2024/07/18 15:09:47 [notice] 17#17: start worker process 20
2024/07/18 15:09:47 [notice] 17#17: start worker process 21
2024/07/18 15:09:47 [notice] 17#17: start cache manager process 22
2024/07/18 15:09:47 [notice] 17#17: start cache loader process 23
```

Docker logs:

```
$ docker ps
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
2a9332e2b06d   *** wi-nginx-task-67b5bbdf-42c4-6f11-b554-bcd5ff9cd3e0
e3c37b9bc2bc   *** wi-api-task-c5cd5053-7011-ca7b-0795-73901d33120e
...
Jul 18 15:09:22 ip-172-31-6-94 dockerd[1445]: time="2024-07-18T15:09:22.024003081Z" level=info msg="ignoring event" container=e3c37b9bc2bce229d45a8369c4572d2ecf26d5ef93c10231a78e202dec6344d4 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 18 15:09:29 ip-172-31-6-94 dockerd[1445]: time="2024-07-18T15:09:29.901144364Z" level=error msg="collecting stats for container /wi-nginx-task-67b5bbdf-42c4-6f11-b554-bcd5ff9cd3e0: no metrics received"
Jul 18 15:09:29 ip-172-31-6-94 dockerd[1445]: time="2024-07-18T15:09:29.911716221Z" level=info msg="ignoring event" container=2a9332e2b06d7de628189faabea5e3c492d137acd2fad47bc490898d2ae6f986 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
```

This time Consul still shows nginx in sync one second after Docker received a signal (I do not know from where) to stop nginx:

```
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=_nomad-task-67b5bbdf-42c4-6f11-b554-bcd5ff9cd3e0-wi-nginx-task-wi-nginx-https
...
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: agent.proxycfg.agent-state: syncing proxy services from local state
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: 2024-07-18T15:09:30.298Z [INFO]  agent: Deregistered service: service=_nomad-task-67b5bbdf-42c4-6f11-b554-bcd5ff9cd3e0-wi-nginx-task-wi-nginx-http-http
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: agent: Deregistered service: service=_nomad-task-67b5bbdf-42c4-6f11-b554-bcd5ff9cd3e0-wi-nginx-task-wi-nginx-http-http
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: agent: Service in sync: service=consul
Jul 18 15:09:30 ip-172-31-6-94 consul[1696]: 2024-07-18T15:09:30.298Z [DEBUG] agent: Service in sync: service=consul
```

From the logs above we cannot see who kills nginx (at least I do not see it).

So I went back to what the Web UI reported:

```
Jul 18, '24 12:10:17 -0400  Killing     Template failed to send signals [hangup]: 1 error occurred: * Task not running
Jul 18, '24 12:10:04 -0400  Restarting  Task restarting in 18s
Jul 18, '24 12:10:04 -0400  Terminated  Exit Code: 129, Exit Message: "Docker container exited with non-zero exit code: 129"
Jul 18, '24 12:10:03 -0400  Signaling   Template re-rendered
```

From the link here: https://www.tencentcloud.com/document/product/457/35758#standard-linux-interruption-signals we know that an exit code of 129 means the process was terminated by SIGHUP (128 + signal number 1).
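The 128 + N convention behind that exit code can be checked directly in a shell (a quick sanity check, not specific to Docker):

```shell
# A process terminated by a signal exits with status 128 + signal number.
# SIGHUP is signal 1, so the reported exit code is 128 + 1 = 129.
sh -c 'kill -HUP $$'
echo "exit code: $?"
```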

So Nomad sends the signal as expected, but the container exits unexpectedly. Thus we should probably look at Docker, or at the entrypoint script, to dig into why the container restarts:

```
c8b13dbf0245:/# ps
PID   USER     TIME  COMMAND
    1 root      0:00 /sbin/tini -- /opt/planitar/script/run.sh
    7 root      0:00 {run.sh} /bin/bash /opt/planitar/script/run.sh
   17 root      0:00 nginx: master process nginx -p /etc/nginx
   18 www       0:00 nginx: worker process
   19 www       0:00 nginx: worker process
   20 www       0:00 nginx: worker process
   21 www       0:00 nginx: worker process
   22 www       0:00 nginx: cache manager process
   24 root      0:00 bash
   34 root      0:00 ps
```

It is still unclear to me why the nginx log contains no information about any received signal.
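A likely explanation (my reading of the process tree above, not confirmed by the logs): tini forwards SIGHUP to run.sh, but a shell with no trap dies on SIGHUP instead of relaying it to its children; once tini's child is gone the container stops and Docker kills the remaining processes with SIGKILL, which nginx cannot log. The non-forwarding part is easy to demonstrate (`/tmp/child.pid` is just a scratch path for the demo):

```shell
# A shell without a trap does not relay SIGHUP to its children:
# the shell itself dies with exit 129 while the background child survives.
sh -c 'sleep 2 & echo $! > /tmp/child.pid; kill -HUP $$'
echo "shell exit: $?"
kill -0 "$(cat /tmp/child.pid)" && echo "child still running"
```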

@tgross This investigation will probably be of interest to you, because this issue is already 5 years old.

EugenKon commented 2 months ago

The solution was to add a signal handler to the Docker ENTRYPOINT script:

```
reload_nginx() {
  nginx -s reload
}
trap 'reload_nginx' SIGHUP
```
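For completeness, a fuller sketch of such an entrypoint (hypothetical, paths and flags illustrative): the shell has to stay alive in the foreground for the trap to matter, and a trapped signal interrupts `wait`, so the wait is wrapped in a loop that only ends when nginx itself exits.

```shell
#!/bin/sh
# Hypothetical entrypoint sketch: keep the shell alive so a trapped SIGHUP
# becomes an nginx reload instead of killing the shell (and thus the
# container) with exit code 129.
reload_nginx() {
  nginx -s reload
}
trap reload_nginx HUP

# Run nginx in the background so the shell keeps handling signals.
nginx -g 'daemon off;' &
nginx_pid=$!

# A trapped signal interrupts `wait`; re-wait until nginx really exits.
while kill -0 "$nginx_pid" 2>/dev/null; do
  wait "$nginx_pid"
done
```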