bsdpot / nomad-pot-driver

Nomad task driver for launching FreeBSD jails.
Apache License 2.0

pot task driver for Nomad does not work with FreeBSD in AWS? #41

Closed OneOfTheJohns closed 1 year ago

OneOfTheJohns commented 1 year ago

Hi, I'm trying to use this task driver with FreeBSD 12.4 in AWS as both the server and client OS. I built the driver with go build and put it into the Nomad plugins directory, installed the pot package on the client side, and connected the server and client. The client recognizes the driver and receives the job request from the server. This is the job spec:

job "example" {
    datacenters = ["dc1"]
    type = "service"
    task "nginx-pot3" {
        driver = "pot"
        config {
        image = "18.203.**.**/test_1.0.xz"
        pot = "test3"
        tag = "1.0"
        command = "nginx"
        args = [
            "-g 'daemon off;'"
        ]
        network_mode = "public-bridge"
        port_map = {
            http = "80"
        }
        copy = [
            "/root/index.html:/usr/local/www/nginx-dist/index.html",
            "/root/nginx.conf:/usr/local/etc/nginx/nginx.conf"
        ]
        mount = [
            "/tmp/test:/root/test",
        ]
        mount_read_only = [
            "/tmp/test2:/root/test2"
        ]
        extra_hosts = [
            "artifactory.yourdomain.com:192.168.0.1",
            "mail.yourdomain.com:192.168.0.2"
        ]
    }
    }
}

But I'm getting this error on the client side:

    2023-09-21T10:55:38.039Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-21T10:55:38.039Z
    2023-09-21T10:55:38.040Z [DEBUG] client.driver_mgr.nomad-pot-driver: Checking if pot is alive: driver=pot @module=pot timestamp=2023-09-21T10:55:38.039Z
    2023-09-21T10:55:38.054Z [DEBUG] client: updated allocations: index=1838 total=12 pulled=10 filtered=2
    2023-09-21T10:55:38.055Z [DEBUG] client: allocation updates: added=0 removed=0 updated=10 ignored=2
    2023-09-21T10:55:38.057Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError CheckContainerExists: driver=pot exitError=1 @module=pot timestamp=2023-09-21T10:55:38.057Z
    2023-09-21T10:55:38.057Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching createContainer command: driver=pot @module=pot log="prepare -U 18.203.**.**/test_1.0.xz -p test3 -t 1.0 -c \"nginx -g 'daemon off;'\" -N public-bridge -e 80: -a baa7f9a3_23c7bdb9-07e5-ba53-3860-1d56d4a274fc -n nginx-pot3 -v" timestamp=2023-09-21T10:55:38.057Z
    2023-09-21T10:55:38.081Z [ERROR] client.driver_mgr.nomad-pot-driver: Error creating container: driver=pot err="exit status 1" @module=pot timestamp=2023-09-21T10:55:38.081Z
    2023-09-21T10:55:38.082Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching DestroyContainer command: driver=pot @module=pot destroy -p nginx-pot3_baa7f9a3_23c7bdb9-07e5-ba53-3860-1d56d4a274fc -F=<unknown> timestamp=2023-09-21T10:55:38.081Z
    2023-09-21T10:55:38.095Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=10 ignored=2 errors=0
    2023-09-21T10:55:38.095Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=23c7bdb9-07e5-ba53-3860-1d56d4a274fc task=nginx-pot3 type="Driver Failure" msg="rpc error: code = Unknown desc = unable to create container: <nil>" failed=false
    2023-09-21T10:55:38.099Z [ERROR] client.alloc_runner.task_runner: running driver failed: alloc_id=23c7bdb9-07e5-ba53-3860-1d56d4a274fc task=nginx-pot3 error="rpc error: code = Unknown desc = unable to create container: <nil>"

I created the pool with zpool create -f zroot /dev/nvd1, added the line POT_ZFS_ROOT=zroot/pot to /usr/local/etc/pot/pot.conf, and pot init seemed to work, so I assume pot itself is set up properly. Any idea what I'm doing wrong?
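For reference, those setup steps boil down to roughly the following (device name and pool layout as described above):

# create the pool and point pot at it
zpool create -f zroot /dev/nvd1
# then add this line to /usr/local/etc/pot/pot.conf:
#   POT_ZFS_ROOT=zroot/pot
pot init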

grembo commented 1 year ago

@OneOfTheJohns

Did you configure the correct network interface in pot.conf?

Also: How do you start nomad?

Example from /etc/rc.conf (from some random machine):

nomad_enable="YES"
nomad_args="-config=/path/to/client.hcl -network-interface=em0"
nomad_user="root"
nomad_group="wheel"
nomad_env="PATH=/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/sbin:/bin"
nomad_debug="YES"
nomad_dir="/path/to/nomad/dir"

user/group and env are critical.
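
With an rc.conf like that, nomad would then be started through the rc framework rather than directly from a shell, roughly like this (assuming the pkg-provided rc script is installed):

service nomad start
service nomad status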

OneOfTheJohns commented 1 year ago

Thank you for your time @grembo. Yes, the network interface seems to be correct; I have these lines in pot.conf:

POT_EXTIF=ena0
POT_ZFS_ROOT=zroot/pot

I also have these lines in rc.conf:

nomad_enable="YES"
nomad_args="-config=/usr/local/etc/nomad/client1.hcl -network-interface=ena0"
nomad_user="root"
nomad_group="wheel"
nomad_env="PATH=/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/sbin:/bin"
nomad_debug="YES"
nomad_dir="/usr/local/etc/nomad"

I'm not sure, though, what the value of nomad_dir should actually be.

I start the client with this command:

nomad agent -client -data-dir /home/ec2-user/nomad-client -servers=172.31.0.132 -config=client1.hcl

I start the server with this command:

nomad agent -server -data-dir /home/ec2-user/nomad-server/ -node=server -config=server.hcl

The client1.hcl contains:

# Increase log verbosity
log_level = "DEBUG"

# Setup data dir
data_dir = "/tmp/client1"

# Give the agent a unique name. Defaults to hostname
name = "client1"

# Enable the client
client {
  enabled = true

  # For demo assume we are talking to server1. For production,
  # this should be like "nomad.service.consul:4647" and a system
  # like Consul used for service discovery.
  servers = ["172.31.0.132"]
}

# Modify our port to avoid a collision with server1
ports {
  http = 5656
}

plugin "pot" {
        plugin_dir="/home/ec2-user/nomad-client/plugins"
}

and server.hcl contains:

# Increase log verbosity
log_level = "DEBUG"

# Setup data dir
data_dir = "/tmp/server1"

# Give the agent a unique name. Defaults to hostname
name = "server1"

# Enable the server
server {
  enabled = true

  # Self-elect, should be 3 or 5 for production
  bootstrap_expect = 1
}

(These are pretty much unmodified versions of the testing examples I got from the Nomad repo, I think.)

grembo commented 1 year ago

Hm, values in rc.conf don't take effect if you start nomad directly from the command line (they only apply when starting it with the service command).

Maybe it's best to run nomad in dev mode first to reduce potential sources of error (it's insecure in this state, but for testing it should be ok):

consul agent -dev &
mkdir -p /var/tmp/nomad
/usr/local/bin/nomad agent -data-dir=/var/tmp/nomad -dev -network-interface lo0 -config /usr/local/etc/nomad/dev.hcl &

Content of dev.hcl:

consul {
  address = "127.0.0.1:8500"
}
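
Once that's running, you can check that the pot driver was detected and is healthy, for example with the standard node status command:

nomad node status -self -verbose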

I would recommend installing pot and the nomad driver using pkg.

Also, before doing any nomad testing, make sure you can actually create a pot on the command line:

pot create -p testpot -t single -b 12.4
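
and then, as a quick smoke test, roughly the following (exact flag spellings may differ slightly between pot versions):

pot start testpot
pot ls
pot stop testpot
pot destroy -p testpot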

In general, I would recommend using FreeBSD 13.2, if you have the opportunity.

OneOfTheJohns commented 1 year ago

Well, one of the problems was the job file; it had multiple issues. What I did to find them was run the pot prepare .... command that the client tries to run myself (it can be found in the client's debug logs; see the example after the log below), since pot itself gives much more useful errors than Nomad. With a proper job file the client can now at least create pots, but for some reason they are still in an "unhealthy" state. Here is the debug log from the client side:

==> Nomad agent started! Log data will stream in below:

    2023-09-22T12:07:55.566Z [DEBUG] agent.plugin_loader: starting plugin: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver args=["/home/ec2-user/nomad-client/plugins/nomad-pot-driver"]
    2023-09-22T12:07:55.573Z [DEBUG] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/home/ec2-user/nomad-client/plugins
    2023-09-22T12:07:55.573Z [DEBUG] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/home/ec2-user/nomad-client/plugins
    2023-09-22T12:07:55.573Z [ERROR] agent.plugin_loader.docker: failed to list pause containers: plugin_dir=/home/ec2-user/nomad-client/plugins error=<nil>
    2023-09-22T12:07:55.582Z [DEBUG] agent.plugin_loader: plugin started: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver pid=61755
    2023-09-22T12:07:55.582Z [DEBUG] agent.plugin_loader: waiting for RPC address: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver
    2023-09-22T12:07:55.594Z [DEBUG] agent.plugin_loader.nomad-pot-driver: plugin address: plugin_dir=/home/ec2-user/nomad-client/plugins address=/tmp/plugin2918654382 network=unix timestamp=2023-09-22T12:07:55.593Z
    2023-09-22T12:07:55.594Z [DEBUG] agent.plugin_loader: using plugin: plugin_dir=/home/ec2-user/nomad-client/plugins version=2
    2023-09-22T12:07:55.595Z [DEBUG] agent.plugin_loader.stdio: received EOF, stopping recv loop: plugin_dir=/home/ec2-user/nomad-client/plugins err="rpc error: code = Unimplemented desc = unknown service plugin.GRPCStdio"
    2023-09-22T12:07:55.598Z [INFO]  agent.plugin_loader: plugin process exited: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver pid=61755
    2023-09-22T12:07:55.598Z [DEBUG] agent.plugin_loader: plugin exited: plugin_dir=/home/ec2-user/nomad-client/plugins
    2023-09-22T12:07:55.598Z [DEBUG] agent.plugin_loader: starting plugin: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver args=["/home/ec2-user/nomad-client/plugins/nomad-pot-driver"]
    2023-09-22T12:07:55.602Z [DEBUG] agent.plugin_loader: plugin started: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver pid=61757
    2023-09-22T12:07:55.602Z [DEBUG] agent.plugin_loader: waiting for RPC address: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver
    2023-09-22T12:07:55.609Z [DEBUG] agent.plugin_loader.nomad-pot-driver: plugin address: plugin_dir=/home/ec2-user/nomad-client/plugins network=unix address=/tmp/plugin2453146029 timestamp=2023-09-22T12:07:55.609Z
    2023-09-22T12:07:55.609Z [DEBUG] agent.plugin_loader: using plugin: plugin_dir=/home/ec2-user/nomad-client/plugins version=2
    2023-09-22T12:07:55.610Z [DEBUG] agent.plugin_loader.stdio: received EOF, stopping recv loop: plugin_dir=/home/ec2-user/nomad-client/plugins err="rpc error: code = Unimplemented desc = unknown service plugin.GRPCStdio"
    2023-09-22T12:07:55.612Z [INFO]  agent.plugin_loader: plugin process exited: plugin_dir=/home/ec2-user/nomad-client/plugins path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver pid=61757
    2023-09-22T12:07:55.612Z [DEBUG] agent.plugin_loader: plugin exited: plugin_dir=/home/ec2-user/nomad-client/plugins
    2023-09-22T12:07:55.613Z [ERROR] agent.plugin_loader.docker: failed to list pause containers: plugin_dir=/home/ec2-user/nomad-client/plugins error=<nil>
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=raw_exec type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=exec type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=pot type=driver plugin_version=v0.2.1
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=qemu type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=java type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=docker type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  agent: detected plugin: name=mock_driver type=driver plugin_version=0.1.0
    2023-09-22T12:07:55.614Z [INFO]  client: using state directory: state_dir=/home/ec2-user/nomad-client/client
    2023-09-22T12:07:55.614Z [INFO]  client: using alloc directory: alloc_dir=/home/ec2-user/nomad-client/alloc
    2023-09-22T12:07:55.614Z [INFO]  client: using dynamic ports: min=20000 max=32000 reserved=""
    2023-09-22T12:07:55.622Z [DEBUG] client.fingerprint_mgr: built-in fingerprints: fingerprinters=["arch", "cni", "consul", "cpu", "host", "landlock", "memory", "network", "nomad", "plugins_cni", "signal", "storage", "vault", "env_aws", "env_gce", "env_azure", "env_digitalocean"]
    2023-09-22T12:07:55.622Z [DEBUG] client.fingerprint_mgr: CNI config dir is not set or does not exist, skipping: cni_config_dir=/opt/cni/config
    2023-09-22T12:07:55.623Z [DEBUG] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=consul initial_period=15s
    2023-09-22T12:07:55.623Z [DEBUG] client.fingerprint_mgr.cpu: detected cpu frequency: MHz=2500
    2023-09-22T12:07:55.623Z [DEBUG] client.fingerprint_mgr.cpu: detected core count: cores=2
    2023-09-22T12:07:55.623Z [DEBUG] client.fingerprint_mgr.cpu: detected reservable cores: cpuset=[]
    2023-09-22T12:07:55.626Z [WARN]  client.fingerprint_mgr.landlock: failed to fingerprint kernel landlock feature: error="landlock not supported on this platform"
    2023-09-22T12:07:55.632Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected and no speed specified by user, falling back to default speed: interface=ena0 mbits=1000
    2023-09-22T12:07:55.632Z [DEBUG] client.fingerprint_mgr.network: detected interface IP: interface=ena0 IP=172.31.0.226
    2023-09-22T12:07:55.632Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=ena0 mbits=1000
    2023-09-22T12:07:55.641Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=lo0 mbits=1000
    2023-09-22T12:07:55.659Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=p4650d7ed99d91 mbits=1000
    2023-09-22T12:07:55.664Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=epair0b mbits=1000
    2023-09-22T12:07:55.676Z [DEBUG] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=bridge0 mbits=1000
    2023-09-22T12:07:55.680Z [WARN]  client.fingerprint_mgr.cni_plugins: failed to read CNI plugins directory: cni_path=/opt/cni/bin error="open /opt/cni/bin: no such file or directory"
    2023-09-22T12:07:55.687Z [DEBUG] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=vault initial_period=15s
    2023-09-22T12:07:55.688Z [DEBUG] client.fingerprint_mgr.env_digitalocean: could not read value for attribute: attribute=region resp_code=404
    2023-09-22T12:07:55.696Z [DEBUG] client.fingerprint_mgr.env_aws: read an empty value: attribute=network/interfaces/macs/06:4a:a4:bd:af:df/ipv6s
    2023-09-22T12:07:55.696Z [DEBUG] client.fingerprint_mgr.env_aws: lookup ec2 cpu: cores=2 ghz=2.5
    2023-09-22T12:07:55.696Z [DEBUG] client.fingerprint_mgr.env_aws: setting ec2 cpu: ticks=5000
    2023-09-22T12:07:55.697Z [DEBUG] client.fingerprint_mgr.env_gce: could not read value for attribute: attribute=machine-type resp_code=404
    2023-09-22T12:07:55.697Z [DEBUG] client.fingerprint_mgr.env_azure: could not read value for attribute: attribute=compute/azEnvironment resp_code=404
    2023-09-22T12:07:55.697Z [DEBUG] client.fingerprint_mgr: detected fingerprints: node_attrs=["arch", "cpu", "host", "network", "nomad", "signal", "storage", "env_aws"]
    2023-09-22T12:07:55.697Z [INFO]  client.plugin: starting plugin manager: plugin-type=csi
    2023-09-22T12:07:55.697Z [INFO]  client.plugin: starting plugin manager: plugin-type=driver
    2023-09-22T12:07:55.697Z [INFO]  client.plugin: starting plugin manager: plugin-type=device
    2023-09-22T12:07:55.697Z [DEBUG] client.device_mgr: exiting since there are no device plugins
    2023-09-22T12:07:55.697Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=exec health=undetected description="exec driver unsupported on client OS"
    2023-09-22T12:07:55.697Z [DEBUG] client.driver_mgr: starting plugin: driver=pot path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver args=["/home/ec2-user/nomad-client/plugins/nomad-pot-driver"]
    2023-09-22T12:07:55.701Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=raw_exec health=undetected description=disabled
    2023-09-22T12:07:55.701Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=qemu health=undetected description=""
    2023-09-22T12:07:55.701Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=java health=undetected description=""
    2023-09-22T12:07:55.702Z [ERROR] client.driver_mgr.docker: failed to list pause containers: driver=docker error=<nil>
    2023-09-22T12:07:55.702Z [DEBUG] client.driver_mgr.docker: could not connect to docker daemon: driver=docker endpoint=unix:///var/run/docker.sock error="Get \"http://unix.sock/version\": dial unix /var/run/docker.sock: connect: no such file or directory"
    2023-09-22T12:07:55.702Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=docker health=undetected description="Failed to connect to docker daemon"
    2023-09-22T12:07:55.702Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=mock_driver health=healthy description=Healthy
    2023-09-22T12:07:55.702Z [DEBUG] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=driver
    2023-09-22T12:07:55.702Z [DEBUG] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=device
    2023-09-22T12:07:55.702Z [DEBUG] client.plugin: finished plugin manager initial fingerprint: plugin-type=device
    2023-09-22T12:07:55.704Z [DEBUG] client.server_mgr: new server list: new_servers=[172.31.0.132:4647, 172.31.0.132:4647] old_servers=[]
    2023-09-22T12:07:55.704Z [DEBUG] client.driver_mgr: plugin started: driver=pot path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver pid=61770
    2023-09-22T12:07:55.704Z [DEBUG] client.driver_mgr: waiting for RPC address: driver=pot path=/home/ec2-user/nomad-client/plugins/nomad-pot-driver
    2023-09-22T12:07:55.710Z [DEBUG] client.driver_mgr.nomad-pot-driver: plugin address: driver=pot address=/tmp/plugin1057359489 network=unix timestamp=2023-09-22T12:07:55.710Z
    2023-09-22T12:07:55.710Z [DEBUG] client.driver_mgr: using plugin: driver=pot version=2
    2023-09-22T12:07:55.712Z [DEBUG] client.driver_mgr.stdio: received EOF, stopping recv loop: driver=pot err="rpc error: code = Unimplemented desc = unknown service plugin.GRPCStdio"
    2023-09-22T12:07:55.717Z [DEBUG] client.driver_mgr: initial driver fingerprint: driver=pot health=healthy description=healthy
    2023-09-22T12:07:55.718Z [DEBUG] client.driver_mgr: detected drivers: drivers="map[healthy:[mock_driver pot] undetected:[qemu java docker raw_exec exec]]"
    2023-09-22T12:07:55.718Z [DEBUG] client.plugin: finished plugin manager initial fingerprint: plugin-type=driver
    2023-09-22T12:07:55.722Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=19d85040-88fe-fb04-7a09-78ef0d4554ea task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.723Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=26e658d2-3d3b-9ec4-6680-809a1d24b4d6 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.725Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=27cc2a84-7126-f391-3f4c-11d3dd7233fb task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.726Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=29d3eee7-5475-a5a5-5bcd-db3eb9ba122c task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.727Z [DEBUG] client.server_mgr: new server list: new_servers=[172.31.0.132:4647] old_servers=[172.31.0.132:4647, 172.31.0.132:4647]
    2023-09-22T12:07:55.727Z [INFO]  client: node registration complete
    2023-09-22T12:07:55.727Z [DEBUG] client: evaluations triggered by node registration: num_evals=5
    2023-09-22T12:07:55.732Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:07:55.732Z
    2023-09-22T12:07:55.733Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=2e7ebfef-d94f-07a4-bd42-096036ceeee9 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.734Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=300b0edd-e60d-d81f-0a1f-653928eaa5c3 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.735Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=3f230dbc-e47e-d6c2-2967-ec065337c75d task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.736Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=44bc437a-9c8b-a613-0ace-931b62c2c055 task=nginx-pot8 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.740Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:07:55.740Z
    2023-09-22T12:07:55.741Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=453f63ba-6adf-15b7-052d-74ede4273da5 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.746Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:07:55.746Z
    2023-09-22T12:07:55.747Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=5671e505-ad09-cd40-f5d0-8ef29c63d6f7 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.748Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=5c82484d-f068-d68b-51f5-986b15665a81 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.749Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=5dffab8a-d586-96f0-97bb-d6091a56a1d9 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.750Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=5faa72d2-7a07-6176-d2af-3a3c5f0f3890 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.751Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=6dcde7da-61bf-7bb2-f98d-af3031d5addf task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.757Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=704ed4f8-3990-8dce-8a20-ae129debf675 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.758Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=73d587fd-e80e-d780-4cdd-033de2276d6f task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.761Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=79e78c54-ac46-faec-438b-6f89c1daf68f task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.766Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot exitStatus=1 @module=pot timestamp=2023-09-22T12:07:55.766Z
    2023-09-22T12:07:55.769Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=934192c1-15a5-21ae-2764-e6965473a816 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.772Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=95fd1e73-499f-9592-f148-9065410745b2 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.773Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=99431b97-1364-ec43-6e0d-abe36d26e035 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.774Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=9bd3299a-cae7-5963-afb5-1158bd49d508 task=nginx-pot8 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.780Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot exitStatus=1 @module=pot timestamp=2023-09-22T12:07:55.780Z
    2023-09-22T12:07:55.782Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=aa9977a8-b4ab-375b-50d7-ffbd7464a021 task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.783Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=ab497abb-1761-0961-c020-f255efa9b050 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.784Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=af7c36f9-7ce8-57b1-ecdd-85df5db20f2b task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.785Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=b3271abd-5ce4-cc8f-293e-2f32ad006ab0 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.786Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=b4c10cb3-0122-f00d-3f4c-c467c1f1addf task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.787Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=ca97bfb5-8cd7-f7e3-33a2-cbd1a8b6e50f task=nginx-pot7 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.792Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:07:55.791Z
    2023-09-22T12:07:55.793Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=d5ff9dd1-4baa-1339-e937-1ce92eade58c task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.794Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=fecfd534-847a-2fa3-d3b8-ede164240a58 task=nginx-pot3 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:55.794Z [INFO]  client: started client: node_id=b7dbc25f-53aa-65b7-fcca-0e7a804e4f6d
    2023-09-22T12:07:55.795Z [DEBUG] http: UI is enabled
    2023-09-22T12:07:55.795Z [DEBUG] http: UI is enabled
    2023-09-22T12:07:55.795Z [INFO]  client.gc: marking allocation for GC: alloc_id=3f230dbc-e47e-d6c2-2967-ec065337c75d
    2023-09-22T12:07:55.796Z [WARN]  client.host_stats: error fetching host disk usage stats: error="no such file or directory" partition=/opt/pot/jails/nginx-pot7_aea40ab7_ca97bfb5-8cd7-f7e3-33a2-cbd1a8b6e50f/m/local
    2023-09-22T12:07:55.796Z [WARN]  client.host_stats: error fetching host disk usage stats: error="no such file or directory" partition=/opt/pot/jails/nginx-pot7_aea40ab7_ca97bfb5-8cd7-f7e3-33a2-cbd1a8b6e50f/m/secrets
    2023-09-22T12:07:55.796Z [INFO]  client.gc: marking allocation for GC: alloc_id=44bc437a-9c8b-a613-0ace-931b62c2c055
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=5c82484d-f068-d68b-51f5-986b15665a81
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=704ed4f8-3990-8dce-8a20-ae129debf675
    2023-09-22T12:07:55.808Z [DEBUG] client: updated allocations: index=3135 total=29 pulled=20 filtered=9
    2023-09-22T12:07:55.808Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=9
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=af7c36f9-7ce8-57b1-ecdd-85df5db20f2b
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=6dcde7da-61bf-7bb2-f98d-af3031d5addf
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=5faa72d2-7a07-6176-d2af-3a3c5f0f3890
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=934192c1-15a5-21ae-2764-e6965473a816
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=d5ff9dd1-4baa-1339-e937-1ce92eade58c
    2023-09-22T12:07:55.800Z [INFO]  client.gc: marking allocation for GC: alloc_id=fecfd534-847a-2fa3-d3b8-ede164240a58
    2023-09-22T12:07:55.801Z [INFO]  client.gc: marking allocation for GC: alloc_id=99431b97-1364-ec43-6e0d-abe36d26e035
    2023-09-22T12:07:55.801Z [INFO]  client.gc: marking allocation for GC: alloc_id=ab497abb-1761-0961-c020-f255efa9b050
    2023-09-22T12:07:55.801Z [INFO]  client.gc: marking allocation for GC: alloc_id=b4c10cb3-0122-f00d-3f4c-c467c1f1addf
    2023-09-22T12:07:55.801Z [INFO]  client.gc: marking allocation for GC: alloc_id=ca97bfb5-8cd7-f7e3-33a2-cbd1a8b6e50f
    2023-09-22T12:07:55.801Z [INFO]  client.gc: marking allocation for GC: alloc_id=19d85040-88fe-fb04-7a09-78ef0d4554ea
    2023-09-22T12:07:55.809Z [INFO]  client.gc: marking allocation for GC: alloc_id=95fd1e73-499f-9592-f148-9065410745b2
    2023-09-22T12:07:55.812Z [DEBUG] client: evaluations triggered by node update: num_evals=5
    2023-09-22T12:07:55.812Z [DEBUG] client: state updated: node_status=ready
    2023-09-22T12:07:55.817Z [DEBUG] client: updated allocations: index=3156 total=30 pulled=21 filtered=9
    2023-09-22T12:07:55.821Z [INFO]  client.gc: marking allocation for GC: alloc_id=26e658d2-3d3b-9ec4-6680-809a1d24b4d6
    2023-09-22T12:07:55.843Z [INFO]  client.gc: marking allocation for GC: alloc_id=79e78c54-ac46-faec-438b-6f89c1daf68f
    2023-09-22T12:07:55.855Z [INFO]  client.gc: marking allocation for GC: alloc_id=5671e505-ad09-cd40-f5d0-8ef29c63d6f7
    2023-09-22T12:07:55.866Z [INFO]  client.gc: marking allocation for GC: alloc_id=73d587fd-e80e-d780-4cdd-033de2276d6f
    2023-09-22T12:07:55.880Z [INFO]  client.gc: marking allocation for GC: alloc_id=aa9977a8-b4ab-375b-50d7-ffbd7464a021
    2023-09-22T12:07:55.891Z [INFO]  client.gc: marking allocation for GC: alloc_id=5dffab8a-d586-96f0-97bb-d6091a56a1d9
    2023-09-22T12:07:55.903Z [INFO]  client.gc: marking allocation for GC: alloc_id=9bd3299a-cae7-5963-afb5-1158bd49d508
    2023-09-22T12:07:55.915Z [INFO]  client.gc: marking allocation for GC: alloc_id=29d3eee7-5475-a5a5-5bcd-db3eb9ba122c
    2023-09-22T12:07:55.924Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=9 errors=0
    2023-09-22T12:07:55.924Z [DEBUG] client: allocation updates: added=1 removed=0 updated=20 ignored=9
    2023-09-22T12:07:55.928Z [INFO]  client.gc: marking allocation for GC: alloc_id=b3271abd-5ce4-cc8f-293e-2f32ad006ab0
    2023-09-22T12:07:55.932Z [INFO]  client.gc: marking allocation for GC: alloc_id=300b0edd-e60d-d81f-0a1f-653928eaa5c3
    2023-09-22T12:07:55.944Z [INFO]  client.gc: marking allocation for GC: alloc_id=2e7ebfef-d94f-07a4-bd42-096036ceeee9
    2023-09-22T12:07:55.958Z [INFO]  client.gc: marking allocation for GC: alloc_id=453f63ba-6adf-15b7-052d-74ede4273da5
    2023-09-22T12:07:55.969Z [INFO]  client.gc: marking allocation for GC: alloc_id=27cc2a84-7126-f391-3f4c-11d3dd7233fb
    2023-09-22T12:07:56.059Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Received msg="Task received by client" failed=false
    2023-09-22T12:07:56.059Z [DEBUG] client: allocation updates applied: added=1 removed=0 updated=20 ignored=9 errors=0
    2023-09-22T12:07:56.060Z [DEBUG] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8
    2023-09-22T12:07:56.065Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type="Task Setup" msg="Building Task Directory" failed=false
    2023-09-22T12:07:56.069Z [DEBUG] client: updated allocations: index=3163 total=30 pulled=20 filtered=10
    2023-09-22T12:07:56.069Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:07:56.102Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
    2023-09-22T12:07:56.112Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 path=/usr/local/bin/nomad pid=61778
    2023-09-22T12:07:56.112Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 path=/usr/local/bin/nomad
    2023-09-22T12:07:56.144Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 @module=logmon address=/tmp/plugin2889885005 network=unix timestamp=2023-09-22T12:07:56.143Z
    2023-09-22T12:07:56.144Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 version=2
    2023-09-22T12:07:56.146Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 @module=logmon path=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/alloc/logs/.nginx-pot8.stdout.fifo timestamp=2023-09-22T12:07:56.146Z
    2023-09-22T12:07:56.146Z [DEBUG] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 @module=logmon path=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/alloc/logs/.nginx-pot8.stderr.fifo timestamp=2023-09-22T12:07:56.146Z
    2023-09-22T12:07:56.217Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:07:56.226Z [INFO]  client.driver_mgr.nomad-pot-driver: starting task: driver=pot @module=pot driver_cfg="{Image:18.203.160.62 Pot:test Tag:1.0 Alloc: Command:nginx Args:[] PortMap:map[] Name: NetworkMode:public-bridge Debug:false Verbose:false Mount:[] MountReadOnly:[] Copy:[] ExtraHosts:[]}" timestamp=2023-09-22T12:07:56.225Z
    2023-09-22T12:07:56.235Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:07:56.235Z
    2023-09-22T12:07:56.235Z [DEBUG] client.driver_mgr.nomad-pot-driver: Checking if pot is alive: driver=pot @module=pot timestamp=2023-09-22T12:07:56.235Z
    2023-09-22T12:07:56.258Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError CheckContainerExists: driver=pot @module=pot exitError=1 timestamp=2023-09-22T12:07:56.258Z
    2023-09-22T12:07:56.258Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching createContainer command: driver=pot @module=pot log="prepare -U 18.203.160.62 -p test -t 1.0 -c \"nginx\" -N public-bridge -a 81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -n nginx-pot8 -v" timestamp=2023-09-22T12:07:56.258Z
    2023-09-22T12:07:56.337Z [DEBUG] client: updated allocations: index=3164 total=30 pulled=20 filtered=10
    2023-09-22T12:07:56.337Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:07:56.419Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:07:56.498Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot /usr/local/bin/pot mount-in -p nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -m /local=<unknown> @module=pot timestamp=2023-09-22T12:07:56.497Z
    2023-09-22T12:07:56.549Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot /usr/local/bin/pot mount-in -p nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -m /secrets=<unknown> @module=pot timestamp=2023-09-22T12:07:56.549Z
    2023-09-22T12:07:56.607Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: /usr/local/bin/pot set-env -p nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2  -E PAGER=less -E HOME=/root -E USER=ec2-user -E TERM_PROGRAM_VERSION=3.3a -E NOMAD_MEMORY_LIMIT=300 -E REMOTEHOST=82-196-191-90.dyn.estpak.ee -E BLOCKSIZE=K -E TERM_PROGRAM=tmux -E LOGNAME=ec2-user -E GROUP=wheel -E NOMAD_REGION=global -E MAIL=/var/mail/ec2-user -E OLDPWD=/usr/home/ec2-user -E HOSTTYPE=FreeBSD -E NOMAD_TASK_NAME=nginx-pot8 -E NOMAD_SECRETS_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -E ENV=/home/ec2-user/.shrc -E VENDOR=amd -E OSTYPE=FreeBSD -E NOMAD_SHORT_ALLOC_ID=cb34cd8f -E NOMAD_ALLOC_ID=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -E PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin -E NOMAD_JOB_ID=example8 -E NOMAD_JOB_NAME=example8 -E NOMAD_DC=dc1 -E MACHTYPE=x86_64 -E TMUX=/tmp/tmux-0/default,31733,0 -E SSH_CLIENT=90.191.196.82 40820 22 -E SSH_CONNECTION=90.191.196.82 40820 172.31.0.226 22 -E SSH_TTY=/dev/pts/1 -E NOMAD_TASK_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -E NOMAD_CPU_LIMIT=100 -E NOMAD_NAMESPACE=default -E PWD=/usr/local/etc/nomad -E NOMAD_ALLOC_NAME=example8.nginx-pot8[0] -E SHELL=/bin/csh -E HOST=freebsd -E TERM=screen-256color -E NOMAD_ALLOC_INDEX=0 -E NOMAD_ALLOC_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/alloc -E NOMAD_GROUP_NAME=nginx-pot8 -E EDITOR=vi -E SHLVL=1 -E TMUX_PANE=%2: driver=pot @module=pot timestamp=2023-09-22T12:07:56.607Z
    2023-09-22T12:07:56.659Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: : driver=pot @module=pot timestamp=2023-09-22T12:07:56.659Z
    2023-09-22T12:07:56.661Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting memory soft limit on jail: /usr/local/bin/pot set-rss -M 300M -p nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2: driver=pot @module=pot timestamp=2023-09-22T12:07:56.661Z
    2023-09-22T12:07:56.682Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching StartContainer command: driver=pot start nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2=<unknown> @module=pot timestamp=2023-09-22T12:07:56.682Z
    2023-09-22T12:07:56.683Z [DEBUG] client.driver_mgr.nomad-pot-driver: Starting container: driver=pot @module=pot psState="&{Pid:62247 ExitCode:0 Signal:0 Time:2023-09-22 12:07:56.683174421 +0000 UTC m=+0.977500498}" timestamp=2023-09-22T12:07:56.683Z
    2023-09-22T12:07:56.691Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Started msg="Task started by client" failed=false
    2023-09-22T12:07:56.797Z [DEBUG] client: updated allocations: index=3165 total=30 pulled=20 filtered=10
    2023-09-22T12:07:56.797Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:07:56.881Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:07:58.811Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching DestroyContainer command: driver=pot @module=pot destroy -p nginx-pot8_81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -F=<unknown> timestamp=2023-09-22T12:07:58.811Z
    2023-09-22T12:07:59.718Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Terminated msg="Exit Code: 0" failed=false
    2023-09-22T12:07:59.722Z [INFO]  client.alloc_runner.task_runner: restarting task: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 reason="Restart within policy" delay=18.157976165s
    2023-09-22T12:07:59.723Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Restarting msg="Task restarting in 18.157976165s" failed=false
    2023-09-22T12:07:59.860Z [DEBUG] client: updated allocations: index=3167 total=30 pulled=20 filtered=10
    2023-09-22T12:07:59.860Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:07:59.941Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:08:01.748Z [DEBUG] client: state changed, updating node and re-registering
    2023-09-22T12:08:01.764Z [INFO]  client: node registration complete
==> Newer Nomad version available: 1.6.2 (currently running: 1.5.3)
    2023-09-22T12:08:17.885Z [DEBUG] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8
    2023-09-22T12:08:17.888Z [INFO]  client.driver_mgr.nomad-pot-driver: starting task: driver=pot driver_cfg="{Image:18.203.160.62 Pot:test Tag:1.0 Alloc: Command:nginx Args:[] PortMap:map[] Name: NetworkMode:public-bridge Debug:false Verbose:false Mount:[] MountReadOnly:[] Copy:[] ExtraHosts:[]}" @module=pot timestamp=2023-09-22T12:08:17.888Z
    2023-09-22T12:08:17.892Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:08:17.892Z
    2023-09-22T12:08:17.892Z [DEBUG] client.driver_mgr.nomad-pot-driver: Checking if pot is alive: driver=pot @module=pot timestamp=2023-09-22T12:08:17.892Z
    2023-09-22T12:08:17.903Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError CheckContainerExists: driver=pot exitError=1 @module=pot timestamp=2023-09-22T12:08:17.903Z
    2023-09-22T12:08:17.903Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching createContainer command: driver=pot log="prepare -U 18.203.160.62 -p test -t 1.0 -c \"nginx\" -N public-bridge -a c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -n nginx-pot8 -v" @module=pot timestamp=2023-09-22T12:08:17.903Z
    2023-09-22T12:08:18.216Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot @module=pot /usr/local/bin/pot mount-in -p nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -m /local=<unknown> timestamp=2023-09-22T12:08:18.216Z
    2023-09-22T12:08:18.267Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot @module=pot /usr/local/bin/pot mount-in -p nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -m /secrets=<unknown> timestamp=2023-09-22T12:08:18.267Z
    2023-09-22T12:08:18.324Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: /usr/local/bin/pot set-env -p nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2  -E TERM_PROGRAM_VERSION=3.3a -E OSTYPE=FreeBSD -E NOMAD_SECRETS_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -E NOMAD_NAMESPACE=default -E PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin -E TERM=screen-256color -E USER=ec2-user -E NOMAD_JOB_NAME=example8 -E NOMAD_ALLOC_ID=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -E NOMAD_REGION=global -E SSH_CLIENT=90.191.196.82 40820 22 -E NOMAD_ALLOC_NAME=example8.nginx-pot8[0] -E LOGNAME=ec2-user -E ENV=/home/ec2-user/.shrc -E HOSTTYPE=FreeBSD -E NOMAD_JOB_ID=example8 -E SHELL=/bin/csh -E REMOTEHOST=82-196-191-90.dyn.estpak.ee -E PAGER=less -E NOMAD_MEMORY_LIMIT=300 -E NOMAD_TASK_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -E PWD=/usr/local/etc/nomad -E NOMAD_SHORT_ALLOC_ID=cb34cd8f -E HOME=/root -E NOMAD_GROUP_NAME=nginx-pot8 -E EDITOR=vi -E NOMAD_CPU_LIMIT=100 -E SHLVL=1 -E NOMAD_ALLOC_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/alloc -E TMUX=/tmp/tmux-0/default,31733,0 -E BLOCKSIZE=K -E TERM_PROGRAM=tmux -E VENDOR=amd -E NOMAD_DC=dc1 -E SSH_CONNECTION=90.191.196.82 40820 172.31.0.226 22 -E MACHTYPE=x86_64 -E NOMAD_TASK_NAME=nginx-pot8 -E MAIL=/var/mail/ec2-user -E OLDPWD=/usr/home/ec2-user -E SSH_TTY=/dev/pts/1 -E NOMAD_ALLOC_INDEX=0 -E GROUP=wheel -E HOST=freebsd -E TMUX_PANE=%2: driver=pot @module=pot timestamp=2023-09-22T12:08:18.323Z
    2023-09-22T12:08:18.356Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: : driver=pot @module=pot timestamp=2023-09-22T12:08:18.356Z
    2023-09-22T12:08:18.359Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting memory soft limit on jail: /usr/local/bin/pot set-rss -M 300M -p nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2: driver=pot @module=pot timestamp=2023-09-22T12:08:18.358Z
    2023-09-22T12:08:18.378Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching StartContainer command: driver=pot @module=pot start nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2=<unknown> timestamp=2023-09-22T12:08:18.378Z
    2023-09-22T12:08:18.380Z [DEBUG] client.driver_mgr.nomad-pot-driver: Starting container: driver=pot @module=pot psState="&{Pid:63468 ExitCode:0 Signal:0 Time:2023-09-22 12:08:18.380000362 +0000 UTC m=+22.674326579}" timestamp=2023-09-22T12:08:18.380Z
    2023-09-22T12:08:18.385Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Started msg="Task started by client" failed=false
    2023-09-22T12:08:18.516Z [DEBUG] client: updated allocations: index=3169 total=30 pulled=20 filtered=10
    2023-09-22T12:08:18.516Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:08:18.599Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:08:20.413Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching DestroyContainer command: driver=pot @module=pot destroy -p nginx-pot8_c31c7138_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -F=<unknown> timestamp=2023-09-22T12:08:20.412Z
    2023-09-22T12:08:21.417Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Terminated msg="Exit Code: 0" failed=false
    2023-09-22T12:08:21.422Z [INFO]  client.alloc_runner.task_runner: restarting task: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 reason="Restart within policy" delay=17.164426139s
    2023-09-22T12:08:21.422Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Restarting msg="Task restarting in 17.164426139s" failed=false
    2023-09-22T12:08:21.577Z [DEBUG] client: updated allocations: index=3170 total=30 pulled=20 filtered=10
    2023-09-22T12:08:21.578Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:08:21.666Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:08:38.591Z [DEBUG] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8
    2023-09-22T12:08:38.594Z [INFO]  client.driver_mgr.nomad-pot-driver: starting task: driver=pot driver_cfg="{Image:18.203.160.62 Pot:test Tag:1.0 Alloc: Command:nginx Args:[] PortMap:map[] Name: NetworkMode:public-bridge Debug:false Verbose:false Mount:[] MountReadOnly:[] Copy:[] ExtraHosts:[]}" @module=pot timestamp=2023-09-22T12:08:38.594Z
    2023-09-22T12:08:38.599Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError checkContainerAlive: driver=pot @module=pot exitStatus=1 timestamp=2023-09-22T12:08:38.599Z
    2023-09-22T12:08:38.599Z [DEBUG] client.driver_mgr.nomad-pot-driver: Checking if pot is alive: driver=pot @module=pot timestamp=2023-09-22T12:08:38.599Z
    2023-09-22T12:08:38.615Z [ERROR] client.driver_mgr.nomad-pot-driver: ExitError CheckContainerExists: driver=pot exitError=1 @module=pot timestamp=2023-09-22T12:08:38.615Z
    2023-09-22T12:08:38.615Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching createContainer command: driver=pot log="prepare -U 18.203.160.62 -p test -t 1.0 -c \"nginx\" -N public-bridge -a ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -n nginx-pot8 -v" @module=pot timestamp=2023-09-22T12:08:38.615Z
    2023-09-22T12:08:38.892Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot @module=pot /usr/local/bin/pot mount-in -p nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -m /local=<unknown> timestamp=2023-09-22T12:08:38.891Z
    2023-09-22T12:08:38.952Z [DEBUG] client.driver_mgr.nomad-pot-driver: Mounting files on jail: : driver=pot @module=pot /usr/local/bin/pot mount-in -p nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -d /home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -m /secrets=<unknown> timestamp=2023-09-22T12:08:38.952Z
    2023-09-22T12:08:39.005Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: /usr/local/bin/pot set-env -p nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2  -E NOMAD_REGION=global -E LOGNAME=ec2-user -E TMUX_PANE=%2 -E PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin -E NOMAD_ALLOC_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/alloc -E SHELL=/bin/csh -E EDITOR=vi -E SHLVL=1 -E SSH_TTY=/dev/pts/1 -E NOMAD_JOB_ID=example8 -E NOMAD_NAMESPACE=default -E HOME=/root -E BLOCKSIZE=K -E ENV=/home/ec2-user/.shrc -E TERM_PROGRAM_VERSION=3.3a -E NOMAD_TASK_NAME=nginx-pot8 -E NOMAD_SHORT_ALLOC_ID=cb34cd8f -E NOMAD_DC=dc1 -E OLDPWD=/usr/home/ec2-user -E USER=ec2-user -E NOMAD_ALLOC_INDEX=0 -E GROUP=wheel -E SSH_CONNECTION=90.191.196.82 40820 172.31.0.226 22 -E TMUX=/tmp/tmux-0/default,31733,0 -E VENDOR=amd -E NOMAD_SECRETS_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/secrets -E REMOTEHOST=82-196-191-90.dyn.estpak.ee -E TERM_PROGRAM=tmux -E NOMAD_ALLOC_NAME=example8.nginx-pot8[0] -E NOMAD_MEMORY_LIMIT=300 -E SSH_CLIENT=90.191.196.82 40820 22 -E NOMAD_CPU_LIMIT=100 -E TERM=screen-256color -E PAGER=less -E NOMAD_JOB_NAME=example8 -E MAIL=/var/mail/ec2-user -E OSTYPE=FreeBSD -E MACHTYPE=x86_64 -E NOMAD_GROUP_NAME=nginx-pot8 -E HOST=freebsd -E PWD=/usr/local/etc/nomad -E NOMAD_ALLOC_ID=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -E NOMAD_TASK_DIR=/home/ec2-user/nomad-client/alloc/cb34cd8f-388d-7a38-e3fe-e01b62c93aa2/nginx-pot8/local -E HOSTTYPE=FreeBSD: driver=pot @module=pot timestamp=2023-09-22T12:08:39.004Z
    2023-09-22T12:08:39.046Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting env variables inside the pot: : driver=pot @module=pot timestamp=2023-09-22T12:08:39.046Z
    2023-09-22T12:08:39.049Z [DEBUG] client.driver_mgr.nomad-pot-driver: Setting memory soft limit on jail: /usr/local/bin/pot set-rss -M 300M -p nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2: driver=pot @module=pot timestamp=2023-09-22T12:08:39.049Z
    2023-09-22T12:08:39.067Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching StartContainer command: driver=pot start nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2=<unknown> @module=pot timestamp=2023-09-22T12:08:39.067Z
    2023-09-22T12:08:39.069Z [DEBUG] client.driver_mgr.nomad-pot-driver: Starting container: driver=pot @module=pot psState="&{Pid:64725 ExitCode:0 Signal:0 Time:2023-09-22 12:08:39.068816432 +0000 UTC m=+43.363142599}" timestamp=2023-09-22T12:08:39.068Z
    2023-09-22T12:08:39.073Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Started msg="Task started by client" failed=false
    2023-09-22T12:08:39.239Z [DEBUG] client: updated allocations: index=3171 total=30 pulled=20 filtered=10
    2023-09-22T12:08:39.239Z [DEBUG] client: allocation updates: added=0 removed=0 updated=20 ignored=10
    2023-09-22T12:08:39.326Z [DEBUG] client: allocation updates applied: added=0 removed=0 updated=20 ignored=10 errors=0
    2023-09-22T12:08:41.115Z [DEBUG] client.driver_mgr.nomad-pot-driver: launching DestroyContainer command: driver=pot @module=pot destroy -p nginx-pot8_ac3fa2ca_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -F=<unknown> timestamp=2023-09-22T12:08:41.115Z
    2023-09-22T12:08:42.090Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type=Terminated msg="Exit Code: 0" failed=false
    2023-09-22T12:08:42.095Z [INFO]  client.alloc_runner.task_runner: not restarting task: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 reason="Exceeded allowed attempts 2 in interval 30m0s and mode is \"fail\""
    2023-09-22T12:08:42.095Z [INFO]  client.alloc_runner.task_runner: Task event: alloc_id=cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 task=nginx-pot8 type="Not Restarting" msg="Exceeded allowed attempts 2 in interval 30m0s and mode is \"fail\"" failed=true
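
For reference, this is the kind of command I ran by hand as root (values copied from the createContainer line in the log above); pot then prints its own, much more useful error output:

pot prepare -U 18.203.160.62 -p test -t 1.0 -c "nginx" -N public-bridge -a 81ded15e_cb34cd8f-388d-7a38-e3fe-e01b62c93aa2 -n nginx-pot8 -v
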
grembo commented 1 year ago

@OneOfTheJohns Could you share your job definition?

OneOfTheJohns commented 1 year ago

Yes, it's quite a simple one, but all the essential parameters should be defined there:

job "example8" {
        datacenters = ["dc1"]
        type = "service"
        task "nginx-pot8" {
                driver = "pot"
                config {
                        image = "18.203.**.**"
                        pot = "test"
                        tag = "1.0"
                        command = "nginx"
                        network_mode = "public-bridge"
                }
        }
}

grembo commented 1 year ago

@OneOfTheJohns Based on your example I tried the one below, which runs fine here (not on AWS, but a very simple dev setup - no port mapping, health checks, etc., of course):

job "example9" {
        datacenters = ["dc1"]
        type = "service"
        task "nginx-pot0" {
                driver = "pot"
                config {
                        image = "https://potluck.honeyguide.net/nginx-nomad"
                        pot = "nginx-nomad-amd64-13_1"
                        tag = "1.1.13"
                        command = "nginx"
                        args = ["-g","'daemon off;'"]
                        network_mode = "public-bridge"
                }
        }
}

The key is that when running nomad jobs, the container needs an executable that stays in the foreground (its PID is what gets killed to stop the job and what is used to determine whether things stopped working).

In the case of nginx, args = ["-g","'daemon off;'"] gets the job done.
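
A quick way to see the difference outside of nomad (plain shell, nothing pot-specific):

nginx -g 'daemon off;'   # stays attached until it is stopped
nginx                    # default "daemon on": the master forks and the starting process
                         # exits right away, which nomad records as "Exit Code: 0"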

You can see this in nomad-ui, which shows Exit Code: 0 for the task right after starting it:

[screenshot: nomad-ui showing Exit Code: 0 right after the task starts]

OneOfTheJohns commented 1 year ago

Ahhhh, well, now I know. Thank you very much for your help.