saltstack / salt


lingering connections on the master #34719

Closed d3xt3r01 closed 6 years ago

d3xt3r01 commented 8 years ago

There is just ONE out of 10 minions that has this issue. After a few days, hundreds of "ESTABLISHED" connections linger on the master, yet only one shows up on the minion.

salt-master # salt --versions-report
Salt Version:
           Salt: 2016.3.1

Dependency Versions:
           cffi: 1.5.2
       cherrypy: 3.3.0
       dateutil: 2.4.2
          gitdb: Not Installed
      gitpython: Not Installed
          ioflo: Not Installed
         Jinja2: 2.8
        libgit2: Not Installed
        libnacl: Not Installed
       M2Crypto: Not Installed
           Mako: Not Installed
   msgpack-pure: Not Installed
 msgpack-python: 0.4.6
   mysql-python: 1.2.5
      pycparser: 2.14
       pycrypto: 2.6.1
         pygit2: Not Installed
         Python: 2.7.10 (default, Nov 18 2015, 16:28:37)
   python-gnupg: Not Installed
         PyYAML: 3.11
          PyZMQ: 14.4.1
           RAET: Not Installed
          smmap: Not Installed
        timelib: Not Installed
        Tornado: 4.2.1
            ZMQ: 4.1.4

System Versions:
           dist: gentoo 2.2
        machine: x86_64
        release: 4.5.2-aufs
         system: Linux
        version: Gentoo Base System 2.2
salt-minion # salt-minion --versions-report
Salt Version:
           Salt: 2016.3.1

Dependency Versions:
           cffi: Not Installed
       cherrypy: Not Installed
       dateutil: Not Installed
          gitdb: Not Installed
      gitpython: Not Installed
          ioflo: Not Installed
         Jinja2: 2.7.2
        libgit2: Not Installed
        libnacl: Not Installed
       M2Crypto: 0.21.1
           Mako: Not Installed
   msgpack-pure: Not Installed
 msgpack-python: 0.4.7
   mysql-python: Not Installed
      pycparser: Not Installed
       pycrypto: 2.6.1
         pygit2: Not Installed
         Python: 2.7.5 (default, Nov 20 2015, 02:00:19)
   python-gnupg: Not Installed
         PyYAML: 3.11
          PyZMQ: 14.7.0
           RAET: Not Installed
          smmap: Not Installed
        timelib: Not Installed
        Tornado: 4.2.1
            ZMQ: 4.0.5

System Versions:
           dist: centos 7.2.1511 Core
        machine: x86_64
        release: 3.10.0-327.10.1.el7.x86_64
         system: Linux
        version: CentOS Linux 7.2.1511 Core
salt-master # grep -vE '(^$|\#)' /etc/salt/master
interface: x.x.1.95
ipv6: False
open_mode: False
hash_type: sha256
log_level: trace
salt-minion # grep -vE '(^$|\#)' /etc/salt/minion
master: salt_master_host
master_alive_interval: 30
ipv6: False
id: europe-ireland-azure
hash_type: sha256
log_level: trace
salt-master # while true ; do date ; netstat -plantu | grep -i python | grep x.x.196.12 ; sleep 30 ; done
Sat Jul 16 19:39:02 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
Sat Jul 16 19:39:32 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
Sat Jul 16 19:40:02 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
Sat Jul 16 19:40:32 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
tcp        0      0 x.x.1.95:4505       x.x.196.12:50588      ESTABLISHED 14060/python2.7
Sat Jul 16 19:41:02 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
tcp        0      0 x.x.1.95:4505       x.x.196.12:50588      ESTABLISHED 14060/python2.7
Sat Jul 16 19:41:32 EEST 2016
tcp        0      0 x.x.1.95:4505       x.x.196.12:50436      ESTABLISHED 14060/python2.7
tcp        0      0 x.x.1.95:4505       x.x.196.12:50588      ESTABLISHED 14060/python2.7
salt-minion # while true ; do date ; netstat -plantu | grep -i python | grep x.x.171.49 ; sleep 30 ; done
Sat Jul 16 19:39:02 EEST 2016
tcp        0      0 x.x.0.5:50436       x.x.171.49:4505       ESTABLISHED 48610/python2.7
Sat Jul 16 19:39:32 EEST 2016
tcp        0      0 x.x.0.5:50436       x.x.171.49:4505       ESTABLISHED 48610/python2.7
Sat Jul 16 19:40:02 EEST 2016
tcp        0      0 x.x.0.5:50436       x.x.171.49:4505       ESTABLISHED 48610/python2.7
Sat Jul 16 19:40:32 EEST 2016
tcp        0      0 x.x.0.5:50588       x.x.171.49:4505       ESTABLISHED 48610/python2.7
Sat Jul 16 19:41:02 EEST 2016
tcp        0      0 x.x.0.5:50588       x.x.171.49:4505       ESTABLISHED 48610/python2.7
Sat Jul 16 19:41:32 EEST 2016
tcp        0      0 x.x.0.5:50588       x.x.171.49:4505       ESTABLISHED 48610/python2.7
salt-minion # tail -f /var/log/salt/minion
2016-07-16 16:39:00,826 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:10,827 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:20,828 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:24,461 [salt.utils.schedule][INFO    ][48610] Running scheduled job: __master_alive
2016-07-16 16:39:24,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
2016-07-16 16:39:24,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with a max number of 1
2016-07-16 16:39:24,503 [salt.utils.schedule][DEBUG   ][48919] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20160716163924472948', 'schedule': '__master_alive', 'pid': 48919, 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:39:24,552 [salt.utils.event ][DEBUG   ][48919] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_6879782621_pub.ipc
2016-07-16 16:39:24,553 [salt.utils.event ][DEBUG   ][48919] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:24,561 [salt.transport.ipc][DEBUG   ][48919] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:24,571 [salt.transport.ipc][TRACE   ][48919] IPCClient: Connecting to socket: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:24,572 [salt.transport.ipc][TRACE   ][48610] IPCServer: Handling connection to address:
2016-07-16 16:39:24,573 [salt.utils.event ][DEBUG   ][48919] Sending event - data = {'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': 'req', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'cmd': '_return', 'pid': 4891
9, '_stamp': '2016-07-16T16:39:24.572495', 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:39:24,582 [salt.minion      ][DEBUG   ][48610] Handling event tag '__schedule_return'
2016-07-16 16:39:24,582 [salt.minion      ][INFO    ][48610] Returning information for job: req
2016-07-16 16:39:24,582 [salt.transport.zeromq][DEBUG   ][48610] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes')
2016-07-16 16:39:24,591 [salt.crypt       ][DEBUG   ][48610] Re-using AsyncAuth for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506')
2016-07-16 16:39:24,592 [salt.transport.zeromq][TRACE   ][48610] Inserted key into loop_instance_map id 56920904 for key ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes') and process 48610
2016-07-16 16:39:24,601 [salt.minion      ][TRACE   ][48610] ret_val = <tornado.concurrent.Future object at 0x3809ad0>
2016-07-16 16:39:24,602 [salt.utils.schedule][DEBUG   ][48919] schedule.handle_func: Removing /var/cache/salt/minion/proc/20160716163924472948
2016-07-16 16:39:24,611 [salt.transport.ipc][TRACE   ][48610] Client disconnected from IPC /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:30,829 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:40,831 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:50,832 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:39:54,461 [salt.utils.schedule][INFO    ][48610] Running scheduled job: __master_alive
2016-07-16 16:39:54,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
2016-07-16 16:39:54,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with a max number of 1
2016-07-16 16:39:54,494 [salt.utils.schedule][DEBUG   ][48924] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20160716163954473026', 'schedule': '__master_alive', 'pid': 48924, 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:39:54,610 [salt.utils.event ][DEBUG   ][48924] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_6879782621_pub.ipc
2016-07-16 16:39:54,618 [salt.utils.event ][DEBUG   ][48924] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:54,619 [salt.transport.ipc][DEBUG   ][48924] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:54,628 [salt.transport.ipc][TRACE   ][48924] IPCClient: Connecting to socket: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:39:54,629 [salt.transport.ipc][TRACE   ][48610] IPCServer: Handling connection to address:
2016-07-16 16:39:54,637 [salt.utils.event ][DEBUG   ][48924] Sending event - data = {'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': 'req', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'cmd': '_return', 'pid': 4892
4, '_stamp': '2016-07-16T16:39:54.637218', 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:39:54,639 [salt.minion      ][DEBUG   ][48610] Handling event tag '__schedule_return'
2016-07-16 16:39:54,639 [salt.minion      ][INFO    ][48610] Returning information for job: req
2016-07-16 16:39:54,647 [salt.utils.schedule][DEBUG   ][48924] schedule.handle_func: Removing /var/cache/salt/minion/proc/20160716163954473026
2016-07-16 16:39:54,656 [salt.transport.zeromq][DEBUG   ][48610] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes')
2016-07-16 16:39:54,656 [salt.crypt       ][DEBUG   ][48610] Re-using AsyncAuth for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506')
2016-07-16 16:39:54,657 [salt.transport.zeromq][TRACE   ][48610] Inserted key into loop_instance_map id 56920904 for key ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes') and process 48610
2016-07-16 16:39:54,665 [salt.minion      ][TRACE   ][48610] ret_val = <tornado.concurrent.Future object at 0x3809e90>
2016-07-16 16:39:54,666 [salt.transport.ipc][TRACE   ][48610] Client disconnected from IPC /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:00,833 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:10,834 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:20,836 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:24,460 [salt.utils.schedule][INFO    ][48610] Running scheduled job: __master_alive
2016-07-16 16:40:24,461 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
2016-07-16 16:40:24,461 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with a max number of 1
2016-07-16 16:40:24,532 [salt.utils.schedule][DEBUG   ][48937] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20160716164024472176', 'schedule': '__master_alive', 'pid': 48937, 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:40:24,563 [salt.utils.event ][DEBUG   ][48937] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_6879782621_pub.ipc
2016-07-16 16:40:24,571 [salt.utils.event ][DEBUG   ][48937] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:24,572 [salt.transport.ipc][DEBUG   ][48937] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:24,581 [salt.transport.ipc][TRACE   ][48937] IPCClient: Connecting to socket: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:24,582 [salt.transport.ipc][TRACE   ][48610] IPCServer: Handling connection to address:
2016-07-16 16:40:24,600 [salt.utils.event ][DEBUG   ][48937] Sending event - data = {'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': 'req', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'cmd': '_return', 'pid': 4893
7, '_stamp': '2016-07-16T16:40:24.599641', 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:40:24,601 [salt.minion      ][DEBUG   ][48610] Handling event tag '__schedule_return'
2016-07-16 16:40:24,601 [salt.minion      ][INFO    ][48610] Returning information for job: req
2016-07-16 16:40:24,601 [salt.transport.zeromq][DEBUG   ][48610] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes')
2016-07-16 16:40:24,609 [salt.utils.schedule][DEBUG   ][48937] schedule.handle_func: Removing /var/cache/salt/minion/proc/20160716164024472176
2016-07-16 16:40:24,618 [salt.crypt       ][DEBUG   ][48610] Re-using AsyncAuth for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506')
2016-07-16 16:40:24,619 [salt.transport.zeromq][TRACE   ][48610] Inserted key into loop_instance_map id 56920904 for key ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes') and process 48610
2016-07-16 16:40:24,619 [salt.minion      ][TRACE   ][48610] ret_val = <tornado.concurrent.Future object at 0x3809b50>
2016-07-16 16:40:24,627 [salt.transport.ipc][TRACE   ][48610] Client disconnected from IPC /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:30,837 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:40,839 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:50,840 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:40:54,461 [salt.utils.schedule][INFO    ][48610] Running scheduled job: __master_alive
2016-07-16 16:40:54,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
2016-07-16 16:40:54,462 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with a max number of 1
2016-07-16 16:40:54,508 [salt.utils.schedule][DEBUG   ][48942] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20160716164054484934', 'schedule': '__master_alive', 'pid': 48942, 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:40:54,594 [salt.utils.event ][DEBUG   ][48942] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_6879782621_pub.ipc
2016-07-16 16:40:54,616 [salt.utils.event ][DEBUG   ][48942] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:54,617 [salt.transport.ipc][DEBUG   ][48942] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:54,627 [salt.transport.ipc][TRACE   ][48942] IPCClient: Connecting to socket: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:40:54,627 [salt.transport.ipc][TRACE   ][48610] IPCServer: Handling connection to address:
2016-07-16 16:40:54,636 [salt.utils.event ][DEBUG   ][48942] Sending event - data = {'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': 'req', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'cmd': '_return', 'pid': 4894
2, '_stamp': '2016-07-16T16:40:54.636055', 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:40:54,637 [salt.utils.schedule][DEBUG   ][48942] schedule.handle_func: Removing /var/cache/salt/minion/proc/20160716164054484934
2016-07-16 16:40:54,646 [salt.minion      ][DEBUG   ][48610] Handling event tag '__schedule_return'
2016-07-16 16:40:54,647 [salt.minion      ][INFO    ][48610] Returning information for job: req
2016-07-16 16:40:54,648 [salt.transport.zeromq][DEBUG   ][48610] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes')
2016-07-16 16:40:54,656 [salt.crypt       ][DEBUG   ][48610] Re-using AsyncAuth for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506')
2016-07-16 16:40:54,656 [salt.transport.zeromq][TRACE   ][48610] Inserted key into loop_instance_map id 56920904 for key ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes') and process 48610
2016-07-16 16:40:54,657 [salt.minion      ][TRACE   ][48610] ret_val = <tornado.concurrent.Future object at 0x3809e90>
2016-07-16 16:40:54,676 [salt.transport.ipc][TRACE   ][48610] Client disconnected from IPC /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:41:00,842 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:41:10,843 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:41:20,845 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:41:24,460 [salt.utils.schedule][INFO    ][48610] Running scheduled job: __master_alive
2016-07-16 16:41:24,461 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
2016-07-16 16:41:24,461 [salt.utils.schedule][DEBUG   ][48610] schedule: This job was scheduled with a max number of 1
2016-07-16 16:41:24,510 [salt.utils.schedule][DEBUG   ][48947] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20160716164124471269', 'schedule': '__master_alive', 'pid': 48947, 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:41:24,599 [salt.utils.event ][DEBUG   ][48947] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_6879782621_pub.ipc
2016-07-16 16:41:24,607 [salt.utils.event ][DEBUG   ][48947] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:41:24,608 [salt.transport.ipc][DEBUG   ][48947] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:41:24,609 [salt.transport.ipc][TRACE   ][48947] IPCClient: Connecting to socket: /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:41:24,618 [salt.transport.ipc][TRACE   ][48610] IPCServer: Handling connection to address:
2016-07-16 16:41:24,619 [salt.utils.event ][DEBUG   ][48947] Sending event - data = {'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': 'req', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'cmd': '_return', 'pid': 4894
7, '_stamp': '2016-07-16T16:41:24.618759', 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 16:41:24,628 [salt.minion      ][DEBUG   ][48610] Handling event tag '__schedule_return'
2016-07-16 16:41:24,628 [salt.minion      ][INFO    ][48610] Returning information for job: req
2016-07-16 16:41:24,628 [salt.transport.zeromq][DEBUG   ][48610] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes')
2016-07-16 16:41:24,629 [salt.crypt       ][DEBUG   ][48610] Re-using AsyncAuth for ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506')
2016-07-16 16:41:24,638 [salt.utils.schedule][DEBUG   ][48947] schedule.handle_func: Removing /var/cache/salt/minion/proc/20160716164124471269
2016-07-16 16:41:24,647 [salt.transport.zeromq][TRACE   ][48610] Inserted key into loop_instance_map id 56920904 for key ('/etc/salt/pki/minion', 'europe-ireland-azure', 'tcp://x.x.171.49:4506', 'aes') and process 48610
2016-07-16 16:41:24,648 [salt.minion      ][TRACE   ][48610] ret_val = <tornado.concurrent.Future object at 0x3747e50>
2016-07-16 16:41:24,648 [salt.transport.ipc][TRACE   ][48610] Client disconnected from IPC /var/run/salt/minion/minion_event_6879782621_pull.ipc
2016-07-16 16:41:30,846 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:41:40,847 [salt.utils.process][TRACE   ][48610] Process manager iteration
2016-07-16 16:41:50,848 [salt.utils.process][TRACE   ][48610] Process manager iteration
salt-master # tail -f /var/log/salt/master
2016-07-16 19:39:04,799 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:09,803 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:14,808 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:19,811 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:24,786 [salt.master      ][TRACE   ][14150] AES payload received with command _return
2016-07-16 19:39:24,786 [salt.loaded.int.returner.local_cache][DEBUG   ][14150] Adding minions for job 20160716193924785954: ['europe-ireland-azure']
2016-07-16 19:39:24,786 [salt.utils.job   ][INFO    ][14150] Got return from europe-ireland-azure for job 20160716193924785954
2016-07-16 19:39:24,787 [salt.utils.event ][DEBUG   ][14150] Sending event - data = {'tgt_type': 'glob', 'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': '20160716193924785954', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_
alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48919, '_stamp': '2016-07-16T16:39:24.786534', 'arg': [{'connected': True, 'master': 'salt_master_host'}], 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 19:39:24,787 [salt.master      ][TRACE   ][14150] Master function call _return took 0.00113105773926 seconds
2016-07-16 19:39:24,787 [salt.utils.event ][TRACE   ][14128] get_event() received = {'tag': 'salt/job/20160716193924785954/ret/europe-ireland-azure', 'data': {'tgt_type': 'glob', 'fun_args': [{'master': 'salt_master_host', 'connected': True}], 'jid': '201607161939247859
54', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48919, '_stamp': '2016-07-16T16:39:24.786534', 'arg': [{'master': 'salt_master_host', 'connected': True}], 'fun': 'status.master', '
id': 'europe-ireland-azure'}}
2016-07-16 19:39:24,787 [salt.utils.reactor][DEBUG   ][14128] Gathering reactors for tag salt/job/20160716193924785954/ret/europe-ireland-azure
2016-07-16 19:39:29,791 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:34,795 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:37,816 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:39:37,854 [salt.config      ][DEBUG   ][14064] Including configuration from '/etc/salt/minion.d/_schedule.conf'
2016-07-16 19:39:37,854 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/minion.d/_schedule.conf
2016-07-16 19:39:37,856 [salt.config      ][DEBUG   ][14064] Guessing ID. The id can be explicitly set in /etc/salt/minion
2016-07-16 19:39:37,863 [salt.config      ][INFO    ][14064] Found minion id from generate_minion_id(): salt_master_host
2016-07-16 19:39:37,864 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:39:37,904 [salt.loader      ][TRACE   ][14064] Loading core.hwaddr_interfaces grain
2016-07-16 19:39:37,910 [salt.loader      ][TRACE   ][14064] Loading core.hostname grain
2016-07-16 19:39:37,911 [salt.loader      ][TRACE   ][14064] Loading core.get_master grain
2016-07-16 19:39:37,911 [salt.loader      ][TRACE   ][14064] Loading core.pythonversion grain
2016-07-16 19:39:37,911 [salt.loader      ][TRACE   ][14064] Loading core.path grain
2016-07-16 19:39:37,911 [salt.loader      ][TRACE   ][14064] Loading core.get_server_id grain
2016-07-16 19:39:37,911 [salt.loader      ][TRACE   ][14064] Loading core.ip6 grain
2016-07-16 19:39:37,917 [salt.loader      ][TRACE   ][14064] Loading core.ip4 grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.saltversion grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.saltpath grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.pythonexecutable grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip4 grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.dns grain
2016-07-16 19:39:37,924 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip6 grain
2016-07-16 19:39:37,925 [salt.loader      ][TRACE   ][14064] Loading core.ip6_interfaces grain
2016-07-16 19:39:37,925 [salt.loader      ][TRACE   ][14064] Loading core.ip4_interfaces grain
2016-07-16 19:39:37,925 [salt.loader      ][TRACE   ][14064] Loading core.append_domain grain
2016-07-16 19:39:37,925 [salt.loader      ][TRACE   ][14064] Loading core.os_data grain
2016-07-16 19:39:37,963 [salt.loader      ][TRACE   ][14064] Loading core.zmqversion grain
2016-07-16 19:39:37,963 [salt.loader      ][TRACE   ][14064] Loading core.saltversioninfo grain
2016-07-16 19:39:37,963 [salt.loader      ][TRACE   ][14064] Loading core.pythonpath grain
2016-07-16 19:39:37,963 [salt.loader      ][TRACE   ][14064] Loading core.id_ grain
2016-07-16 19:39:37,963 [salt.loader      ][TRACE   ][14064] Loading core.locale_info grain
2016-07-16 19:39:37,964 [salt.loader      ][TRACE   ][14064] Loading core.get_machine_id grain
2016-07-16 19:39:37,964 [salt.loader      ][TRACE   ][14064] Loading core.ip_interfaces grain
2016-07-16 19:39:37,964 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sda reports itself as an HDD
2016-07-16 19:39:37,964 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdb reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdc reports itself as an SSD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdd reports itself as an SSD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sde reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdf reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdg reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdh reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdi reports itself as an HDD
2016-07-16 19:39:37,965 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdj reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdk reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdl reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sr0 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop0 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop1 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop2 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop3 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop4 reports itself as an HDD
2016-07-16 19:39:37,966 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop5 reports itself as an HDD
2016-07-16 19:39:37,967 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop6 reports itself as an HDD
2016-07-16 19:39:37,967 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop7 reports itself as an HDD
2016-07-16 19:39:37,976 [salt.utils.lazy  ][DEBUG   ][14064] LazyLoaded local_cache.clean_old_jobs
2016-07-16 19:39:37,987 [salt.utils.lazy  ][DEBUG   ][14064] Could not LazyLoad config.merge
2016-07-16 19:39:37,987 [salt.fileserver  ][DEBUG   ][14064] Updating roots fileserver cache
2016-07-16 19:39:37,987 [salt.utils.verify][DEBUG   ][14064] This salt-master instance has accepted 9 minion keys.
2016-07-16 19:39:39,799 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:44,804 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:49,807 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:54,811 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:39:54,844 [salt.master      ][TRACE   ][14149] AES payload received with command _return
2016-07-16 19:39:54,845 [salt.loaded.int.returner.local_cache][DEBUG   ][14149] Adding minions for job 20160716193954844333: ['europe-ireland-azure']
2016-07-16 19:39:54,845 [salt.utils.job   ][INFO    ][14149] Got return from europe-ireland-azure for job 20160716193954844333
2016-07-16 19:39:54,845 [salt.utils.event ][DEBUG   ][14149] Sending event - data = {'tgt_type': 'glob', 'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': '20160716193954844333', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_
alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48924, '_stamp': '2016-07-16T16:39:54.844897', 'arg': [{'connected': True, 'master': 'salt_master_host'}], 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 19:39:54,845 [salt.master      ][TRACE   ][14149] Master function call _return took 0.00110101699829 seconds
2016-07-16 19:39:54,846 [salt.utils.event ][TRACE   ][14128] get_event() received = {'tag': 'salt/job/20160716193954844333/ret/europe-ireland-azure', 'data': {'tgt_type': 'glob', 'fun_args': [{'master': 'salt_master_host', 'connected': True}], 'jid': '201607161939548443
33', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48924, '_stamp': '2016-07-16T16:39:54.844897', 'arg': [{'master': 'salt_master_host', 'connected': True}], 'fun': 'status.master', '
id': 'europe-ireland-azure'}}
2016-07-16 19:39:54,846 [salt.utils.reactor][DEBUG   ][14128] Gathering reactors for tag salt/job/20160716193954844333/ret/europe-ireland-azure
2016-07-16 19:39:59,850 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:04,853 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:09,857 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:14,861 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:19,865 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:24,808 [salt.master      ][TRACE   ][14166] AES payload received with command _return
2016-07-16 19:40:24,808 [salt.loaded.int.returner.local_cache][DEBUG   ][14166] Adding minions for job 20160716194024807797: ['europe-ireland-azure']
2016-07-16 19:40:24,808 [salt.utils.job   ][INFO    ][14166] Got return from europe-ireland-azure for job 20160716194024807797
2016-07-16 19:40:24,808 [salt.utils.event ][DEBUG   ][14166] Sending event - data = {'tgt_type': 'glob', 'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': '20160716194024807797', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_
alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48937, '_stamp': '2016-07-16T16:40:24.808386', 'arg': [{'connected': True, 'master': 'salt_master_host'}], 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 19:40:24,809 [salt.master      ][TRACE   ][14166] Master function call _return took 0.00112318992615 seconds
2016-07-16 19:40:24,809 [salt.utils.event ][TRACE   ][14128] get_event() received = {'tag': 'salt/job/20160716194024807797/ret/europe-ireland-azure', 'data': {'tgt_type': 'glob', 'fun_args': [{'master': 'salt_master_host', 'connected': True}], 'jid': '201607161940248077
97', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48937, '_stamp': '2016-07-16T16:40:24.808386', 'arg': [{'master': 'salt_master_host', 'connected': True}], 'fun': 'status.master', '
id': 'europe-ireland-azure'}}
2016-07-16 19:40:24,809 [salt.utils.reactor][DEBUG   ][14128] Gathering reactors for tag salt/job/20160716194024807797/ret/europe-ireland-azure
2016-07-16 19:40:29,814 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:34,817 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:38,047 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:40:38,084 [salt.config      ][DEBUG   ][14064] Including configuration from '/etc/salt/minion.d/_schedule.conf'
2016-07-16 19:40:38,084 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/minion.d/_schedule.conf
2016-07-16 19:40:38,086 [salt.config      ][DEBUG   ][14064] Guessing ID. The id can be explicitly set in /etc/salt/minion
2016-07-16 19:40:38,093 [salt.config      ][INFO    ][14064] Found minion id from generate_minion_id(): salt_master_host
2016-07-16 19:40:38,094 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:40:38,133 [salt.loader      ][TRACE   ][14064] Loading core.hwaddr_interfaces grain
2016-07-16 19:40:38,140 [salt.loader      ][TRACE   ][14064] Loading core.hostname grain
2016-07-16 19:40:38,140 [salt.loader      ][TRACE   ][14064] Loading core.get_master grain
2016-07-16 19:40:38,140 [salt.loader      ][TRACE   ][14064] Loading core.pythonversion grain
2016-07-16 19:40:38,140 [salt.loader      ][TRACE   ][14064] Loading core.path grain
2016-07-16 19:40:38,141 [salt.loader      ][TRACE   ][14064] Loading core.get_server_id grain
2016-07-16 19:40:38,141 [salt.loader      ][TRACE   ][14064] Loading core.ip6 grain
2016-07-16 19:40:38,147 [salt.loader      ][TRACE   ][14064] Loading core.ip4 grain
2016-07-16 19:40:38,153 [salt.loader      ][TRACE   ][14064] Loading core.saltversion grain
2016-07-16 19:40:38,154 [salt.loader      ][TRACE   ][14064] Loading core.saltpath grain
2016-07-16 19:40:38,154 [salt.loader      ][TRACE   ][14064] Loading core.pythonexecutable grain
2016-07-16 19:40:38,154 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip4 grain
2016-07-16 19:40:38,154 [salt.loader      ][TRACE   ][14064] Loading core.dns grain
2016-07-16 19:40:38,154 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip6 grain
2016-07-16 19:40:38,155 [salt.loader      ][TRACE   ][14064] Loading core.ip6_interfaces grain
2016-07-16 19:40:38,155 [salt.loader      ][TRACE   ][14064] Loading core.ip4_interfaces grain
2016-07-16 19:40:38,155 [salt.loader      ][TRACE   ][14064] Loading core.append_domain grain
2016-07-16 19:40:38,155 [salt.loader      ][TRACE   ][14064] Loading core.os_data grain
2016-07-16 19:40:38,194 [salt.loader      ][TRACE   ][14064] Loading core.zmqversion grain
2016-07-16 19:40:38,194 [salt.loader      ][TRACE   ][14064] Loading core.saltversioninfo grain
2016-07-16 19:40:38,194 [salt.loader      ][TRACE   ][14064] Loading core.pythonpath grain
2016-07-16 19:40:38,194 [salt.loader      ][TRACE   ][14064] Loading core.id_ grain
2016-07-16 19:40:38,195 [salt.loader      ][TRACE   ][14064] Loading core.locale_info grain
2016-07-16 19:40:38,195 [salt.loader      ][TRACE   ][14064] Loading core.get_machine_id grain
2016-07-16 19:40:38,195 [salt.loader      ][TRACE   ][14064] Loading core.ip_interfaces grain
2016-07-16 19:40:38,195 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sda reports itself as an HDD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdb reports itself as an HDD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdc reports itself as an SSD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdd reports itself as an SSD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sde reports itself as an HDD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdf reports itself as an HDD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdg reports itself as an HDD
2016-07-16 19:40:38,196 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdh reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdi reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdj reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdk reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdl reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sr0 reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop0 reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop1 reports itself as an HDD
2016-07-16 19:40:38,197 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop2 reports itself as an HDD
2016-07-16 19:40:38,198 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop3 reports itself as an HDD
2016-07-16 19:40:38,198 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop4 reports itself as an HDD
2016-07-16 19:40:38,198 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop5 reports itself as an HDD
2016-07-16 19:40:38,198 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop6 reports itself as an HDD
2016-07-16 19:40:38,198 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop7 reports itself as an HDD
2016-07-16 19:40:38,207 [salt.utils.lazy  ][DEBUG   ][14064] LazyLoaded local_cache.clean_old_jobs
2016-07-16 19:40:38,219 [salt.utils.lazy  ][DEBUG   ][14064] Could not LazyLoad config.merge
2016-07-16 19:40:38,219 [salt.fileserver  ][DEBUG   ][14064] Updating roots fileserver cache
2016-07-16 19:40:38,219 [salt.utils.verify][DEBUG   ][14064] This salt-master instance has accepted 9 minion keys.
2016-07-16 19:40:39,821 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:44,825 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:49,830 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:54,834 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:40:54,848 [salt.master      ][TRACE   ][14152] AES payload received with command _return
2016-07-16 19:40:54,849 [salt.loaded.int.returner.local_cache][DEBUG   ][14152] Adding minions for job 20160716194054848306: ['europe-ireland-azure']
2016-07-16 19:40:54,849 [salt.utils.job   ][INFO    ][14152] Got return from europe-ireland-azure for job 20160716194054848306
2016-07-16 19:40:54,849 [salt.utils.event ][DEBUG   ][14152] Sending event - data = {'tgt_type': 'glob', 'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': '20160716194054848306', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_
alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48942, '_stamp': '2016-07-16T16:40:54.848946', 'arg': [{'connected': True, 'master': 'salt_master_host'}], 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 19:40:54,850 [salt.master      ][TRACE   ][14152] Master function call _return took 0.00127696990967 seconds
2016-07-16 19:40:54,850 [salt.utils.event ][TRACE   ][14128] get_event() received = {'tag': 'salt/job/20160716194054848306/ret/europe-ireland-azure', 'data': {'tgt_type': 'glob', 'fun_args': [{'master': 'salt_master_host', 'connected': True}], 'jid': '201607161940548483
06', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48942, '_stamp': '2016-07-16T16:40:54.848946', 'arg': [{'master': 'salt_master_host', 'connected': True}], 'fun': 'status.master', '
id': 'europe-ireland-azure'}}
2016-07-16 19:40:54,850 [salt.utils.reactor][DEBUG   ][14128] Gathering reactors for tag salt/job/20160716194054848306/ret/europe-ireland-azure
2016-07-16 19:40:59,854 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:04,858 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:09,863 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:14,867 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:19,871 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:24,828 [salt.master      ][TRACE   ][14158] AES payload received with command _return
2016-07-16 19:41:24,829 [salt.loaded.int.returner.local_cache][DEBUG   ][14158] Adding minions for job 20160716194124828721: ['europe-ireland-azure']
2016-07-16 19:41:24,829 [salt.utils.job   ][INFO    ][14158] Got return from europe-ireland-azure for job 20160716194124828721
2016-07-16 19:41:24,829 [salt.utils.event ][DEBUG   ][14158] Sending event - data = {'tgt_type': 'glob', 'fun_args': [{'connected': True, 'master': 'salt_master_host'}], 'jid': '20160716194124828721', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_
alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48947, '_stamp': '2016-07-16T16:41:24.829318', 'arg': [{'connected': True, 'master': 'salt_master_host'}], 'fun': 'status.master', 'id': 'europe-ireland-azure'}
2016-07-16 19:41:24,830 [salt.master      ][TRACE   ][14158] Master function call _return took 0.00117897987366 seconds
2016-07-16 19:41:24,830 [salt.utils.event ][TRACE   ][14128] get_event() received = {'tag': 'salt/job/20160716194124828721/ret/europe-ireland-azure', 'data': {'tgt_type': 'glob', 'fun_args': [{'master': 'salt_master_host', 'connected': True}], 'jid': '201607161941248287
21', 'return': None, 'retcode': 0, 'success': True, 'schedule': '__master_alive', 'tgt': 'europe-ireland-azure', 'cmd': '_return', 'pid': 48947, '_stamp': '2016-07-16T16:41:24.829318', 'arg': [{'master': 'salt_master_host', 'connected': True}], 'fun': 'status.master', '
id': 'europe-ireland-azure'}}
2016-07-16 19:41:24,830 [salt.utils.reactor][DEBUG   ][14128] Gathering reactors for tag salt/job/20160716194124828721/ret/europe-ireland-azure
2016-07-16 19:41:29,834 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:34,838 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:38,279 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:41:38,317 [salt.config      ][DEBUG   ][14064] Including configuration from '/etc/salt/minion.d/_schedule.conf'
2016-07-16 19:41:38,317 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/minion.d/_schedule.conf
2016-07-16 19:41:38,318 [salt.config      ][DEBUG   ][14064] Guessing ID. The id can be explicitly set in /etc/salt/minion
2016-07-16 19:41:38,326 [salt.config      ][INFO    ][14064] Found minion id from generate_minion_id(): salt_master_host
2016-07-16 19:41:38,327 [salt.config      ][DEBUG   ][14064] Reading configuration from /etc/salt/master
2016-07-16 19:41:38,366 [salt.loader      ][TRACE   ][14064] Loading core.hwaddr_interfaces grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.hostname grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.get_master grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.pythonversion grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.path grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.get_server_id grain
2016-07-16 19:41:38,373 [salt.loader      ][TRACE   ][14064] Loading core.ip6 grain
2016-07-16 19:41:38,379 [salt.loader      ][TRACE   ][14064] Loading core.ip4 grain
2016-07-16 19:41:38,386 [salt.loader      ][TRACE   ][14064] Loading core.saltversion grain
2016-07-16 19:41:38,386 [salt.loader      ][TRACE   ][14064] Loading core.saltpath grain
2016-07-16 19:41:38,386 [salt.loader      ][TRACE   ][14064] Loading core.pythonexecutable grain
2016-07-16 19:41:38,386 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip4 grain
2016-07-16 19:41:38,387 [salt.loader      ][TRACE   ][14064] Loading core.dns grain
2016-07-16 19:41:38,387 [salt.loader      ][TRACE   ][14064] Loading core.fqdn_ip6 grain
2016-07-16 19:41:38,387 [salt.loader      ][TRACE   ][14064] Loading core.ip6_interfaces grain
2016-07-16 19:41:38,387 [salt.loader      ][TRACE   ][14064] Loading core.ip4_interfaces grain
2016-07-16 19:41:38,387 [salt.loader      ][TRACE   ][14064] Loading core.append_domain grain
2016-07-16 19:41:38,388 [salt.loader      ][TRACE   ][14064] Loading core.os_data grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.zmqversion grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.saltversioninfo grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.pythonpath grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.id_ grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.locale_info grain
2016-07-16 19:41:38,425 [salt.loader      ][TRACE   ][14064] Loading core.get_machine_id grain
2016-07-16 19:41:38,426 [salt.loader      ][TRACE   ][14064] Loading core.ip_interfaces grain
2016-07-16 19:41:38,426 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sda reports itself as an HDD
2016-07-16 19:41:38,426 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdb reports itself as an HDD
2016-07-16 19:41:38,426 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdc reports itself as an SSD
2016-07-16 19:41:38,426 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdd reports itself as an SSD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sde reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdf reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdg reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdh reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdi reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdj reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdk reports itself as an HDD
2016-07-16 19:41:38,427 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sdl reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device sr0 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop0 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop1 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop2 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop3 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop4 reports itself as an HDD
2016-07-16 19:41:38,428 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop5 reports itself as an HDD
2016-07-16 19:41:38,429 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop6 reports itself as an HDD
2016-07-16 19:41:38,429 [salt.loaded.int.grains.disks][TRACE   ][14064] Device loop7 reports itself as an HDD
2016-07-16 19:41:38,437 [salt.utils.lazy  ][DEBUG   ][14064] LazyLoaded local_cache.clean_old_jobs
2016-07-16 19:41:38,448 [salt.utils.lazy  ][DEBUG   ][14064] Could not LazyLoad config.merge
2016-07-16 19:41:38,448 [salt.fileserver  ][DEBUG   ][14064] Updating roots fileserver cache
2016-07-16 19:41:38,448 [salt.utils.verify][DEBUG   ][14064] This salt-master instance has accepted 9 minion keys.
2016-07-16 19:41:39,842 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:44,847 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
2016-07-16 19:41:49,850 [salt.utils.event ][TRACE   ][14128] _get_event() waited 5 seconds and received nothing
d3xt3r01 commented 8 years ago

Tried setting tcp_keepalive: True on the master (I know it's true by default, but I thought I'd give it a shot) with no luck. The minion sees the connection with keepalive; the master does not. I'm sure no one is messing with the packets. I applied the same settings on all the minions; it's this Azure box that's giving me headaches.

# on master
tcp        0      0 x.x.1.95:4505       x.x.196.12:51714      ESTABLISHED 28096/python2.7      off (0.00/0/0)
tcp        0      0 x.x.1.95:4505       x.x.196.12:51645      ESTABLISHED 28096/python2.7      off (0.00/0/0)
# on minion
tcp        0      0 x.x.0.5:51714          x.x.171.49:4505      ESTABLISHED 51787/python         keepalive (12.51/0/2)
cachedout commented 8 years ago

We hear about this sort of problem inside Azure all the time. It sounds like they have some networking equipment that might be causing these sorts of issues. It's hard to pin this down to an issue with Salt, given that only one side thinks the TCP connection is still established.

d3xt3r01 commented 8 years ago

Any workarounds of some sort? In my case, only one minion is in Azure.

Can the master have keepalives, then? That should close the useless connections...

d3xt3r01 commented 8 years ago

Something like this on the master:

# in /usr/lib/python2.7/site-packages/salt
context = zmq.Context(1)
context.setsockopt(zmq.TCP_KEEPALIVE, 1)  # this line to be added

Of course, this should be configurable just like on the minion.
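
To make the suggestion concrete, here is a minimal sketch (illustrative only, not the actual master code path) of how pyzmq exposes the TCP keepalive options. The dictionary keys mirror the existing minion config names; the values are assumptions:

import zmq

# Hypothetical values, mirroring the minion's tcp_keepalive* config options.
opts = {
    'tcp_keepalive': 1,
    'tcp_keepalive_idle': 60,
    'tcp_keepalive_cnt': 3,
    'tcp_keepalive_intvl': 10,
}

context = zmq.Context(1)
pub = context.socket(zmq.PUB)  # the master's publisher listens on 4505
pub.setsockopt(zmq.TCP_KEEPALIVE, opts['tcp_keepalive'])
pub.setsockopt(zmq.TCP_KEEPALIVE_IDLE, opts['tcp_keepalive_idle'])
pub.setsockopt(zmq.TCP_KEEPALIVE_CNT, opts['tcp_keepalive_cnt'])
pub.setsockopt(zmq.TCP_KEEPALIVE_INTVL, opts['tcp_keepalive_intvl'])
pub.bind('tcp://0.0.0.0:4505')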

cachedout commented 8 years ago

I'm not sure why keepalives would make this problem any better. If an intermediate device is holding open the connection, wouldn't a keepalive only make this problem worse?

d3xt3r01 commented 8 years ago

At least the master would know to close the connection. Also, if it were configurable, we could lower the settings to something below the threshold at which Azure's NAT table entries expire...

Still, the keepalive option should be available on the master too, just for consistency.

d3xt3r01 commented 8 years ago

Also, I don't think an intermediate device is keeping the connection open. What I think happens is:

- The minion sends its keepalive too late; somewhere along the way the NAT table entry has already expired.
- The master never receives it, but since it has no keepalive on its side of the TCP connection, it doesn't notice whether the minion died, rebooted, or no longer exists.
- The minion sees that it can't ping the master and opens a new connection.
- Two connections now show up on the master.

I tested this by running tcpdump on the master against a socket that was supposed to be dead (but which the master still saw as alive). Nothing was going through it, so the master, without keepalive, had no way to know that the other end of the socket had died.

The cycle repeats and tons of connections pile up on the master.
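
A quick way to spot minions stuck in this reconnect cycle is to count ESTABLISHED sockets on the publish port per remote address. A minimal diagnostic sketch, assuming psutil is installed and the script runs on the master with root privileges (like the netstat loops above):

from collections import Counter

import psutil  # assumption: psutil is available; this is only a diagnostic aid

PUB_PORT = 4505  # salt master publish port

established = [
    c for c in psutil.net_connections(kind='tcp')
    if c.status == psutil.CONN_ESTABLISHED
    and c.laddr and c.raddr
    and c.laddr[1] == PUB_PORT
]

for ip, count in Counter(c.raddr[0] for c in established).most_common():
    if count > 1:
        print('%s holds %d connections to port %d (likely lingering)' % (ip, count, PUB_PORT))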

Ch3LL commented 8 years ago

@d3xt3r01 it looks like there are keepalive settings as documented here that you could configure in this case. Does that help your use case?

d3xt3r01 commented 8 years ago

This was fixed by doing:

tcp_keepalive: True
tcp_keepalive_idle: 60
master_alive_interval: 30

Can't tcp_keepalive be added to the master so the defaults would just work? Just for consistency.
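
For reference, these minion options map onto the standard TCP keepalive socket options that the kernel enforces. A minimal sketch of the equivalent at the plain-socket level (Linux-only constants; the probe interval and probe count below are illustrative, not taken from the config above):

import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)      # tcp_keepalive: True
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60)    # tcp_keepalive_idle: 60 (seconds idle before probing)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 10)   # seconds between probes (illustrative)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 3)      # failed probes before the kernel drops the connection (illustrative)
# Once the probes go unanswered, the kernel resets the socket instead of
# leaving it ESTABLISHED forever -- which is what the lingering master sockets lack.
s.close()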

UtahDave commented 8 years ago

@d3xt3r01 was the above set in the minion config or the master config?

d3xt3r01 commented 8 years ago

@UtahDave Hey! It's on the minions. The master doesn't have/support tcp_keepalive! :-(

stale[bot] commented 6 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

If this issue is closed prematurely, please leave a comment and we will gladly reopen the issue.