Maybe your top.sls is not valid YAML?
Try adding a colon at the end of 'G@osfinger:CentOS-5 and *data*'
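That is, the compound matcher has to end with a colon so YAML parses it as a mapping key, roughly like this (the match line is taken from the top file posted below):

    'G@osfinger:CentOS-5 and *data*':
      - match: compound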
good catch @roscoecairney
@Muzammilshus is the top file you pasted a direct copy of your actual top file, or is the missing ':' just a typo from copying it over?
Thanks, Daniel
So I applied your suggested change and got it to push with the top.sls below:
base:
  'group1':
    - match: nodegroup
    - directories
  'G@osfinger:CentOS-5 and *data*':
    - match: compound
    - directories
but now when I want to push all my packages I get the error below. I have also killed the salt-master and salt-minion services multiple times and cleared the salt master cache to try to fix it, but I keep getting this error:
base:
  'group1':
    - match: nodegroup
    - directories
    - cpjarshfiles
    - itp_prop
    - javaextract
  'G@osfinger:CentOS-5 and *data*':
    - match: compound
    - directories
    - cpjarshfiles
    - itp_prop
    - javaextract
[root@SaltMaster master]# salt -C 'G@osfinger:CentOS-5 and *data*' test.ping -l debug
[DEBUG ] Reading configuration from /etc/salt/master
[DEBUG ] Missing configuration file: /root/.saltrc
[DEBUG ] Configuration file path: /etc/salt/master
[WARNING ] Insecure logging configuration detected! Sensitive data may be logged.
[DEBUG ] Reading configuration from /etc/salt/master
[DEBUG ] Missing configuration file: /root/.saltrc
[DEBUG ] MasterEvent PUB socket URI: /var/run/salt/master/master_event_pub.ipc
[DEBUG ] MasterEvent PULL socket URI: /var/run/salt/master/master_event_pull.ipc
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/master', 'SaltMaster.bnd.corp_master', 'tcp://127.0.0.1:4506', 'clear')
[DEBUG ] Initializing new IPCClient for path: /var/run/salt/master/master_event_pub.ipc
[DEBUG ] LazyLoaded local_cache.get_load
[DEBUG ] Reading minion list from /var/cache/salt/master/jobs/f5/0ed095a8da99adbe6e536483bfc27545c62bdce6b299bd395868ed95193c0f/.minions.p
[DEBUG ] get_iter_returns for jid 20170201111615094768 sent to set(['21070fldata.bnd.corp']) will timeout at 11:16:20.104691
[DEBUG ] jid 20170201111615094768 return from 21070fldata.bnd.corp
[DEBUG ] LazyLoaded nested.output
21070fldata.bnd.corp:
    True
[DEBUG ] jid 20170201111615094768 found all minions set(['21070fldata.bnd.corp'])
[root@SaltMaster master]# salt -C 'G@osfinger:CentOS-5 and *data*' state.apply -l debug
[DEBUG ] Reading configuration from /etc/salt/master
[DEBUG ] Missing configuration file: /root/.saltrc
[DEBUG ] Configuration file path: /etc/salt/master
[WARNING ] Insecure logging configuration detected! Sensitive data may be logged.
[DEBUG ] Reading configuration from /etc/salt/master
[DEBUG ] Missing configuration file: /root/.saltrc
[DEBUG ] MasterEvent PUB socket URI: /var/run/salt/master/master_event_pub.ipc
[DEBUG ] MasterEvent PULL socket URI: /var/run/salt/master/master_event_pull.ipc
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/master', 'SaltMaster.bnd.corp_master', 'tcp://127.0.0.1:4506', 'clear')
[DEBUG ] Initializing new IPCClient for path: /var/run/salt/master/master_event_pub.ipc
[DEBUG ] LazyLoaded local_cache.get_load
[DEBUG ] Reading minion list from /var/cache/salt/master/jobs/55/7ac027121e1e480f611c53844e83ba75b30e64d50ce76cb52da2ec0297d287/.minions.p
[DEBUG ] get_iter_returns for jid 20170201111622182734 sent to set(['21070fldata.bnd.corp']) will timeout at 11:16:27.192665
[DEBUG ] jid 20170201111622182734 return from 21070fldata.bnd.corp
[DEBUG ] LazyLoaded highstate.output
21070fldata.bnd.corp:
    Data failed to compile:
----------
    The function "state.apply" is running as PID 22675 and was started at 2017, Feb 01 11:06:59.618911 with jid 20170201110659618911
[DEBUG ] jid 20170201111622182734 found all minions set(['21070fldata.bnd.corp'])
ERROR: Minions returned with non-zero exit code
Can you show the contents of these files?
/srv/salt/cpjarshfiles/init.sls
/srv/salt/itp_prop/init.sls
/srv/salt/javaextract/init.sls
Can you run it when salt-run jobs.active doesn't show any running state jobs on those minions?
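For example, something along these lines should list the conflicting job and, if needed, kill it (the jid is the one reported in the error above):

    salt-run jobs.active
    salt '21070fldata.bnd.corp' saltutil.running
    salt '21070fldata.bnd.corp' saltutil.kill_job 20170201110659618911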
Thanks, Daniel
@gtmanfred, I think that was the issue. Even though no jobs were running, I cleared the job cache on the minion and it ran, but it took 17 minutes for 5 minions. I was sending over a Java tarball of around 300 MB with JDK files. Is this normal?
How long does it take to transfer it without salt, using scp?
It should also cache the file on the minion so it shouldn't have to transfer it every time.
There should also be output showing how long each state ID took to run in the highstate output.
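For reference, a state along these lines (a guess at what javaextract/init.sls might contain; every name and path below is hypothetical, since the real file was never posted) would copy the tarball only when it changes on the master and re-extract only when the local copy changes:

    # hypothetical sketch of /srv/salt/javaextract/init.sls
    copy_jdk_tarball:
      file.managed:
        - name: /opt/jdk.tar.gz                  # destination on the minion (assumed)
        - source: salt://javaextract/jdk.tar.gz  # assumed location under /srv/salt

    extract_jdk:
      cmd.run:
        - name: tar -xzf /opt/jdk.tar.gz -C /opt
        - onchanges:
          - file: copy_jdk_tarball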
@gtmanfred around 20 minutes per minion with rsync, not scp. I'll do a test run tonight and post the results with Salt.
Description of Issue/Question
I'm new to Salt and I'm having difficulties targeting minions with grains. For some reason I keep getting errors when I push to the minions with
salt -N group1 state.apply
or
salt -C 'G@osfinger:CentOS-5 and *data*' state.apply
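For context, the -N group1 target relies on a nodegroups definition in /etc/salt/master along these lines (the actual definition wasn't posted, so the minion list below is purely illustrative):

    nodegroups:
      group1: 'L@21070fldata.bnd.corp,21010storage.bnd.corp'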
Setup
/srv/salt/directories/init.sls
/srv/salt/top.sls
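For illustration, a minimal directories/init.sls might look like this (the path below is hypothetical; the real file's contents aren't shown above):

    # hypothetical example of /srv/salt/directories/init.sls
    /opt/app/data:
      file.directory:
        - makedirs: True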
Also deleted the cache and the cached master pub key and restarted the services:
cd /var/cache/salt/master/; rm -rf *; service salt-minion restart
rm -rf /etc/salt/pki/minion/minion_master.pub; service salt-master restart
[root@SaltMaster minion]# salt --versions-report
Salt Version:
           Salt: 2016.11.2

Dependency Versions:
           cffi: Not Installed
       cherrypy: Not Installed
       dateutil: Not Installed
          gitdb: Not Installed
      gitpython: Not Installed
          ioflo: Not Installed
         Jinja2: 2.7.2
        libgit2: Not Installed
        libnacl: Not Installed
       M2Crypto: Not Installed
           Mako: Not Installed
   msgpack-pure: Not Installed
 msgpack-python: 0.4.6
   mysql-python: Not Installed
      pycparser: Not Installed
       pycrypto: 2.6.1
         pygit2: Not Installed
         Python: 2.7.5 (default, Nov 6 2016, 00:28:07)
   python-gnupg: Not Installed
         PyYAML: 3.11
          PyZMQ: 15.3.0
           RAET: Not Installed
          smmap: Not Installed
        timelib: Not Installed
        Tornado: 4.2.1
            ZMQ: 4.1.4

System Versions:
           dist: centos 7.3.1611 Core
        machine: x86_64
        release: 3.10.0-514.el7.x86_64
         system: Linux
        version: CentOS Linux 7.3.1611 Core
[root@21010storage minion]# rpm -qa | grep salt
salt-2015.8.7-1.el6.noarch
salt-minion-2015.8.7-1.el6.noarch
[root@21070fldata minion.d]# rpm -qa | grep salt
salt-2016.3.3-1.el5
salt-minion-2016.3.3-1.el5