saltstack / salt

Software to automate the management and configuration of any infrastructure or application at scale.

debian issue with service.running in 0.11.1 #3306

Closed: alienzrcoming closed this issue 11 years ago

alienzrcoming commented 11 years ago
salt --versions-report
           Salt: 0.11.1
         Python: 2.6.6 (r266:84292, Dec 26 2010, 22:31:48)
         Jinja2: 2.5.5
       M2Crypto: 0.20.1
 msgpack-python: 0.1.10
   msgpack-pure: not installed
       pycrypto: 2.1.0
         PyYAML: 3.09
          PyZMQ: 2.2.0

I upgraded my master and a few minions to 0.11.1 and have found that my service.running states have started bombing on the upgraded minions with an error like this:

----------
    State: - service
    Name:      haproxy-1.4.22
    Function:  running
        Result:    False
        Comment:   An exception occured in this state: Traceback (most recent call last):
  File "/usr/lib/python2.6/dist-packages/salt/state.py", line 1208, in call
    ret = self.states[cdata['full']](*cdata['args'])
  File "/usr/lib/python2.6/dist-packages/salt/states/service.py", line 358, in mod_watch
    if __salt__['service.status'](name, sig):
KeyError: 'service.status'

        Changes:   

These states have no problem succeeding on older minions.
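
The KeyError itself is mechanical: states look up execution functions in the __salt__ dict under '<module>.<function>' keys, so if the service module the loader picked for this minion defines no status() function, the key 'service.status' never exists. A minimal sketch of that mechanism (the loader internals here are illustrative, not salt's actual code):

# Illustrative only: a dict-based loader like __salt__. If the loaded
# service module has no status() function, the 'service.status' key is
# never registered, and any lookup raises exactly this KeyError.
__salt__ = {}

def load(virtualname, funcs):
    # register each module function under '<virtualname>.<name>'
    for fname, func in funcs.items():
        __salt__['{0}.{1}'.format(virtualname, fname)] = func

# a service module that defines start/stop but not status
load('service', {'start': lambda name: True, 'stop': lambda name: True})

__salt__['service.status']('haproxy-1.4.22')  # KeyError: 'service.status'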

Example state with service.running:

/etc/haproxy/haproxy.cfg-1.4.22:
  file.managed:
    - source:    salt://environments/test/web/content-cms/files/haproxy.cfg.jinja
    - user:      root
    - group:     root
    - mode:      644
    - watch_in: 
      - service: haproxy-1.4.22

haproxy-1.4.22:
  service.running:
    - require:
      - file:    /etc/haproxy/haproxy.cfg-1.4.22
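
Because the file state declares watch_in on the service, a change to the file makes the state system call the service state's mod_watch instead of plain running; the traceback above is raised on mod_watch's first lookup. Roughly (a simplified sketch following the traceback, not the exact 0.11.1 source; __salt__ is injected by salt at runtime):

# mod_watch runs when a watched state reports changes: restart the
# service if it is up, start it if it is not.
def mod_watch(name, sig=None, **kwargs):
    if __salt__['service.status'](name, sig):  # the failing lookup
        result = __salt__['service.restart'](name)
    else:
        result = __salt__['service.start'](name)
    return {'name': name,
            'changes': {name: result},
            'result': result,
            'comment': 'Service {0} updated'.format(name)}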

Example state run on a new minion:

salt-call state.sls environments.test.web.content-cms.haproxy
[INFO    ] Loaded configuration file: /etc/salt/minion
[INFO    ] Executing command 'ps -efH' in directory '/root'
[INFO    ] Loading fresh modules for state activity
[INFO    ] Fetching file 'salt://environments/test/web/content-cms/haproxy.sls'
[INFO    ] Executing state file.managed for /etc/haproxy/haproxy.cfg-1.4.22
[INFO    ] Fetching file 'salt://environments/test/web/content-cms/files/haproxy.cfg.jinja'
[INFO    ] File changed:
New file
[INFO    ] Executing state service.running for haproxy-1.4.22
[ERROR   ] No changes made for haproxy-1.4.22
[INFO    ] Executing state service.mod_watch for haproxy-1.4.22
[ERROR   ] No changes made for haproxy-1.4.22
local:
----------
    State: - file
    Name:      /etc/haproxy/haproxy.cfg-1.4.22
    Function:  managed
        Result:    True
        Comment:   File /etc/haproxy/haproxy.cfg-1.4.22 updated
        Changes:   diff: New file

----------
    State: - service
    Name:      haproxy-1.4.22
    Function:  running
        Result:    False
        Comment:   An exception occured in this state: Traceback (most recent call last):
  File "/usr/lib/python2.6/dist-packages/salt/state.py", line 1208, in call
    ret = self.states[cdata['full']](*cdata['args'])
  File "/usr/lib/python2.6/dist-packages/salt/states/service.py", line 358, in mod_watch
    if __salt__['service.status'](name, sig):
KeyError: 'service.status'

        Changes:   

Master & minions are running Debian squeeze/6.0.6 and 0.11.1 using madduck's salt packages:
ii  salt-common                         0.11.1-1~bpo60+1~madduck.1   shared libraries that salt requires for all packages
ii  salt-minion                         0.11.1-1~bpo60+1~madduck.1   client package for salt, the distributed remote execution system

master:
ii  salt-common                         0.11.1-1~bpo60+1~madduck.1   shared libraries that salt requires for all packages
ii  salt-master                         0.11.1-1~bpo60+1~madduck.1   remote manager to administer servers via salt

As a sanity check, I also wrote the service state like this, but the error persists:

haproxy-1.4.22:
  service:
    - running
    - require:
      - file:    /etc/haproxy/haproxy.cfg-1.4.22
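
One way to confirm what the minion's loader actually exposed is to list the loaded service functions from the minion itself, e.g. via the sys module (assuming it is available on this version):

salt-call sys.doc service

If service.status is absent from the output, the loader picked a service module that does not define it.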

Any ideas?

Thanks in advance.

Thomas Hatch replied:

Looks like a Debian specific issue, can you file a bug report and I will get this taken care of for 0.12.1
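
The Debian angle fits the traceback: on these minions the loader selects a Debian-specific service module, and if that module shipped without a status() function in 0.11.1, 'service.status' would be exactly the key that goes missing. The fix would then be to add one; a hedged sketch of its likely shape (modeled on salt's other init.d-based service modules, not the actual 0.12.1 patch; __salt__ is injected by the loader):

# Hypothetical status() for a Debian-style service module, so that
# __salt__['service.status'] resolves for states like the one above.
def status(name, sig=None):
    '''
    Return True if the named service is running, False otherwise.
    '''
    if sig:
        # match against a process signature instead of the init script
        return bool(__salt__['status.pid'](sig))
    # init scripts conventionally exit 0 from 'status' when running
    cmd = '/etc/init.d/{0} status'.format(name)
    return not __salt__['cmd.retcode'](cmd)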

thatch45 commented 11 years ago

Will fix!

thatch45 commented 11 years ago

This should be fixed in 0.12.1.

UtahDave commented 11 years ago

@alienzrcoming, can you verify that this is working in 0.12.1?

alienzrcoming commented 11 years ago

Yep, it is. Thanks!

UtahDave commented 11 years ago

Great! Thanks for letting us know.