kubernetes-sigs / kubespray

Deploy a Production Ready Kubernetes Cluster
Apache License 2.0

FAILED - RETRYING: Update package management cache (YUM) #2517

Closed · duckbill closed this issue 5 years ago

duckbill commented 6 years ago

Is this a BUG REPORT or FEATURE REQUEST? (choose one):

Environment:

Kubespray version (commit) (git rev-parse --short HEAD): 6ade7c0a8d39588ad47af9ee6826b8f293720796

Network plugin used: default

Copy of your inventory file:

# ## Configure 'ip' variable to bind kubernetes services on a
# ## different ip than the default iface
node1 ansible_ssh_host=192.168.1.105 ansible_user=root ip=192.168.1.105
# node2 ansible_ssh_host=95.54.0.13  # ip=10.3.0.2
# node3 ansible_ssh_host=95.54.0.14  # ip=10.3.0.3
# node4 ansible_ssh_host=95.54.0.15  # ip=10.3.0.4
# node5 ansible_ssh_host=95.54.0.16  # ip=10.3.0.5
# node6 ansible_ssh_host=95.54.0.17  # ip=10.3.0.6

# ## configure a bastion host if your nodes are not directly reachable
# bastion ansible_ssh_host=x.x.x.x

[kube-master]
node1
# node2

[etcd]
node1
# node2
# node3

[kube-node]
node1
# node3
# node4
# node5
# node6

[k8s-cluster:children]
kube-node
kube-master

Command used to invoke ansible: ansible-playbook -i inventory/inventory cluster.yml -b -v --private-key=~/.ssh/id_rsa

Output of ansible run:

Anything else do we need to know:

I want to install Kubernetes on a single node, and I had already run "ssh-copy-id -i root@node" (the node is 192.168.1.105, which is also localhost), yet it still failed at this task!
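A quick way to narrow this down is to confirm that Ansible can reach the node and become root, and that yum can rebuild its cache outside of kubespray. A minimal sketch, assuming the same inventory path and SSH key as the command above:

ansible all -i inventory/inventory -b --private-key=~/.ssh/id_rsa -m ping
ansible all -i inventory/inventory -b --private-key=~/.ssh/id_rsa -m command -a "yum makecache"

If the second command fails with the same error, the problem lies in the node's yum/repository configuration rather than in kubespray itself.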

priyasrinivas commented 6 years ago

Any updates on this? We are having the same issue.

mvernimmen commented 6 years ago

It takes many minutes for the yum cache update to fail. When I run Ansible (I've tried 2.5.x and 2.7.x) in debug mode, it shows the following output:

will be installed\n---> Package aalib.x86_64 0:1.4.0-0.22.rc5.el7 will be installed\n---> Package aalib-devel.x86_64 0:1.4.0-0.22.rc5.el7 will be installed\n---> Package aalib-libs.x86_64 0:1.4.0-0.22.rc5.el7 will be installed\n---> Package abattis-cantarell-fonts.noarch 0:0.0.25-1.el7 will be installed\n---> Package abc.x86_64 0:1.01-9.hg20160905.el7 will be installed\n---> Package abc-devel.x86_64 0:1.01-9.hg20160905.el7 will be installed\n---> Package abc-libs.x86_64 0:1.01-9.hg20160905.el7 will be installed\n---> Package abcde.noarch 0:2.5.4-3.el7 will be installed\n---> Package abduco.x86_64 0:0.6-1.el7 will be installed\n---> Package abi-compliance-checker.noarch 0:2.3-1.el7 will be installed\n---> Package abi-dumper.noarch 0:1.1-3.el7 will be installed\n---> Package abi-tracker.noarch 0:1.11-1.el7 will be installed\n---> Package abook.x86_64 0:0.6.1-2.el7 will be installed\n---> Package abrt.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-ccpp.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-kerneloops.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-pstoreoops.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-python.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-python3.noarch 0:2.1.11-49.el7 will be installed\n---> Package abrt-addon-upload-watch.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-vmcore.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-addon-xorg.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-cli.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-console-notification.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-dbus.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-desktop.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-devel.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-gui.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-gui-devel.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-gui-libs.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-java-connector.x86_64 0:1.0.6-12.el7 will be installed\n---> Package abrt-libs.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-python.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-python-doc.noarch 0:2.1.11-50.sl7 will be installed\n---> Package abrt-retrace-client.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package abrt-server-info-page.noarch 0:1.6-1.el7 will be installed\n---> Package abrt-tui.x86_64 0:2.1.11-50.sl7 will be installed\n---> Package accountsservice.x86_64 0:0.6.45-7.el7 will be installed\n---> Package accountsservice-devel.x86_64 0:0.6.45-7.el7 will be installed\n---> Package accountsservice-libs.x86_64 0:0.6.45-7.el7 will be installed\n---> Package ack.noarch 0:2.22-1.el7 will be installed\n---> Package acme-tiny.noarch 0:4.0.4-1.el7 will be installed\n---> 

<SNIP>

cmockery2\n--> Processing Conflict: libcmocka-1.1.1-0.el7.x86_64 conflicts cmockery2\n--> Processing Conflict: firebird-classic-2.5.7.27050.0-1.el7.x86_64 conflicts firebird-superclassic\n--> Processing Conflict: firebird-classic-2.5.7.27050.0-1.el7.x86_64 conflicts firebird-superclassic\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dmlite-plugins-adapter = 1.10.2\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dmlite-plugins-adapter = 1.10.2\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-rfio-server(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-rfio-server(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-perl(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-perl(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-devel(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm-devel(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm(x86-64)\n--> Processing Conflict: dmlite-dpmdisk-domeonly-1.10.2-1.el7.x86_64 conflicts dpm(x86-64)\n--> Processing Conflict: php-pecl-rrd-1.1.3-1.el7.x86_64 conflicts rrdtool-php\n--> Processing Conflict: php-pecl-rrd-1.1.3-1.el7.x86_64 conflicts rrdtool-php\n--> Processing Conflict: percona-xtrabackup-24-2.4.5-1.el7.x86_64 conflicts percona-xtrabackup\n--> Processing Conflict: percona-xtrabackup-24-2.4.5-1.el7.x86_64 conflicts percona-xtrabackup\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dmlite-plugins-adapter\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dmlite-plugins-adapter\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-srm-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-srm-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-name-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-name-server-mysql(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-devel(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-devel(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-rfio-server(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-rfio-server(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-perl(x86-64)\n--> Processing Conflict: dmlite-dpmhead-dome-0.8.8-1.el7.x86_64 conflicts dpm-perl(x86-64)\n--> Processing Conflict: clamav-data-0.99.4-1.el7.noarch conflicts data(clamav) < full\n--> Processing Conflict: clamav-data-0.99.4-1.el7.noarch conflicts data(clamav) < full\n--> Processing Conflict: kf5-krunner-devel-5.36.0-1.el7.x86_64 conflicts kapptemplate < 16.03.80\n--> Processing Conflict: bodhi-server-2.11.0-3.el7.noarch conflicts 
python-webob\n--> Processing Conflict: php-pecl-solr2-2.3.0-1.el7.x86_64 conflicts php-pecl-solr < 2\n--> Processing Conflict: php-pecl-solr2-2.3.0-1.el7.x86_64 conflicts php-pecl-solr < 2\n--> Processing Conflict: zabbix20-2.0.21-1.el7.x86_64 conflicts zabbix\n--> Processing Conflict: zabbix20-2.0.21-1.el7.x86_64 conflicts zabbix\n--> Processing Conflict: compat-qpid-cpp-client-rdma-0.24-19.el7.x86_64 conflicts qpid-cpp-client-rdma\n--> Processing Conflict: compat-qpid-cpp-client-rdma-0.24-19.el7.x86_64 conflicts qpid-cpp-client-rdma\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-192\n--> Processing Conflict: SL_enable_serialconsole-1152-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-192\n--> Processing Conflict: php-pecl-http1-devel-1.7.6-4.el7.x86_64 conflicts php-pecl-http-devel\n--> Processing Conflict: php-pecl-http1-devel-1.7.6-4.el7.x86_64 conflicts php-pecl-http-devel\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-1152\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-1152\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-192\n--> Processing Conflict: SL_enable_serialconsole-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-192\n--> Processing Conflict: compat-qpid-cpp-client-devel-0.24-19.el7.x86_64 conflicts qpid-cpp-client-devel\n--> Processing Conflict: compat-qpid-cpp-client-devel-0.24-19.el7.x86_64 conflicts qpid-cpp-client-devel\n--> Processing Conflict: php-pecl-http-2.5.6-1.el7.x86_64 conflicts php-pecl-http1\n--> Processing Conflict: php-pecl-http-2.5.6-1.el7.x86_64 conflicts php-pecl-http1\n--> Processing Conflict: php-pecl-http1-1.7.6-4.el7.x86_64 conflicts php-pecl-http\n--> Processing Conflict: php-pecl-http1-1.7.6-4.el7.x86_64 conflicts php-pecl-http\n--> Processing Conflict: php-pecl-http1-1.7.6-4.el7.x86_64 conflicts php-pecl-event\n--> Processing Conflict: php-pecl-http1-1.7.6-4.el7.x86_64 conflicts php-pecl-event\n--> Processing Conflict: tomcatjss-7.2.1-6.el7.noarch conflicts tomcat-native\n--> Processing Conflict: tomcatjss-7.2.1-6.el7.noarch conflicts tomcat-native\n--> Processing 
Conflict: libsodium13-devel-1.0.5-1.el7.x86_64 conflicts libsodium-devel\n--> Processing Conflict: libsodium13-devel-1.0.5-1.el7.x86_64 conflicts libsodium-devel\n--> Processing Conflict: compat-qpid-cpp-client-0.24-19.el7.x86_64 conflicts qpid-cpp-client\n--> Processing Conflict: compat-qpid-cpp-client-0.24-19.el7.x86_64 conflicts qpid-cpp-client\n--> Processing Conflict: compat-qpid-cpp-server-xml-0.24-19.el7.x86_64 conflicts qpid-cpp-server-xml\n--> Processing Conflict: compat-qpid-cpp-server-xml-0.24-19.el7.x86_64 conflicts qpid-cpp-server-xml\n--> Processing Conflict: firebird-classic-common-2.5.7.27050.0-1.el7.x86_64 conflicts firebird-superserver\n--> Processing Conflict: firebird-classic-common-2.5.7.27050.0-1.el7.x86_64 conflicts firebird-superserver\n--> Processing Conflict: compat-qpid-cpp-server-ha-0.24-19.el7.x86_64 conflicts qpid-cpp-server-ha\n--> Processing Conflict: compat-qpid-cpp-server-ha-0.24-19.el7.x86_64 conflicts qpid-cpp-server-ha\n--> Processing Conflict: php-pecl-json-post-1.0.0-2.el7.x86_64 conflicts php-pecl(http) < 2.4\n--> Processing Conflict: php-pecl-json-post-1.0.0-2.el7.x86_64 conflicts php-pecl(http) < 2.4\n--> Processing Conflict: radcli-compat-devel-1.2.9-1.el7.x86_64 conflicts radiusclient-ng-devel\n--> Processing Conflict: radcli-compat-devel-1.2.9-1.el7.x86_64 conflicts radiusclient-ng-devel\n--> Processing Conflict: compat-qpid-cpp-server-rdma-0.24-19.el7.x86_64 conflicts qpid-cpp-server-rdma\n--> Processing Conflict: compat-qpid-cpp-server-rdma-0.24-19.el7.x86_64 conflicts qpid-cpp-server-rdma\n--> Processing Conflict: php-pecl-http-devel-2.5.6-1.el7.x86_64 conflicts php-pecl-http1-devel\n--> Processing Conflict: php-pecl-http-devel-2.5.6-1.el7.x86_64 conflicts php-pecl-http1-devel\n--> Processing Conflict: php-mysql-5.4.16-45.el7.x86_64 conflicts php-mysqlnd\n--> Processing Conflict: qcint-1.8.6-1.el7.x86_64 conflicts libcint\n--> Processing Conflict: qcint-1.8.6-1.el7.x86_64 conflicts libcint\n--> Processing Conflict: php-pecl-gmagick-1.1.7-0.2.RC2.el7.x86_64 conflicts php-magickwand\n--> Processing Conflict: php-pecl-gmagick-1.1.7-0.2.RC2.el7.x86_64 conflicts php-magickwand\n--> Processing Conflict: php-pecl-gmagick-1.1.7-0.2.RC2.el7.x86_64 conflicts php-pecl-imagick\n--> Processing Conflict: php-pecl-gmagick-1.1.7-0.2.RC2.el7.x86_64 conflicts php-pecl-imagick\n--> Processing Conflict: compat-qpid-tools-0.24-19.el7.noarch conflicts qpid-tools\n--> Processing Conflict: compat-qpid-tools-0.24-19.el7.noarch conflicts qpid-tools\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-1152\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-1152\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-96\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole-384\n--> Processing Conflict: SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole\n--> Processing Conflict: 
SL_enable_serialconsole-192-4.2-2.sl7.noarch conflicts SL_enable_serialconsole\n--> Finished Dependency Resolution\n You could try using --skip-broken to work around the problem\n You could try running: rpm -Va --nofiles --nodigest\n"
    ],
    "retries": 5
}

When I run the sudo /usr/bin/python -tt /bin/repoquery --show-duplicates --plugins --quiet --disablerepo --enablerepo --qf %{epoch}:%{name}-%{version}-%{release}.%{arch} --whatprovides * command that Ansible appears to run during that step, I just get a long list of packages, with no mention of duplicates or problems. Since the module used at this step is /usr/local/Cellar/ansible/2.5.0/libexec/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py (on my machine), perhaps this isn't a problem with kubespray but with Ansible itself? Or would this be triggered by the way kubespray uses that module?

As a workaround I've commented out the update_cache task in ./roles/kubernetes/preinstall/tasks/main.yml.
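Rather than editing the role, the underlying repository problem can usually be reproduced directly on the affected node. A rough sketch (package-cleanup ships with yum-utils; which conflicts show up depends on the node's repo setup):

# Rebuild the yum metadata cache by hand and watch for the dependency errors quoted above
sudo yum clean all
sudo yum makecache

# Look for duplicate or conflicting package versions that break dependency resolution
sudo yum install -y yum-utils
sudo package-cleanup --dupes
sudo package-cleanup --problems

If these commands report the same conflicts, the fix belongs in the node's repository configuration, not in kubespray.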

jvleminc commented 5 years ago

The same issue happened to me on Ubuntu hosts; when I ran the apt cache update manually, the following error was thrown:

ubuntu@host0:~$ apt-get update
Reading package lists... Done
W: chmod 0700 of directory /var/lib/apt/lists/partial failed - SetupAPTPartialDirectory (1: Operation not permitted)
E: Could not open lock file /var/lib/apt/lists/lock - open (13: Permission denied)
E: Unable to lock directory /var/lib/apt/lists/
W: Problem unlinking the file /var/cache/apt/pkgcache.bin - RemoveCaches (13: Permission denied)
W: Problem unlinking the file /var/cache/apt/srcpkgcache.bin - RemoveCaches (13: Permission denied)

Running the same command on the node with sudo fixed this, and afterwards running kubespray again no longer threw the error.
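The permission errors above are exactly what apt prints when it is not run as root, so this usually means privilege escalation was not in effect for that task. A quick check, assuming the same inventory layout as the original command, is to verify that the remote user can sudo non-interactively and that the apt module succeeds with become enabled:

ansible all -i inventory/inventory -m command -a "sudo -n true"
ansible all -i inventory/inventory -b -m apt -a "update_cache=yes"

If the second command works, re-running the playbook with -b (as in the original invocation) should get past this task without touching the nodes by hand.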

BTW, the task has since moved in the kubespray file structure: it is now in /opt/kubespray/roles/kubernetes/preinstall/tasks/0070-system-packages.yml (called from /opt/kubespray/roles/kubernetes/preinstall/tasks/main.yml), section "Update package management cache (APT)".

adnavare commented 5 years ago

@jvleminc: Were you able to fix it without manually running with sudo on the target server? I am seeing the same issue while running the Ansible playbook from my controller node against node1, i.e. the target node running Ubuntu 18.04. The error is below:

labuser@10.54.30.43> (1, '\n{"msg": "Failed to update apt cache: ", "failed": true, "exception": "WARNING: The below traceback may not be related to the actual failure.\n File \"/tmp/ansible_apt_payload_XQxnkz/main.py\", line 1026, in main\n cache.update()\n File \"/usr/lib/python2.7/dist-packages/apt/cache.py\", line 548, in update\n raise FetchFailedException()\n", "invocation": {"module_args": {"dpkg_options": "force-confdef,force-confold", "autoremove": false, "force": false, "force_apt_get": false, "install_recommends": null, "package": null, "autoclean": false, "purge": false, "allow_unauthenticated": false, "state": "present", "upgrade": null, "update_cache": true, "default_release": null, "only_upgrade": false, "deb": null, "cache_valid_time": 3600}}}\n', '') The full traceback is: WARNING: The below traceback may not be related to the actual failure. File "/tmp/ansible_apt_payload_XQxnkz/main.py", line 1026, in main cache.update() File "/usr/lib/python2.7/dist-packages/apt/cache.py", line 548, in update raise FetchFailedException()

fatal: [node2]: FAILED! => { "changed": false, "invocation": { "module_args": { "allow_unauthenticated": false, "autoclean": false, "autoremove": false, "cache_valid_time": 3600, "deb": null, "default_release": null, "dpkg_options": "force-confdef,force-confold", "force": false, "force_apt_get": false, "install_recommends": null, "only_upgrade": false, "package": null, "purge": false, "state": "present", "update_cache": true, "upgrade": null } }, "msg": "Failed to update apt cache: "
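The FetchFailedException from python-apt only means that apt-get update failed on the node; the actual repository error is hidden behind Ansible's "Failed to update apt cache" message. A rough way to surface it, assuming the host names from this thread:

# From the controller: run the apt cache update ad hoc with full verbosity
ansible node2 -i inventory/inventory -b -m apt -a "update_cache=yes" -vvv

# Or directly on the target node
sudo apt-get update

The apt-get output will name the repository, proxy, or key that is failing, which is usually enough to fix the underlying fetch problem.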

jvleminc commented 5 years ago

@adnavare No, I fixed it by running "sudo apt-get update" on the nodes where the original command failed and afterwards I ran kubespray again without issues.

fejta-bot commented 5 years ago

Issues go stale after 90d of inactivity. Mark the issue as fresh with /remove-lifecycle stale. Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta. /lifecycle stale

fejta-bot commented 5 years ago

Stale issues rot after 30d of inactivity. Mark the issue as fresh with /remove-lifecycle rotten. Rotten issues close after an additional 30d of inactivity.

If this issue is safe to close now please do so with /close.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta. /lifecycle rotten

fejta-bot commented 5 years ago

Rotten issues close after 30d of inactivity. Reopen the issue with /reopen. Mark the issue as fresh with /remove-lifecycle rotten.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta. /close

k8s-ci-robot commented 5 years ago

@fejta-bot: Closing this issue.

In response to [this](https://github.com/kubernetes-sigs/kubespray/issues/2517#issuecomment-506582297):

>Rotten issues close after 30d of inactivity.
>Reopen the issue with `/reopen`.
>Mark the issue as fresh with `/remove-lifecycle rotten`.
>
>Send feedback to sig-testing, kubernetes/test-infra and/or [fejta](https://github.com/fejta).
>/close

Instructions for interacting with me using PR comments are available [here](https://git.k8s.io/community/contributors/guide/pull-requests.md). If you have questions or suggestions related to my behavior, please file an issue against the [kubernetes/test-infra](https://github.com/kubernetes/test-infra/issues/new?title=Prow%20issue:) repository.