Closed by carlcsaposs-canonical 1 year ago
This may be due to the use of the Python kubernetes-client in these tests: they require `~/.kube/config` (the same file `kubectl` uses) to access the cluster, and that file may be missing from the GCE environment.
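The lookup order suggested by the tracebacks in the log output can be sketched in shell (a simplified, hypothetical `resolve_kubeconfig` helper, not the libraries' actual API): try the in-cluster service-account token first, then fall back to `$KUBECONFIG` or `~/.kube/config`, and fail if neither exists.

```shell
# Simplified sketch of the kubeconfig lookup both client libraries appear to
# perform. resolve_kubeconfig is a hypothetical name for illustration only.
resolve_kubeconfig() {
    # 1. In-cluster service account (only present when running inside a pod).
    if [ -f /var/run/secrets/kubernetes.io/serviceaccount/token ]; then
        echo "in-cluster"
    # 2. Explicit $KUBECONFIG, else the conventional default path.
    elif [ -f "${KUBECONFIG:-$HOME/.kube/config}" ]; then
        echo "${KUBECONFIG:-$HOME/.kube/config}"
    else
        echo "Configuration file ~/.kube/config not found" >&2
        return 1
    fi
}
```

On a fresh GCE VM neither the service-account token nor `~/.kube/config` exists, so both branches fail and the tests error out before reaching the cluster.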
Thank you! I was able to get the tests to pass after running `microk8s config > ~/.kube/config`.
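The fix above can be wrapped in a small pre-flight guard (a sketch, not part of the repository) so it only exports a kubeconfig when one is actually missing:

```shell
# Pre-flight check (sketch): make sure a kubeconfig exists before running the
# integration tests, exporting one from MicroK8s when it is available.
KUBECONFIG_PATH="${KUBECONFIG:-$HOME/.kube/config}"
if [ -f "$KUBECONFIG_PATH" ]; then
    echo "kubeconfig already present at $KUBECONFIG_PATH"
elif command -v microk8s >/dev/null 2>&1; then
    mkdir -p "$(dirname "$KUBECONFIG_PATH")"
    microk8s config > "$KUBECONFIG_PATH"
    echo "kubeconfig exported to $KUBECONFIG_PATH"
else
    echo "no kubeconfig found and microk8s is not installed" >&2
fi
```

Adding a step like this to the cloud-init `runcmd` list would avoid the failure on fresh VMs.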
Bug Description
Failing integration tests
Environment
main@0d8ea2df70d96a9e242708fdcddb13fbedd2df10
Google Compute Engine Ubuntu 22.04
GCE environment:
```json
{
  "creationTimestamp": "2022-11-28T10:47:13.287-08:00",
  "description": "",
  "id": "3183457864939781342",
  "kind": "compute#instanceTemplate",
  "name": "dev-env",
  "properties": {
    "confidentialInstanceConfig": { "enableConfidentialCompute": false },
    "description": "",
    "scheduling": { "onHostMaintenance": "MIGRATE", "provisioningModel": "STANDARD", "automaticRestart": true, "instanceTerminationAction": "STOP", "maxRunDuration": { "seconds": "36000", "nanos": 0 }, "preemptible": false },
    "tags": {},
    "disks": [ { "type": "PERSISTENT", "deviceName": "instance-template-1", "autoDelete": true, "index": 0, "boot": true, "kind": "compute#attachedDisk", "mode": "READ_WRITE", "initializeParams": { "sourceImage": "projects/ubuntu-os-cloud/global/images/ubuntu-2204-jammy-v20221123", "diskType": "pd-ssd", "diskSizeGb": "50" } } ],
    "networkInterfaces": [ { "stackType": "IPV4_ONLY", "name": "nic0", "network": "projects/dev-env-369519/global/networks/default", "accessConfigs": [ { "name": "External NAT", "type": "ONE_TO_ONE_NAT", "kind": "compute#accessConfig", "networkTier": "PREMIUM" } ], "kind": "compute#networkInterface" } ],
    "reservationAffinity": { "consumeReservationType": "ANY_RESERVATION" },
    "canIpForward": false,
    "keyRevocationActionType": "NONE",
    "machineType": "n2-standard-4",
    "metadata": { "fingerprint": "rhj2YiywKp4=", "kind": "compute#metadata", "items": [ { "value": "#cloud-config\npackage_upgrade: true\npackages:\n - gnome-keyring\n - tox\nsnap:\n commands:\n - snap refresh\n - snap install juju --classic --channel=2.9/stable\n - snap install charmcraft --classic\n - snap install lxd\n - snap install microk8s --classic\n - snap alias microk8s.kubectl kubectl\n - snap install jhack\n - snap connect jhack:dot-local-share-juju snapd\nruncmd:\n - adduser ubuntu lxd\n - newgrp lxd\n - lxd init --auto\n - lxc network set lxdbr0 ipv6.address none\n - adduser ubuntu microk8s\n - newgrp microk8s\n - microk8s status --wait-ready\n - microk8s enable dns\n - kubectl rollout status -n kube-system -w --timeout=5m deployments/coredns\n - microk8s enable hostpath-storage\n - kubectl rollout status -n kube-system -w --timeout=5m deployments/hostpath-provisioner\n", "key": "user-data" }, { "value": "ubuntu:ssh-ed25519 [redacted] ubuntu", "key": "ssh-keys" } ] },
    "shieldedVmConfig": { "enableSecureBoot": false, "enableVtpm": true, "enableIntegrityMonitoring": true },
    "shieldedInstanceConfig": { "enableSecureBoot": false, "enableVtpm": true, "enableIntegrityMonitoring": true },
    "displayDevice": { "enableDisplay": false }
  },
  "selfLink": "projects/dev-env-369519/global/instanceTemplates/dev-env"
}
```

Cloud-init:
```yaml
#cloud-config
package_upgrade: true
packages:
 - gnome-keyring
 - tox
snap:
 commands:
 - snap refresh
 - snap install juju --classic --channel=2.9/stable
 - snap install charmcraft --classic
 - snap install lxd
 - snap install microk8s --classic
 - snap alias microk8s.kubectl kubectl
 - snap install jhack
 - snap connect jhack:dot-local-share-juju snapd
runcmd:
 - adduser ubuntu lxd
 - newgrp lxd
 - lxd init --auto
 - lxc network set lxdbr0 ipv6.address none
 - adduser ubuntu microk8s
 - newgrp microk8s
 - microk8s status --wait-ready
 - microk8s enable dns
 - kubectl rollout status -n kube-system -w --timeout=5m deployments/coredns
 - microk8s enable hostpath-storage
 - kubectl rollout status -n kube-system -w --timeout=5m deployments/hostpath-provisioner
```

Commands run as `ubuntu` user after cloud-init:
```shell
juju bootstrap microk8s micro --agent-version=2.9.29
juju bootstrap localhost lxd --agent-version=2.9.29
juju model-defaults logging-config='
```

Relevant log output
`tox -e integration-replication`
```shell
ubuntu@dev-env-1:~/repos/mysql-k8s-operator$ tox -e integration-replication
integration-replication installed: anyio==3.6.2,asttokens==2.2.0,attrs==22.1.0,backcall==0.2.0,bcrypt==4.0.1,cachetools==5.2.0,certifi==2022.9.24,cffi==1.15.1,charset-normalizer==2.1.1,cryptography==38.0.2,decorator==5.1.1,exceptiongroup==1.0.4,executing==1.2.0,google-auth==2.14.1,h11==0.14.0,httpcore==0.16.2,httpx==0.23.1,idna==3.4,iniconfig==1.1.1,ipdb==0.13.9,ipython==8.7.0,jedi==0.18.2,Jinja2==3.1.2,jsonschema==4.16.0,juju==2.9.11,jujubundlelib==0.5.7,kubernetes==25.3.0,lightkube==0.11.0,lightkube-models==1.24.1.4,macaroonbakery==1.3.1,MarkupSafe==2.1.1,matplotlib-inline==0.1.6,mypy-extensions==0.4.3,mysql-connector-python==8.0.31,oauthlib==3.2.2,ops==1.5.4,packaging==21.3,paramiko==2.12.0,parso==0.8.3,pexpect==4.8.0,pickleshare==0.7.5,pluggy==1.0.0,prompt-toolkit==3.0.33,protobuf==3.20.1,ptyprocess==0.7.0,pure-eval==0.2.2,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycparser==2.21,Pygments==2.13.0,pymacaroons==0.13.0,PyNaCl==1.5.0,pyparsing==3.0.9,pyRFC3339==1.1,pyrsistent==0.19.2,pytest==7.2.0,pytest-asyncio==0.20.2,pytest-operator==0.22.0,pytest-order==1.0.1,python-dateutil==2.8.2,pytz==2022.6,PyYAML==6.0,requests==2.28.1,requests-oauthlib==1.3.1,rfc3986==1.5.0,rsa==4.9,six==1.16.0,sniffio==1.3.0,stack-data==0.6.2,tenacity==8.0.1,theblues==0.5.2,toml==0.10.2,tomli==2.0.1,toposort==1.7,traitlets==5.6.0,typing-inspect==0.8.0,typing_extensions==4.4.0,urllib3==1.26.13,wcwidth==0.2.5,websocket-client==1.4.2,websockets==10.4
integration-replication run-test-pre: PYTHONHASHSEED='556451204'
integration-replication run-test: commands[0] | pytest -v --tb native --ignore=/home/ubuntu/repos/mysql-k8s-operator/tests/unit --log-cli-level=INFO -s -m replication_tests
=========================== test session starts ============================
platform linux -- Python 3.10.6, pytest-7.2.0, pluggy-1.0.0 -- /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/bin/python
cachedir: .tox/integration-replication/.pytest_cache
rootdir: /home/ubuntu/repos/mysql-k8s-operator, configfile: pyproject.toml
plugins: operator-0.22.0, order-1.0.1, anyio-3.6.2, asyncio-0.20.2
asyncio: mode=auto
collected 25 items / 20 deselected / 5 selected

tests/integration/high_availability/test_replication.py::test_build_and_deploy
------------------------------ live log setup -----------------------------
INFO pytest_operator.plugin:plugin.py:625 Adding model micro:test-replication-zhcx on cloud microk8s
------------------------------ live log call ------------------------------
INFO pytest_operator.plugin:plugin.py:504 Using tmp_path: /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/tmp/pytest/test-replication-zhcx0
INFO pytest_operator.plugin:plugin.py:948 Building charm mysql-k8s
INFO pytest_operator.plugin:plugin.py:953 Built charm mysql-k8s in 47.81s
INFO juju.model:model.py:1924 Deploying local:focal/mysql-k8s-0
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [allocating] waiting: installing agent mysql-k8s/1 [allocating] waiting: installing agent mysql-k8s/2 [allocating] waiting: installing agent
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [executing] maintenance: Initialising mysqld mysql-k8s/1 [idle] maintenance: Initialising mysqld mysql-k8s/2 [idle] maintenance: Initialising mysqld
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [idle] active: mysql-k8s/1 [executing] active: mysql-k8s/2 [idle] waiting: Waiting for instance to join the cluster
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [idle] active: mysql-k8s/1 [idle] active: mysql-k8s/2 [idle] active:
INFO pytest_operator.plugin:plugin.py:504 Using tmp_path: /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/tmp/pytest/test-replication-zhcx0
INFO pytest_operator.plugin:plugin.py:948 Building charm application
INFO pytest_operator.plugin:plugin.py:953 Built charm application in 15.86s
INFO juju.model:model.py:1924 Deploying local:focal/application-0
INFO juju.model:model.py:2526 Waiting for model: application/0 [allocating] waiting: installing agent
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [executing] active: mysql-k8s/1 [executing] active: mysql-k8s/2 [executing] active: application/0 [executing] waiting:
PASSED
tests/integration/high_availability/test_replication.py::test_check_consistency PASSED
tests/integration/high_availability/test_replication.py::test_no_replication_across_clusters
------------------------------ live log call ------------------------------
INFO juju.model:model.py:1924 Deploying local:focal/mysql-k8s-1
INFO juju.model:model.py:2526 Waiting for model: another-mysql/0 [allocating] waiting: installing agent another-mysql/1 [allocating] waiting: installing agent another-mysql/2 [allocating] waiting: installing agent
INFO juju.model:model.py:2526 Waiting for model: another-mysql/0 [executing] maintenance: Initialising mysqld another-mysql/1 [executing] maintenance: Initialising mysqld another-mysql/2 [executing] maintenance: Initialising mysqld
INFO juju.model:model.py:2526 Waiting for model: another-mysql/0 [executing] active: another-mysql/1 [idle] waiting: Waiting for instance to join the cluster another-mysql/2 [idle] waiting: Waiting for instance to join the cluster
INFO juju.model:model.py:2526 Waiting for model: another-mysql/0 [idle] active: another-mysql/1 [executing] active: another-mysql/2 [executing] active:
PASSED
tests/integration/high_availability/test_replication.py::test_scaling_without_data_loss
------------------------------ live log call ------------------------------
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s (waiting for exactly 4 units, current : 3)
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [idle] active: mysql-k8s/1 [idle] active: mysql-k8s/2 [idle] active: mysql-k8s/3 [executing] waiting: Waiting for instance to join the cluster
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [idle] active: mysql-k8s/1 [idle] active: mysql-k8s/2 [idle] active: mysql-k8s/3 [idle] active:
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s (waiting for exactly 3 units, current : 4)
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s (waiting for exactly 3 units, current : 4)
PASSED
tests/integration/high_availability/test_replication.py::test_kill_primary_check_reelection FAILED
---------------------------- live log teardown ----------------------------
INFO pytest_operator.plugin:plugin.py:768 Model status:
Model                  Controller  Cloud/Region        Version  SLA          Timestamp
test-replication-zhcx  micro       microk8s/localhost  2.9.29   unsupported  16:58:50Z

App            Version                  Status  Scale  Charm        Channel  Rev  Address         Exposed  Message
another-mysql  8.0.31-0ubuntu0.22.04.1  active      0  mysql-k8s             1    10.152.183.119  no
application                             active      1  application           0    10.152.183.23   no
mysql-k8s      8.0.31-0ubuntu0.22.04.1  active      3  mysql-k8s             0    10.152.183.46   no

Unit             Workload    Agent      Address      Ports  Message
another-mysql/0  terminated  executing  10.1.61.173         (remove)
another-mysql/1  terminated  executing  10.1.61.157         (remove)
another-mysql/2  terminated  executing  10.1.61.166         (remove)
application/0*   active      idle       10.1.61.143
mysql-k8s/0      active      idle       10.1.61.151
mysql-k8s/1*     active      idle       10.1.61.154
mysql-k8s/2      active      idle       10.1.61.162

INFO pytest_operator.plugin:plugin.py:774 Juju error logs:
controller-0: 16:56:01 ERROR juju.worker.caasapplicationprovisioner.runner exited "another-mysql": getting OCI image resources: unable to fetch OCI image resources for another-mysql: application "another-mysql" dying or dead
controller-0: 16:56:19 ERROR juju.worker.caasapplicationprovisioner.runner exited "mysql-k8s": Operation cannot be fulfilled on pods "mysql-k8s-3": the object has been modified; please apply your changes to the latest version and try again
INFO pytest_operator.plugin:plugin.py:793 juju-crashdump command was not found.
INFO pytest_operator.plugin:plugin.py:861 Resetting model test-replication-zhcx...
INFO pytest_operator.plugin:plugin.py:850 Destroying applications mysql-k8s
INFO pytest_operator.plugin:plugin.py:850 Destroying applications application
INFO pytest_operator.plugin:plugin.py:850 Destroying applications another-mysql
INFO pytest_operator.plugin:plugin.py:866 Not waiting on reset to complete.
INFO pytest_operator.plugin:plugin.py:839 Forgetting main...
================================= FAILURES =================================
___________________ test_kill_primary_check_reelection _____________________
Traceback (most recent call last):
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/config/kubeconfig.py", line 175, in from_service_account
    token = account_dir.joinpath("token").read_text()
  File "/usr/lib/python3.10/pathlib.py", line 1134, in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
  File "/usr/lib/python3.10/pathlib.py", line 1119, in open
    return self._accessor.open(self, mode, buffering, encoding, errors,
FileNotFoundError: [Errno 2] No such file or directory: '/var/run/secrets/kubernetes.io/serviceaccount/token'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/config/kubeconfig.py", line 210, in from_env
    return KubeConfig.from_service_account(service_account=service_account)
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/config/kubeconfig.py", line 178, in from_service_account
    raise exceptions.ConfigError(str(e))
lightkube.core.exceptions.ConfigError: [Errno 2] No such file or directory: '/var/run/secrets/kubernetes.io/serviceaccount/token'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/repos/mysql-k8s-operator/tests/integration/high_availability/test_replication.py", line 58, in test_kill_primary_check_reelection
    client = lightkube.Client()
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/core/client.py", line 43, in __init__
    self._client = GenericSyncClient(config, namespace=namespace, timeout=timeout, lazy=lazy,
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/core/generic_client.py", line 82, in __init__
    config = KubeConfig.from_env().get()
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/config/kubeconfig.py", line 212, in from_env
    return KubeConfig.from_file(os.environ.get('KUBECONFIG', default_config))
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/lib/python3.10/site-packages/lightkube/config/kubeconfig.py", line 142, in from_file
    raise exceptions.ConfigError(f"Configuration file {fname} not found")
lightkube.core.exceptions.ConfigError: Configuration file ~/.kube/config not found
--------------------------- Captured log teardown --------------------------
INFO pytest_operator.plugin:plugin.py:768 Model status:
Model                  Controller  Cloud/Region        Version  SLA          Timestamp
test-replication-zhcx  micro       microk8s/localhost  2.9.29   unsupported  16:58:50Z

App            Version                  Status  Scale  Charm        Channel  Rev  Address         Exposed  Message
another-mysql  8.0.31-0ubuntu0.22.04.1  active      0  mysql-k8s             1    10.152.183.119  no
application                             active      1  application           0    10.152.183.23   no
mysql-k8s      8.0.31-0ubuntu0.22.04.1  active      3  mysql-k8s             0    10.152.183.46   no

Unit             Workload    Agent      Address      Ports  Message
another-mysql/0  terminated  executing  10.1.61.173         (remove)
another-mysql/1  terminated  executing  10.1.61.157         (remove)
another-mysql/2  terminated  executing  10.1.61.166         (remove)
application/0*   active      idle       10.1.61.143
mysql-k8s/0      active      idle       10.1.61.151
mysql-k8s/1*     active      idle       10.1.61.154
mysql-k8s/2      active      idle       10.1.61.162

INFO pytest_operator.plugin:plugin.py:774 Juju error logs:
controller-0: 16:56:01 ERROR juju.worker.caasapplicationprovisioner.runner exited "another-mysql": getting OCI image resources: unable to fetch OCI image resources for another-mysql: application "another-mysql" dying or dead
controller-0: 16:56:19 ERROR juju.worker.caasapplicationprovisioner.runner exited "mysql-k8s": Operation cannot be fulfilled on pods "mysql-k8s-3": the object has been modified; please apply your changes to the latest version and try again
INFO pytest_operator.plugin:plugin.py:793 juju-crashdump command was not found.
INFO pytest_operator.plugin:plugin.py:861 Resetting model test-replication-zhcx...
INFO pytest_operator.plugin:plugin.py:850 Destroying applications mysql-k8s
INFO pytest_operator.plugin:plugin.py:850 Destroying applications application
INFO pytest_operator.plugin:plugin.py:850 Destroying applications another-mysql
INFO pytest_operator.plugin:plugin.py:866 Not waiting on reset to complete.
INFO pytest_operator.plugin:plugin.py:839 Forgetting main...
============================= warnings summary =============================
tests/integration/relations/test_osm_mysql.py:30
  /home/ubuntu/repos/mysql-k8s-operator/tests/integration/relations/test_osm_mysql.py:30: PytestUnknownMarkWarning: Unknown pytest.mark.osm_mysql_tests - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.osm_mysql_tests
tests/integration/relations/test_osm_mysql.py:139
  /home/ubuntu/repos/mysql-k8s-operator/tests/integration/relations/test_osm_mysql.py:139: PytestUnknownMarkWarning: Unknown pytest.mark.osm_mysql_tests - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.osm_mysql_tests
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================= short test summary info ==========================
FAILED tests/integration/high_availability/test_replication.py::test_kill_primary_check_reelection - lightkube.core.exceptions.ConfigError: Configuration file ~/.kube/config not found
========= 1 failed, 4 passed, 20 deselected, 2 warnings in 529.34s (0:08:49) =========
ERROR: InvocationError for command /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-replication/bin/pytest -v --tb native --ignore=/home/ubuntu/repos/mysql-k8s-operator/tests/unit --log-cli-level=INFO -s -m replication_tests (exited with code 1)
_________________________________ summary __________________________________
ERROR: integration-replication: commands failed
```

`tox -e integration-self-healing`
```shell
ubuntu@dev-env-1:~/repos/mysql-k8s-operator$ tox -e integration-self-healing
integration-self-healing create: /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing
integration-self-healing installdeps: juju==2.9.11, kubernetes, lightkube, mysql-connector-python, pytest, pytest-operator, pytest-order, -r/home/ubuntu/repos/mysql-k8s-operator/requirements.txt
integration-self-healing installed: anyio==3.6.2,asttokens==2.2.0,attrs==22.1.0,backcall==0.2.0,bcrypt==4.0.1,cachetools==5.2.0,certifi==2022.9.24,cffi==1.15.1,charset-normalizer==2.1.1,cryptography==38.0.2,decorator==5.1.1,exceptiongroup==1.0.4,executing==1.2.0,google-auth==2.14.1,h11==0.14.0,httpcore==0.16.2,httpx==0.23.1,idna==3.4,iniconfig==1.1.1,ipdb==0.13.9,ipython==8.7.0,jedi==0.18.2,Jinja2==3.1.2,jsonschema==4.16.0,juju==2.9.11,jujubundlelib==0.5.7,kubernetes==25.3.0,lightkube==0.11.0,lightkube-models==1.24.1.4,macaroonbakery==1.3.1,MarkupSafe==2.1.1,matplotlib-inline==0.1.6,mypy-extensions==0.4.3,mysql-connector-python==8.0.31,oauthlib==3.2.2,ops==1.5.4,packaging==21.3,paramiko==2.12.0,parso==0.8.3,pexpect==4.8.0,pickleshare==0.7.5,pluggy==1.0.0,prompt-toolkit==3.0.33,protobuf==3.20.1,ptyprocess==0.7.0,pure-eval==0.2.2,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycparser==2.21,Pygments==2.13.0,pymacaroons==0.13.0,PyNaCl==1.5.0,pyparsing==3.0.9,pyRFC3339==1.1,pyrsistent==0.19.2,pytest==7.2.0,pytest-asyncio==0.20.2,pytest-operator==0.22.0,pytest-order==1.0.1,python-dateutil==2.8.2,pytz==2022.6,PyYAML==6.0,requests==2.28.1,requests-oauthlib==1.3.1,rfc3986==1.5.0,rsa==4.9,six==1.16.0,sniffio==1.3.0,stack-data==0.6.2,tenacity==8.0.1,theblues==0.5.2,toml==0.10.2,tomli==2.0.1,toposort==1.7,traitlets==5.6.0,typing-inspect==0.8.0,typing_extensions==4.4.0,urllib3==1.26.13,wcwidth==0.2.5,websocket-client==1.4.2,websockets==10.4
integration-self-healing run-test-pre: PYTHONHASHSEED='258991560'
integration-self-healing run-test: commands[0] | pytest -v --tb native --ignore=/home/ubuntu/repos/mysql-k8s-operator/tests/unit --log-cli-level=INFO -s -m self_healing_tests
=========================== test session starts ============================
platform linux -- Python 3.10.6, pytest-7.2.0, pluggy-1.0.0 -- /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/bin/python
cachedir: .tox/integration-self-healing/.pytest_cache
rootdir: /home/ubuntu/repos/mysql-k8s-operator, configfile: pyproject.toml
plugins: operator-0.22.0, order-1.0.1, anyio-3.6.2, asyncio-0.20.2
asyncio: mode=auto
collected 25 items / 21 deselected / 4 selected

tests/integration/high_availability/test_self_healing.py::test_build_and_deploy
------------------------------ live log setup -----------------------------
INFO pytest_operator.plugin:plugin.py:625 Adding model micro:test-self-healing-zom0 on cloud microk8s
------------------------------ live log call ------------------------------
INFO pytest_operator.plugin:plugin.py:504 Using tmp_path: /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/tmp/pytest/test-self-healing-zom00
INFO pytest_operator.plugin:plugin.py:948 Building charm mysql-k8s
INFO pytest_operator.plugin:plugin.py:953 Built charm mysql-k8s in 47.22s
INFO juju.model:model.py:1924 Deploying local:focal/mysql-k8s-0
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [allocating] waiting: installing agent mysql-k8s/1 [allocating] waiting: installing agent mysql-k8s/2 [allocating] waiting: installing agent
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [executing] maintenance: Initialising mysqld mysql-k8s/1 [idle] maintenance: Initialising mysqld mysql-k8s/2 [idle] maintenance: Initialising mysqld
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [executing] active: mysql-k8s/1 [idle] active: mysql-k8s/2 [idle] waiting: Waiting for instance to join the cluster
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [idle] active: mysql-k8s/1 [idle] active: mysql-k8s/2 [idle] active:
INFO pytest_operator.plugin:plugin.py:504 Using tmp_path: /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/tmp/pytest/test-self-healing-zom00
INFO pytest_operator.plugin:plugin.py:948 Building charm application
INFO pytest_operator.plugin:plugin.py:953 Built charm application in 37.79s
INFO juju.model:model.py:1924 Deploying local:focal/application-0
INFO juju.model:model.py:2526 Waiting for model: application/0 [allocating] waiting: installing agent
INFO juju.model:model.py:2526 Waiting for model: mysql-k8s/0 [executing] active: mysql-k8s/1 [executing] active: mysql-k8s/2 [executing] active: application/0 [executing] waiting:
PASSED
tests/integration/high_availability/test_self_healing.py::test_kill_db_process FAILED
tests/integration/high_availability/test_self_healing.py::test_freeze_db_process XFAIL (aborted)
tests/integration/high_availability/test_self_healing.py::test_graceful_crash_of_primary XFAIL (aborted)
---------------------------- live log teardown ----------------------------
INFO pytest_operator.plugin:plugin.py:768 Model status:
Model                   Controller  Cloud/Region        Version  SLA          Timestamp
test-self-healing-zom0  micro       microk8s/localhost  2.9.29   unsupported  17:57:13Z

App          Version                  Status  Scale  Charm        Channel  Rev  Address         Exposed  Message
application                           active      1  application           0    10.152.183.216  no
mysql-k8s    8.0.31-0ubuntu0.22.04.1  active      3  mysql-k8s             0    10.152.183.251  no

Unit           Workload  Agent      Address      Ports  Message
application/0* active    executing  10.1.61.170
mysql-k8s/0*   active    idle       10.1.61.156
mysql-k8s/1    active    idle       10.1.61.163
mysql-k8s/2    active    idle       10.1.61.172

INFO pytest_operator.plugin:plugin.py:774 Juju error logs:
INFO pytest_operator.plugin:plugin.py:793 juju-crashdump command was not found.
INFO pytest_operator.plugin:plugin.py:861 Resetting model test-self-healing-zom0...
INFO pytest_operator.plugin:plugin.py:850 Destroying applications mysql-k8s
INFO pytest_operator.plugin:plugin.py:850 Destroying applications application
INFO pytest_operator.plugin:plugin.py:866 Not waiting on reset to complete.
INFO pytest_operator.plugin:plugin.py:839 Forgetting main...
================================= FAILURES =================================
__________________________ test_kill_db_process ____________________________
Traceback (most recent call last):
  File "/home/ubuntu/repos/mysql-k8s-operator/tests/integration/high_availability/test_self_healing.py", line 57, in test_kill_db_process
    await send_signal_to_pod_container_process(
  File "/home/ubuntu/repos/mysql-k8s-operator/tests/integration/high_availability/high_availability_helpers.py", line 257, in send_signal_to_pod_container_process
    kubernetes.config.load_kube_config()
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/lib/python3.10/site-packages/kubernetes/config/kube_config.py", line 813, in load_kube_config
    loader = _get_kube_config_loader(
  File "/home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/lib/python3.10/site-packages/kubernetes/config/kube_config.py", line 770, in _get_kube_config_loader
    raise ConfigException(
kubernetes.config.config_exception.ConfigException: Invalid kube-config file. No configuration found.
============================= warnings summary =============================
tests/integration/relations/test_osm_mysql.py:30
  /home/ubuntu/repos/mysql-k8s-operator/tests/integration/relations/test_osm_mysql.py:30: PytestUnknownMarkWarning: Unknown pytest.mark.osm_mysql_tests - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.osm_mysql_tests
tests/integration/relations/test_osm_mysql.py:139
  /home/ubuntu/repos/mysql-k8s-operator/tests/integration/relations/test_osm_mysql.py:139: PytestUnknownMarkWarning: Unknown pytest.mark.osm_mysql_tests - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.osm_mysql_tests
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================= short test summary info ==========================
FAILED tests/integration/high_availability/test_self_healing.py::test_kill_db_process - kubernetes.config.config_exception.ConfigException: Invalid kube-config file. No configuration found.
===== 1 failed, 1 passed, 21 deselected, 2 xfailed, 2 warnings in 265.28s (0:04:25) =====
ERROR: InvocationError for command /home/ubuntu/repos/mysql-k8s-operator/.tox/integration-self-healing/bin/pytest -v --tb native --ignore=/home/ubuntu/repos/mysql-k8s-operator/tests/unit --log-cli-level=INFO -s -m self_healing_tests (exited with code 1)
_________________________________ summary __________________________________
ERROR: integration-self-healing: commands failed
```