Closed: gustavosr98 closed this issue 8 months ago.
I tried again with a Juju bundle rather than from the Juju CLI:

```yaml
bundle: kubernetes
name: kubeflow
applications:
  kfp-db:
    charm: ./mysql-k8s_1852a95.charm
    scale: 1
    trust: true
    constraints: mem=2G
    resources:
      mysql-image: 701143232170.dkr.ecr.eu-west-1.amazonaws.com/canonical/charmed-mysql:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1
```
Same result -> https://pastebin.ubuntu.com/p/G3QZVjMdXK/
Unable to reproduce with juju 2.9.45, MicroK8s v1.27.5 (revision 5891):

```shell
$ juju download mysql-k8s --channel 8.0/stable
Series "focal" is not supported for charm "mysql-k8s", trying series "jammy"
Fetching charm "mysql-k8s" using "8.0/stable" channel and base "amd64/ubuntu/22.04"
Install the "mysql-k8s" charm with:
    juju deploy ./mysql-k8s_1852a95.charm

$ juju deploy ./mysql-k8s_1852a95.charm --resource mysql-image=ghcr.io/canonical/charmed-mysql@sha256:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1
Located local charm "mysql-k8s", revision 0
Deploying "mysql-k8s" from local charm "mysql-k8s", revision 0 on jammy
```
The contents of manifest.yaml in mysql-k8s_1852a95.charm show that the charm was built on 22.04 (so `juju download` is probably working correctly):

```yaml
analysis:
  attributes:
  - name: language
    result: python
  - name: framework
    result: operator
bases:
- architectures:
  - amd64
  channel: '22.04'
  name: ubuntu
charmcraft-started-at: '2023-09-01T11:54:53.671520Z'
charmcraft-version: 2.3.0
```
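As a side note, a packed `.charm` file is just a zip archive, so the build base can be checked locally without deploying. A minimal sketch: `demo.charm` below is a stand-in created on the fly for illustration; substitute the path to the real mysql-k8s_1852a95.charm.

```shell
# A .charm file is a plain zip archive, so manifest.yaml can be read directly.
# demo.charm is a stand-in built here for illustration; replace it with the
# real mysql-k8s_1852a95.charm path to inspect the actual charm.
printf 'bases:\n- name: ubuntu\n  channel: "22.04"\n  architectures:\n  - amd64\n' > manifest.yaml
python3 -m zipfile -c demo.charm manifest.yaml
rm manifest.yaml
# Print the build-base information recorded at charmcraft pack time:
python3 -c "import zipfile; print(zipfile.ZipFile('demo.charm').read('manifest.yaml').decode())"
```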
@gustavosr98 What's the ubuntu version of the host machine (i.e. the machine with the juju CLI)?
```shell
$ juju deploy ./mysql-k8s_1852a95.charm kfp-db --resource mysql-image=ghcr.io/canonical/charmed-mysql@sha256:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1 --constraints="mem=2G" --debug
06:11:35 INFO juju.cmd supercommand.go:56 running juju [2.9.45 afb8ee760af71d0bca8c3e4e0dc28af2dabc9b1d gc go1.20.8]
06:11:35 DEBUG juju.cmd supercommand.go:57 args: []string{"/snap/juju/24550/bin/juju", "deploy", "./mysql-k8s_1852a95.charm", "kfp-db", "--resource", "mysql-image=ghcr.io/canonical/charmed-mysql@sha256:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1", "--constraints=mem=2G", "--debug"}
06:11:35 DEBUG juju.jujuclient proxy.go:65 unmarshalled proxy config for "kubernetes-port-forward"
06:11:35 INFO juju.juju api.go:86 connecting to API addresses: [10.152.183.99:17070]
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 DEBUG juju.api apiclient.go:624 starting proxier for connection
06:11:35 DEBUG juju.api apiclient.go:628 tunnel proxy in use at localhost on port 33817
06:11:35 DEBUG juju.api apiclient.go:1151 successfully dialed "wss://localhost:33817/api"
06:11:35 INFO juju.api apiclient.go:1053 cannot resolve "localhost": lookup localhost: operation was canceled
06:11:35 INFO juju.api apiclient.go:686 connection established to "wss://localhost:33817/api"
06:11:35 DEBUG juju.jujuclient proxy.go:65 unmarshalled proxy config for "kubernetes-port-forward"
06:11:35 INFO juju.juju api.go:86 connecting to API addresses: [10.152.183.99:17070]
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 INFO juju.kubernetes.klog klog.go:59 Use tokens from the TokenRequest API or manually created secret-based tokens instead of auto-generated secret-based tokens.
06:11:35 DEBUG juju.api apiclient.go:624 starting proxier for connection
06:11:35 DEBUG juju.api apiclient.go:628 tunnel proxy in use at localhost on port 38751
06:11:35 DEBUG juju.api apiclient.go:1151 successfully dialed "wss://localhost:38751/model/5c05ffd8-d260-4920-821c-6541d42febaf/api"
06:11:35 INFO juju.api apiclient.go:1053 cannot resolve "localhost": lookup localhost: operation was canceled
06:11:35 INFO juju.api apiclient.go:686 connection established to "wss://localhost:38751/model/5c05ffd8-d260-4920-821c-6541d42febaf/api"
06:11:35 DEBUG juju.core.charm computedseries.go:27 series "jammy" for charm "mysql-k8s" with format 2, Kubernetes true
06:11:35 DEBUG juju.core.charm computedseries.go:27 series "jammy" for charm "mysql-k8s" with format 2, Kubernetes true
06:11:35 INFO cmd charm.go:384 Preparing to deploy local charm: "mysql-k8s_1852a95.charm"
06:11:49 INFO cmd charm.go:406 Located local charm "mysql-k8s", revision 0
06:11:49 INFO cmd charm.go:236 Deploying "kfp-db" from local charm "mysql-k8s", revision 0 on jammy
06:11:50 DEBUG juju.api monitor.go:35 RPC connection died
06:11:50 DEBUG juju.api monitor.go:35 RPC connection died
06:11:50 INFO cmd supercommand.go:544 command finished
```
The only difference in this command is that I'm using GHCR instead of `$AWS_ECR_URL/canonical/charmed-mysql:753477ce39712221f008955b746fcf01a215785a215fe3de56f525380d14ad97`, so either there's something in `$AWS_ECR_URL` causing it to fail, or (more likely, imo) we have different environments.
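Worth noting: the two references also differ in form, not just in registry. The GHCR reference above pins the image by digest (`@sha256:...`), while the ECR references in this thread use a tag that merely looks like a digest (`:3b6a...`, `:753477...`). A small sketch of the distinction, using the two reference strings from this thread (pure string handling; no registry is contacted):

```shell
# Illustration only: a digest-pinned OCI reference contains "@sha256:",
# while a tag reference uses ":" with an arbitrary tag name (even if the
# tag happens to be a hex string). No registry is contacted here.
ghcr_ref='ghcr.io/canonical/charmed-mysql@sha256:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1'
ecr_ref='701143232170.dkr.ecr.eu-west-1.amazonaws.com/canonical/charmed-mysql:3b6a4a63971acec3b71a0178cd093014a695ddf7c31d91d56ebb110eec6cdbe1'

case "$ghcr_ref" in
  *@sha256:*) echo "ghcr: pinned by digest" ;;
  *)          echo "ghcr: pinned by tag" ;;
esac
case "$ecr_ref" in
  *@sha256:*) echo "ecr: pinned by digest" ;;
  *)          echo "ecr: pinned by tag" ;;   # the hex here is just a tag name
esac
```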
@carlcsaposs-canonical I suspect the difference is in the proxy in use due to the air-gapped env. I am not sure how to trace it from here; IMHO we need the Juju team's assistance. BTW, we have successfully deployed mysql-k8s in a PartnerCloud air-gapped env in recent weeks.
@jameinel can you give us a hint here? It smells like a Juju/networking issue.
BTW, one thing I do not like in https://pastebin.ubuntu.com/p/G3QZVjMdXK/:

```
20:50:18 DEBUG juju.api monitor.go:35 RPC connection died
```

It is exactly in between:

```
upload charm /root/kubeflow/mysql-k8s_1852a95.charm for series jammy with architecture=amd64
...
20:50:18 DEBUG juju.api monitor.go:35 RPC connection died
20:50:18 DEBUG juju.api monitor.go:35 RPC connection died
ERROR cannot deploy bundle: mysql-k8s is not available on the following series: jammy not supported
```
@gustavosr98 as a shot in the dark, can you specify `--series jammy` explicitly for `juju deploy mysql-k8s`?
@taurus-forever Already tried, in a couple of different ways:

```shell
$ juju deploy ./mysql-k8s_1852a95.charm kfp-db --resource mysql-image=$AWS_ECR_URL/canonical/charmed-mysql:753477ce39712221f008955b746fcf01a215785a215fe3de56f525380d14ad97 --constraints="mem=2G" --series jammy
```
@gustavosr98 To confirm, is the machine with the juju cli running Ubuntu 22.04?
@carlcsaposs-canonical The Juju CLI is being used from an Ubuntu 22.04 machine (pod OCI image ubuntu:22.04).
It seems adding `--force` works around it. But IMO this is still a bug; I'm not sure whether it is in Juju or in the charm metadata that has been set up. Other charms do not seem to hit this in the same env.
@gustavosr98 can you please post the distro info for the controller(s) here and in https://chat.charmhub.io/charmhub/pl/tr9rbh3jfpnz8cki6s3pj8bugo? Tnx!
Juju controllers are K8s pods.

Images:

```shell
$ kubectl describe -n controller-jc-ctrl pod/controller-0 | grep -i image
Image: 701143232170.dkr.ecr.eu-west-1.amazonaws.com/jujusolutions/juju-db:4.4
Image ID: 701143232170.dkr.ecr.eu-west-1.amazonaws.com/jujusolutions/juju-db@sha256:2462e1defdbfe5d649d765ab97d306f08409c501f16e7cd7beeea9c79721a251
Image: 701143232170.dkr.ecr.eu-west-1.amazonaws.com/jujusolutions/jujud-operator:2.9.45
Image ID: 701143232170.dkr.ecr.eu-west-1.amazonaws.com/jujusolutions/jujud-operator@sha256:e281ab108f4b72efffb878c11bc8c3a2dce999a9ef0906274eaa763ece27f2a1
```

OS:

```shell
$ kubectl exec -n controller-jc-ctrl -it pod/controller-0 bash
root@controller-0:/# cat /etc/os-release
NAME="Ubuntu"
VERSION="20.04.6 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.6 LTS"
VERSION_ID="20.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=focal
UBUNTU_CODENAME=focal
```
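One observation from the outputs above, sketched as a comparison (whether it is the actual cause is for the linked Juju bug to decide): the controller pod reports a focal (20.04) userspace, while the charm's manifest.yaml declares a 22.04 build base. The `os_release` value here is inlined from the output above rather than read live:

```shell
# Sketch of the base mismatch visible in this thread: controller userspace
# vs. charm build base. os_release is inlined from the kubectl output above;
# charm_base comes from the manifest.yaml shown earlier.
os_release='VERSION_ID="20.04"'
controller_base=$(printf '%s\n' "$os_release" | sed -n 's/^VERSION_ID="\(.*\)"$/\1/p')
charm_base='22.04'
if [ "$controller_base" != "$charm_base" ]; then
  echo "base mismatch: controller=$controller_base charm=$charm_base"
fi
```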
@gustavosr98 at the moment we believe it is a Juju issue: https://bugs.launchpad.net/juju/+bug/2037771 (c) Joseph Phillips. This should confirm it is not related to the MySQL charm.
On a different air-gapped environment it seems to work. The diff is
I am trying to deploy the mysql-k8s charm on an air-gapped environment.

**Steps to reproduce**

**Expected behavior**

To deploy from the local charm.

**Actual behavior**

Tried in a couple of different ways.

**Versions**

AWS EKS K8s 1.25, Juju 2.9.45, Charm revision: 8.0/stable (local mysql-k8s_1852a95.charm)

**Log output**

**Additional context**

Other charms, specifically from Charmed Kubeflow, do deploy on this air-gapped environment.