Closed andrewm4894 closed 2 years ago
I think you're right: it is a kustomize issue; I tried to run it locally and it works.
My version:

```
$ kustomize version
Version: {KustomizeVersion:3.1.0 GitCommit:95f3303493fdea243ae83b767978092396169baf BuildDate:2019-07-26T19:21:45+01:00 GoOs:darwin GoArch:amd64}
```
What version do you use?
Also, you can try the new overlay to install it:

```shell
cd manifests/metadata
kustomize build overlays/db | kubectl apply -n kubeflow -f -
```
Changing instances of `env` to `envs` and putting the values in a list resolved this issue for me.
@parthmishra would you be able to give me some info/tips/pointers on how to do this? Apologies, but I'm a total newbie here, so I'm not sure how or where to change these references.
Sure, in the `overlays/db/kustomization.yaml` file (edit: and in `base/kustomization.yaml`), I commented out `env` and replaced it with `envs` as shown below:
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
commonLabels:
  kustomize.component: metadata
namespace: kubeflow
generatorOptions:
  # name suffix hash is not propagated correctly to base resources
  disableNameSuffixHash: true
configMapGenerator:
- name: metadata-db-parameters
  # env: params.env
  envs:
  - params.env
secretGenerator:
- name: metadata-db-secrets
  # env: secrets.env
  envs:
  - secrets.env
...
```
Then I re-ran the kustomize command and it worked.
Thanks!
Unfortunately I'm now getting this error:

```
Error: json: cannot unmarshal string into Go struct field ConfigMapArgs.envs of type []string
error: no objects passed to apply
```
My `overlays/db/kustomization.yaml` looks like this:
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
commonLabels:
  kustomize.component: metadata
namespace: kubeflow
generatorOptions:
  # name suffix hash is not propagated correctly to base resources
  disableNameSuffixHash: true
configMapGenerator:
- name: metadata-db-parameters
  envs: params.env
secretGenerator:
- name: metadata-db-secrets
  envs: secrets.env
bases:
- ../../base
resources:
- metadata-db-pvc.yaml
- metadata-db-deployment.yaml
- metadata-db-service.yaml
patchesStrategicMerge:
- metadata-deployment.yaml
images:
- name: mysql
  newName: mysql
  newTag: 8.0.3
vars:
- name: metadata-db-service
  objref:
    kind: Service
    name: metadata-db
    apiVersion: v1
  fieldref:
    fieldpath: metadata.name
```
If I run `kustomize version` I see:

```
Version: {Version:unknown GitCommit:$Format:%H$ BuildDate:1970-01-01T00:00:00Z GoOs:linux GoArch:amd64}
```
I think that might be because I installed it via these instructions, but I believe it should be the latest.
Seems like this issue is related to my new error. Googling furiously.
You need to change

```yaml
envs: secrets.env
```

to

```yaml
envs:
- secrets.env
```

and repeat for `envs: params.env`. The first form is interpreted as a string when it's supposed to be a list, which is what causes the error.
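To spell out why (this is just my illustration of the YAML semantics, not something from the kustomize docs): the scalar form makes `envs` a plain string, while kustomize declares that field as a Go `[]string`, so only the sequence form can be unmarshaled:

```yaml
# Parsed as a string -> "cannot unmarshal string into ... []string"
envs: secrets.env

# Parsed as a list of strings -> matches the []string field
envs:
- secrets.env
```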
Now I'm getting:

```
Error: accumulating resources: couldn't make target for path '../../base': json: unknown field "env"
error: no objects passed to apply
```
`overlays/db/kustomization.yaml`:
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
commonLabels:
  kustomize.component: metadata
namespace: kubeflow
generatorOptions:
  # name suffix hash is not propagated correctly to base resources
  disableNameSuffixHash: true
configMapGenerator:
- name: metadata-db-parameters
  envs:
  - params.env
secretGenerator:
- name: metadata-db-secrets
  envs:
  - secrets.env
bases:
- ../../base
resources:
- metadata-db-pvc.yaml
- metadata-db-deployment.yaml
- metadata-db-service.yaml
patchesStrategicMerge:
- metadata-deployment.yaml
images:
- name: mysql
  newName: mysql
  newTag: 8.0.3
vars:
- name: metadata-db-service
  objref:
    kind: Service
    name: metadata-db
    apiVersion: v1
  fieldref:
    fieldpath: metadata.name
```
Apologies for the back and forth; hopefully this will be useful for others who run into it.
Ah, that's my bad; I forgot to say that you need to make the same change in the base file as well (`base/kustomization.yaml`). Edit any instances of `env` to `envs` like you did with the overlays.
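To double-check that nothing was missed, one option (just a generic grep sketch on my part, assuming you run it from `manifests/metadata`) is to list every line still using the old-style key in both directories:

```shell
# List any lines still using the old scalar `env: <file>` key;
# the correct list form (`envs:` followed by `- <file>`) will not match
grep -rn "env: " base overlays 2>/dev/null || true
```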
Thanks! I got a load of output, so it looks like it did something :) I'll try to work through one of the metadata pipelines to confirm.
Although the last line of the output said:

```
for: "STDIN": Deployment.apps "metadata-ui" is invalid: spec.selector: Invalid value: v1.LabelSelector{MatchLabels:map[string]string{"app":"metadata-ui", "kustomize.component":"metadata"}, MatchExpressions:[]v1.LabelSelectorRequirement(nil)}: field is immutable
```

But I'm not sure whether that's a non-issue or not.
Some other output that suggests it maybe did not work:
```
serviceaccount/metadata-ui unchanged
role.rbac.authorization.k8s.io/metadata-ui unchanged
rolebinding.rbac.authorization.k8s.io/metadata-ui unchanged
configmap/metadata-db-parameters unchanged
configmap/metadata-metadata-grpc-configmap-2dd6h7mhg6 unchanged
configmap/metadata-ui-parameters-b6c8ghff7c unchanged
secret/metadata-db-secrets unchanged
service/metadata-db unchanged
service/metadata-envoy-service unchanged
service/metadata-grpc-service unchanged
service/metadata-service unchanged
service/metadata-ui unchanged
persistentvolumeclaim/metadata-mysql unchanged
...
for: "STDIN": Deployment.apps "metadata-deployment" is invalid: spec.selector: Invalid value: v1.LabelSelector{MatchLabels:map[string]string{"component":"server", "kustomize.component":"metadata"}, MatchExpressions:[]v1.LabelSelectorRequirement(nil)}: field is immutable
...
for: "STDIN": Deployment.apps "metadata-envoy-deployment" is invalid: spec.selector: Invalid value: v1.LabelSelector{MatchLabels:map[string]string{"component":"envoy", "kustomize.component":"metadata"}, MatchExpressions:[]v1.LabelSelectorRequirement(nil)}: field is immutable
...
for: "STDIN": Deployment.apps "metadata-grpc-deployment" is invalid: spec.selector: Invalid value: v1.LabelSelector{MatchLabels:map[string]string{"component":"grpc-server", "kustomize.component":"metadata"}, MatchExpressions:[]v1.LabelSelectorRequirement(nil)}: field is immutable
...
for: "STDIN": Deployment.apps "metadata-ui" is invalid: spec.selector: Invalid value: v1.LabelSelector{MatchLabels:map[string]string{"app":"metadata-ui", "kustomize.component":"metadata"}, MatchExpressions:[]v1.LabelSelectorRequirement(nil)}: field is immutable
```
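For what it's worth, the `field is immutable` errors happen because `commonLabels` ends up in the generated `spec.selector`, and Kubernetes does not allow a Deployment's selector to change after the Deployment is created. A common workaround (my suggestion, not from this thread; it deletes and recreates the Deployments, so expect brief downtime) is:

```shell
# Delete the Deployments whose selectors conflict (names taken from
# the errors above), then re-apply so they are recreated with the
# new selector
kubectl delete deployment metadata-deployment metadata-envoy-deployment \
  metadata-grpc-deployment metadata-ui -n kubeflow
kustomize build overlays/db | kubectl apply -n kubeflow -f -
```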
OK, moving to `envs` worked. Is this just happening because I downloaded the latest available kustomize? Is there any up-to-date documentation about which keys are valid in the latest version? It was rather unclear.

```
{Version:kustomize/v3.5.4 GitCommit:3af514fa9f85430f0c1557c4a0291e62112ab026 BuildDate:2020-01-11T03:12:59Z GoOs:linux GoArch:amd64}
```
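For reference (this is my understanding of the current schema, not an official excerpt), recent kustomize releases expect all generator sources in list form:

```yaml
configMapGenerator:
- name: example-config
  literals:   # inline KEY=value pairs
  - LOG_LEVEL=info
  envs:       # .env files, one KEY=value per line
  - params.env
  files:      # whole files, keyed by filename
  - settings.properties
```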
Hi @parthmishra, I am facing a similar issue (#2650). Running

```shell
kustomize build overlays/db
```

gives:

```
Error: json: cannot unmarshal array into Go struct field ConfigMapArgs.configMapGenerator.env of type string
```

Please let me know how I should change my file.
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
commonLabels:
  kustomize.component: metadata
namespace: kubeflow
generatorOptions:
  # name suffix hash is not propagated correctly to base resources
  disableNameSuffixHash: true
configMapGenerator:
- name: metadata-db-parameters
  env:
  - param.env
secretGenerator:
- name: metadata-db-secrets
  env:
  - secret.env
bases:
- ../../base
resources:
- metadata-db-pvc.yaml
- metadata-db-deployment.yaml
- metadata-db-service.yaml
patchesStrategicMerge:
- metadata-deployment.yaml
images:
- name: mysql
  newName: mysql
  newTag: 8.0.3
vars:
- name: metadata-db-service
  objref:
    kind: Service
    name: metadata-db
    apiVersion: v1
  fieldref:
    fieldpath: metadata.name
```
Issue-Label Bot is automatically applying the labels:

| Label | Probability |
|---|---|
| area/kustomize | 0.99 |
/kind bug
I'm trying to install metadata onto my GCP Kubeflow following the docs here.
However, when I run
I get this error:
I found this issue, which seems to suggest what's going on, but I have no idea how to implement a workaround to get metadata installed on my cluster.
It feels like it might be easy enough, but unfortunately I really don't know what I'm doing when it comes to Kubeflow and Kubernetes troubleshooting.
Has anyone else run into this?
Environment