If I remove the version suffix, I can get the output of:
$ kubectl get --raw /apis/external.metrics.k8s.io | jq
{
  "kind": "APIGroup",
  "apiVersion": "v1",
  "name": "external.metrics.k8s.io",
  "versions": [
    {
      "groupVersion": "external.metrics.k8s.io/v1beta1",
      "version": "v1beta1"
    }
  ],
  "preferredVersion": {
    "groupVersion": "external.metrics.k8s.io/v1beta1",
    "version": "v1beta1"
  }
}
And when I check the apiservice, it shows False (FailedDiscoveryCheck).
$ kubectl get apiservice
NAME                                    SERVICE                                 AVAILABLE                      AGE
v1.                                     Local                                   True                           9d
v1.apps                                 Local                                   True                           9d
v1.authentication.k8s.io                Local                                   True                           9d
v1.authorization.k8s.io                 Local                                   True                           9d
v1.autoscaling                          Local                                   True                           9d
v1.batch                                Local                                   True                           9d
v1.networking.k8s.io                    Local                                   True                           9d
v1.rbac.authorization.k8s.io            Local                                   True                           9d
v1.storage.k8s.io                       Local                                   True                           9d
v1alpha1.metrics.aws                    Local                                   True                           4d
v1beta1.admissionregistration.k8s.io    Local                                   True                           9d
v1beta1.apiextensions.k8s.io            Local                                   True                           9d
v1beta1.apps                            Local                                   True                           9d
v1beta1.authentication.k8s.io           Local                                   True                           9d
v1beta1.authorization.k8s.io            Local                                   True                           9d
v1beta1.batch                           Local                                   True                           9d
v1beta1.certificates.k8s.io             Local                                   True                           9d
v1beta1.coordination.k8s.io             Local                                   True                           9d
v1beta1.custom.metrics.k8s.io           custom-metrics/k8s-cloudwatch-adapter   False (FailedDiscoveryCheck)   4d
v1beta1.events.k8s.io                   Local                                   True                           9d
v1beta1.extensions                      Local                                   True                           9d
v1beta1.external.metrics.k8s.io         custom-metrics/k8s-cloudwatch-adapter   False (FailedDiscoveryCheck)   4d
v1beta1.policy                          Local                                   True                           9d
v1beta1.rbac.authorization.k8s.io       Local                                   True                           9d
v1beta1.scheduling.k8s.io               Local                                   True                           9d
v1beta1.storage.k8s.io                  Local                                   True                           9d
v1beta2.apps                            Local                                   True                           9d
v2beta1.autoscaling                     Local                                   True                           9d
v2beta2.autoscaling                     Local                                   True                           9d
The details of v1beta1.custom.metrics.k8s.io: I found that there is no response from https://100.69.35.59:443.
Message: no response from https://100.69.35.59:443: Get https://100.69.35.59:443: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Reason: FailedDiscoveryCheck
Status: False
I have no idea what is causing this. Could you give me some suggestions?
Hi,
Can you run kubectl logs k8s-cloudwatch-adapter-xxxxxx to see if there are any errors?
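If the logs look clean, it is also worth confirming the pod, service, and endpoints are in place. A minimal sketch, assuming the stock manifests (namespace custom-metrics, deployment and service both named k8s-cloudwatch-adapter, matching the apiservice output above):

# Names below are assumptions based on the stock deploy manifests; adjust to yours.
$ kubectl -n custom-metrics get pods -o wide                       # is the adapter pod Running?
$ kubectl -n custom-metrics get svc k8s-cloudwatch-adapter         # does the service exist?
$ kubectl -n custom-metrics get endpoints k8s-cloudwatch-adapter   # does it have endpoints?
$ kubectl -n custom-metrics logs deploy/k8s-cloudwatch-adapter --tail=100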
Hi @chankh, thanks for your response. There are a lot of logs, so I've only included a portion; I hope it's useful. It looks like my API adapter doesn't actually work.
Logs: I1016 12:03:20.484558 1 reflector.go:337] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: Watch close - v1alpha1.ExternalMetric total 0 items received I1016 12:03:20.484765 1 round_trippers.go:419] curl -k -v -XGET -H "Authorization: Bearer eyJhbGciOiJSUzI1NiIsImtpZCI6IiJ9.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJjdXN0b20tbWV0cmljcyIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VjcmV0Lm5hbWUiOiJrOHMtY2xvdWR3YXRjaC1hZGFwdGVyLXRva2VuLXB2cnM4Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZXJ2aWNlLWFjY291bnQubmFtZSI6Ims4cy1jbG91ZHdhdGNoLWFkYXB0ZXIiLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC51aWQiOiI3MTc4OGE1NS1lYmFkLTExZTktOWRhNC0wMjRhODYwNzhlNWMiLCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6Y3VzdG9tLW1ldHJpY3M6azhzLWNsb3Vkd2F0Y2gtYWRhcHRlciJ9.JqhsA37bkVlH44DUl5cfABx1ElOT0JM6zkeqsK9HKVTQpMSW3naPRTHUEynjLWKKcfSSkUEE4zU9Aex1tRFJNZKVoqiPk4plDU46k7WX8TCMfMEZFQ2Dt3Ko5omN5dvPOmyfnAiMI2iyTSS_gORvt_rYjId4xz9xUzHhT2BA3G2Q62gQZmALR1FOOB_VElRElxocNUIdqwIvjAit2ciDlCXm_iXLvmE1Eg0jkOb4DhiFf1XX5RFLlkig270SiuZjYbYkiJvqP5TxEbJaR782AXtatFeAKKKuTplRsXXgd58WGi7Q7GNwS9-F-56q9kgLlHxcmTM5Fr4Dc1t_9OmJ2w" -H "Accept: application/json, /" -H "User-Agent: adapter/v0.0.0 (linux/amd64) kubernetes/$Format" 'https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=7m40s&timeoutSeconds=460&watch=true' I1016 12:03:20.488888 1 round_trippers.go:438] GET https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=7m40s&timeoutSeconds=460&watch=true 200 OK in 4 milliseconds I1016 12:03:20.488905 1 round_trippers.go:444] Response Headers: I1016 12:03:20.488911 1 round_trippers.go:447] Content-Type: application/json I1016 12:03:20.488917 1 round_trippers.go:447] Date: Wed, 16 Oct 2019 12:03:20 GMT I1016 12:03:49.581037 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:04:19.581285 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:04:49.581561 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:05:19.581805 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:05:49.582090 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:06:19.582365 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:06:49.582625 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:07:19.582890 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:07:49.583148 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:08:19.583406 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:08:49.583657 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: 
forcing resync I1016 12:09:19.583906 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:09:49.584175 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:10:19.584440 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:10:49.584686 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:11:00.489248 1 reflector.go:337] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: Watch close - v1alpha1.ExternalMetric total 0 items received I1016 12:11:00.489448 1 round_trippers.go:419] curl -k -v -XGET -H "Authorization: Bearer eyJhbGciOiJSUzI1NiIsImtpZCI6IiJ9.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJjdXN0b20tbWV0cmljcyIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VjcmV0Lm5hbWUiOiJrOHMtY2xvdWR3YXRjaC1hZGFwdGVyLXRva2VuLXB2cnM4Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZXJ2aWNlLWFjY291bnQubmFtZSI6Ims4cy1jbG91ZHdhdGNoLWFkYXB0ZXIiLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC51aWQiOiI3MTc4OGE1NS1lYmFkLTExZTktOWRhNC0wMjRhODYwNzhlNWMiLCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6Y3VzdG9tLW1ldHJpY3M6azhzLWNsb3Vkd2F0Y2gtYWRhcHRlciJ9.JqhsA37bkVlH44DUl5cfABx1ElOT0JM6zkeqsK9HKVTQpMSW3naPRTHUEynjLWKKcfSSkUEE4zU9Aex1tRFJNZKVoqiPk4plDU46k7WX8TCMfMEZFQ2Dt3Ko5omN5dvPOmyfnAiMI2iyTSS_gORvt_rYjId4xz9xUzHhT2BA3G2Q62gQZmALR1FOOB_VElRElxocNUIdqwIvjAit2ciDlCXm_iXLvmE1Eg0jkOb4DhiFf1XX5RFLlkig270SiuZjYbYkiJvqP5TxEbJaR782AXtatFeAKKKuTplRsXXgd58WGi7Q7GNwS9-F-56q9kgLlHxcmTM5Fr4Dc1t_9OmJ2w" -H "Accept: application/json, /" -H "User-Agent: adapter/v0.0.0 (linux/amd64) kubernetes/$Format" 'https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=9m6s&timeoutSeconds=546&watch=true' I1016 12:11:00.493289 1 round_trippers.go:438] GET https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=9m6s&timeoutSeconds=546&watch=true 200 OK in 3 milliseconds I1016 12:11:00.493311 1 round_trippers.go:444] Response Headers: I1016 12:11:00.493317 1 round_trippers.go:447] Content-Type: application/json I1016 12:11:00.493322 1 round_trippers.go:447] Date: Wed, 16 Oct 2019 12:11:00 GMT I1016 12:11:19.584917 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:11:49.585202 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:12:19.585467 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:12:49.585761 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:13:19.586028 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:13:49.586271 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:14:19.586538 1 reflector.go:202] 
github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:14:49.586760 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:15:19.587022 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:15:49.587261 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:16:19.587519 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:16:49.587755 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:17:19.587997 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:17:49.588255 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:18:19.588524 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:18:49.588788 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:19:19.589044 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:19:49.589314 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:20:06.493797 1 reflector.go:337] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: Watch close - v1alpha1.ExternalMetric total 0 items received I1016 12:20:06.494020 1 round_trippers.go:419] curl -k -v -XGET -H "Accept: application/json, /*" -H "User-Agent: adapter/v0.0.0 (linux/amd64) kubernetes/$Format" -H "Authorization: Bearer eyJhbGciOiJSUzI1NiIsImtpZCI6IiJ9.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJjdXN0b20tbWV0cmljcyIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VjcmV0Lm5hbWUiOiJrOHMtY2xvdWR3YXRjaC1hZGFwdGVyLXRva2VuLXB2cnM4Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZXJ2aWNlLWFjY291bnQubmFtZSI6Ims4cy1jbG91ZHdhdGNoLWFkYXB0ZXIiLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC51aWQiOiI3MTc4OGE1NS1lYmFkLTExZTktOWRhNC0wMjRhODYwNzhlNWMiLCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6Y3VzdG9tLW1ldHJpY3M6azhzLWNsb3Vkd2F0Y2gtYWRhcHRlciJ9.JqhsA37bkVlH44DUl5cfABx1ElOT0JM6zkeqsK9HKVTQpMSW3naPRTHUEynjLWKKcfSSkUEE4zU9Aex1tRFJNZKVoqiPk4plDU46k7WX8TCMfMEZFQ2Dt3Ko5omN5dvPOmyfnAiMI2iyTSS_gORvt_rYjId4xz9xUzHhT2BA3G2Q62gQZmALR1FOOB_VElRElxocNUIdqwIvjAit2ciDlCXm_iXLvmE1Eg0jkOb4DhiFf1XX5RFLlkig270SiuZjYbYkiJvqP5TxEbJaR782AXtatFeAKKKuTplRsXXgd58WGi7Q7GNwS9-F-56q9kgLlHxcmTM5Fr4Dc1t_9OmJ2w" 'https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=8m59s&timeoutSeconds=539&watch=true' I1016 12:20:06.498366 1 round_trippers.go:438] GET https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=638930&timeout=8m59s&timeoutSeconds=539&watch=true 200 OK in 4 milliseconds I1016 12:20:06.498385 1 round_trippers.go:444] Response Headers: I1016 12:20:06.498391 1 
round_trippers.go:447] Content-Type: application/json I1016 12:20:06.498396 1 round_trippers.go:447] Date: Wed, 16 Oct 2019 12:20:06 GMT I1016 12:20:19.589604 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:20:49.589878 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:21:19.590161 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:21:49.590423 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:22:19.590660 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync I1016 12:22:49.590910 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync
Do you see any logs like this?
I1018 01:44:53.259897 1 handler.go:55] processing item 'hello-queue-length' in namespace 'default'
I1018 01:44:53.259926 1 handler.go:68] externalMetricInfo: &{{ExternalMetric metrics.aws/v1alpha1} {hello-queue-length default /apis/metrics.aws/v1alpha1/namespaces/default/externalmetrics/hello-queue-length 32e93f40-335b-11e9-8e26-0acb94794ffe 14202893 1 2019-02-18 08:57:20 +0000 UTC <nil> <nil> map[] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"metrics.aws/v1alpha1","kind":"ExternalMetric","metadata":{"annotations":{},"name":"hello-queue-length","namespace":"default"},"spec":{"name":"hello-queue-length","queries":[{"id":"sqs_helloworld","metricStat":{"metric":{"dimensions":[{"name":"QueueName","value":"helloworld"}],"metricName":"ApproximateNumberOfMessagesVisible","namespace":"AWS/SQS"},"period":300,"stat":"Average","unit":"Count"},"returnData":true}],"resource":{"resource":"deployment"}}}
] [] nil [] } {hello-queue-length [{ sqs_helloworld {{[{QueueName helloworld}] ApproximateNumberOfMessagesVisible AWS/SQS} 300 Average Count} true}]}}
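For reference, the last-applied-configuration embedded in that log corresponds to an ExternalMetric manifest roughly like the following (reconstructed from the log output above, so treat it as a sketch rather than the exact file):

# hello-queue-length ExternalMetric as reconstructed from the logged config
apiVersion: metrics.aws/v1alpha1
kind: ExternalMetric
metadata:
  name: hello-queue-length
  namespace: default
spec:
  name: hello-queue-length
  resource:
    resource: deployment
  queries:
    - id: sqs_helloworld
      metricStat:
        metric:
          namespace: AWS/SQS
          metricName: ApproximateNumberOfMessagesVisible
          dimensions:
            - name: QueueName
              value: helloworld
        period: 300
        stat: Average
        unit: Count
      returnData: true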
Yes, I can see that. I changed the queue name to sqs-dev-length.
I1018 02:17:32.979093 1 round_trippers.go:438] GET https://100.64.0.1:443/apis/metrics.aws/v1alpha1/externalmetrics?resourceVersion=1443796&timeout=7m42s&timeoutSeconds=462&watch=true 200 OK in 4 milliseconds
I1018 02:17:32.979110 1 round_trippers.go:444] Response Headers:
I1018 02:17:32.979116 1 round_trippers.go:447] Content-Type: application/json
I1018 02:17:32.979121 1 round_trippers.go:447] Date: Fri, 18 Oct 2019 02:17:32 GMT
I1018 02:17:50.797337 1 reflector.go:202] github.com/awslabs/k8s-cloudwatch-adapter/pkg/client/informers/externalversions/factory.go:114: forcing resync
I1018 02:17:50.797402 1 controller.go:137] adding item to queue for 'default/sqs-dev-length' with kind 'ExternalMetric'
I1018 02:17:50.802605 1 handler.go:55] processing item 'sqs-dev-length' in namespace 'default'
I1018 02:17:50.802628 1 handler.go:68] externalMetricInfo: &{{ } {sqs-dev-length default /apis/metrics.aws/v1alpha1/namespaces/default/externalmetrics/sqs-dev-length 63645ba9-f14d-11e9-9da4-024a86078e5c 1443796 1 2019-10-18 02:17:09 +0000 UTC
This log shows the metric adapter is running fine and was able to read the external metric config that you've passed.
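Once the v1beta1.external.metrics.k8s.io APIService reports Available, the value should also be readable straight from the external metrics API (a sketch, using the metric name and namespace from the log above):

$ kubectl get --raw "/apis/external.metrics.k8s.io/v1beta1/namespaces/default/sqs-dev-length" | jq .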
I think it might be a permissions problem: the APIService cannot get the info from the adapter. I attached the correct IAM CloudWatch policy to my nodes and master, but the custom.metrics.k8s.io APIService still fails.
ubuntu@~$ kubectl describe apiservice v1beta1.custom.metrics.k8s.io
Name: v1beta1.custom.metrics.k8s.io
Namespace:
Labels: <none>
Annotations: kubectl.kubernetes.io/last-applied-configuration:
{"apiVersion":"apiregistration.k8s.io/v1beta1","kind":"APIService","metadata":{"annotations":{},"name":"v1beta1.custom.metrics.k8s.io"},"s...
API Version: apiregistration.k8s.io/v1
Kind: APIService
Metadata:
Creation Timestamp: 2019-11-15T12:06:14Z
Resource Version: 3411595
Self Link: /apis/apiregistration.k8s.io/v1/apiservices/v1beta1.custom.metrics.k8s.io
UID: 520221e2-07a0-11ea-8f70-0292bc9fd844
Spec:
Group: custom.metrics.k8s.io
Group Priority Minimum: 100
Insecure Skip TLS Verify: true
Service:
Name: k8s-cloudwatch-adapter
Namespace: custom-metrics
Version: v1beta1
Version Priority: 100
Status:
Conditions:
Last Transition Time: 2019-11-15T12:06:14Z
Message: no response from https://100.71.90.90:443: Get https://100.71.90.90:443: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Reason: FailedDiscoveryCheck
Status: False
Type: Available
Events: <none>
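Since the failure message is a connection timeout to the service IP, this may be a networking problem (for example, security groups between the masters and the nodes) rather than IAM; CloudWatch permissions would not normally affect the discovery check. One way to see whether the adapter itself responds is to bypass the aggregation layer and query it directly (a sketch, assuming the service name and namespace shown in the describe output above):

$ kubectl -n custom-metrics port-forward svc/k8s-cloudwatch-adapter 8443:443
# in a second shell; -k because the adapter serves a self-signed certificate
$ curl -k https://localhost:8443/apis/external.metrics.k8s.io/v1beta1

If that returns the API group while the APIService still shows FailedDiscoveryCheck, the problem is between the kube-apiserver and the pod network rather than inside the adapter.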
This adapter uses external.metrics.k8s.io rather than custom metrics. Some old code still registers a custom metrics APIService, and that registration will fail, but it doesn't matter; it has no effect on the adapter. That code was recently removed and pushed to the master branch, so you can try the latest image, which should be the most up to date.
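If you want to pick up the latest image on an existing install, one option is to bump the image on the deployment (a sketch; the deployment name, container name, and image reference here are placeholders, so substitute whatever your manifests actually use):

# <latest-adapter-image> is a placeholder for the current image reference
$ kubectl -n custom-metrics set image deployment/k8s-cloudwatch-adapter \
    k8s-cloudwatch-adapter=<latest-adapter-image>
$ kubectl -n custom-metrics rollout status deployment/k8s-cloudwatch-adapter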
Closing this issue since there has been no update.
Hi, I followed this guide https://aws.amazon.com/blogs/compute/scaling-kubernetes-deployments-with-amazon-cloudwatch-metrics/ to create my SQS metrics. When I run:
$ kubectl get --raw "/apis/external.metrics.k8s.io/v1beta1" | jq
Error from server (ServiceUnavailable): the server is currently unable to handle the request
If I follow the deployment steps, the HPA cannot work and shows:
NAME                  REFERENCE                 TARGETS        MINPODS   MAXPODS   REPLICAS   AGE
sqs-consumer-scaler   Deployment/sqs-consumer   <unknown>/30   1         10        1          2m
Log for the pod:
Events:
  Type     Reason                        Age                   From                       Message
  Warning  FailedComputeMetricsReplicas  7m48s (x12 over 10m)  horizontal-pod-autoscaler  failed to get external metric sqs-dev-length: unable to get external metric default/sqs-dev-length/nil: external metrics aren't supported
  Warning  FailedGetExternalMetric       33s (x41 over 10m)    horizontal-pod-autoscaler  unable to get external metric default/sqs-dev-length/nil: external metrics aren't supported
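For context, an HPA spec matching the names in that output would look roughly like this (a sketch modelled on the blog post's example and adapted to the sqs-dev-length metric; the target of 30 comes from the TARGETS column above):

# HPA scaling the sqs-consumer deployment on the external sqs-dev-length metric
apiVersion: autoscaling/v2beta1
kind: HorizontalPodAutoscaler
metadata:
  name: sqs-consumer-scaler
  namespace: default
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: sqs-consumer
  minReplicas: 1
  maxReplicas: 10
  metrics:
    - type: External
      external:
        metricName: sqs-dev-length
        targetValue: 30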
I think I attached the correct IAM policy. And I have a metrics server which is running correctly:
$ kubectl get --raw "/apis/metrics.k8s.io/" | jq
{
  "kind": "APIGroup",
  "apiVersion": "v1",
  "name": "metrics.k8s.io",
  "versions": [
    {
      "groupVersion": "metrics.k8s.io/v1beta1",
      "version": "v1beta1"
    }
  ],
  "preferredVersion": {
    "groupVersion": "metrics.k8s.io/v1beta1",
    "version": "v1beta1"
  }
}
Environment:
  kops version: 1.12.3
  kubectl client: v1.15.3
  kubectl server: v1.12.10
  Platform: EC2 (debian-stretch)