VaanPan closed this issue 4 years ago
I pulled the controller-manager log, shown below. The instance seemed to be in provisioning status, then suddenly the controller logged Poll returned "failed" : "(no description provided)".
How can I investigate this kind of error further?
I0508 12:30:36.114049 1 request.go:947] Response Body: {"apiVersion":"servicecatalog.k8s.io/v1beta1","kind":"ServiceInstance","metadata":{"creationTimestamp":"2020-05-08T12:30:35Z","finalizers":["kubernetes-incubator/service-catalog"],"generation":2,"labels":{"servicecatalog.k8s.io/spec.clusterServiceClassRef.name":"08f8c4dbcc1b58d30dc0209847a0c7e2add835ccac9741a1840a87a6","servicecatalog.k8s.io/spec.clusterServicePlanRef.name":"02fe063830f44f2ff2c4d9d69afdcbabadb797275fe8c190eda3dff7"},"name":"my-rediscache-instance","namespace":"default","resourceVersion":"162368","selfLink":"/apis/servicecatalog.k8s.io/v1beta1/namespaces/default/serviceinstances/my-rediscache-instance/status","uid":"a062633d-576c-48ec-9100-0382eb094a99"},"spec":{"clusterServiceClassExternalName":"azure-rediscache","clusterServiceClassRef":{"name":"0346088a-d4b2-4478-aa32-f18e295ec1d9"},"clusterServicePlanExternalName":"basic","clusterServicePlanRef":{"name":"362b3d1b-5b57-4289-80ad-4a15a760c29c"},"externalID":"f99888ab-f5e4-44ca-9d6d-b3d972b6121e","parameters":{"location":"chinaeast2","resourceGroup":"AKStest"},"updateRequests":0,"userInfo":{"groups":["system:serviceaccounts","system:serviceaccounts:catalog","system:authenticated"],"uid":"2ebe36a1-9900-4b03-8197-0a7552d1bbbe","username":"system:serviceaccount:catalog:service-catalog-controller-manager"}},"status":{"asyncOpInProgress":true,"conditions":[{"lastTransitionTime":"2020-05-08T12:30:36Z","message":"The instance is being provisioned asynchronously","reason":"Provisioning","status":"False","type":"Ready"}],"currentOperation":"Provision","deprovisionStatus":"Required","inProgressProperties":{"clusterServicePlanExternalID":"362b3d1b-5b57-4289-80ad-4a15a760c29c","clusterServicePlanExternalName":"basic","parameterChecksum":"b3d9dd5d9d6a971e4fd899dd3ab97d1a5960d536f0829d27308d565cff93379f","parameters":{"location":"chinaeast2","resourceGroup":"AKStest"},"userInfo":{"groups":["system:serviceaccounts","system:serviceaccounts:catalog","system:authenticated"],"uid":"2ebe36a1-9900-4b03-8197-0a7552d1bbbe","username":"system:serviceaccount:catalog:service-catalog-controller-manager"}},"lastConditionState":"Provisioning","lastOperation":"provisioning","observedGeneration":2,"operationStartTime":"2020-05-08T12:30:36Z","orphanMitigationInProgress":false,"provisionStatus":"","reconciledGeneration":0,"userSpecifiedClassName":"ClusterServiceClass/azure-rediscache","userSpecifiedPlanName":"basic"}}
I0508 12:30:36.114304 1 request.go:947] Request Body: {"kind":"Event","apiVersion":"v1","metadata":{"name":"my-rediscache-instance.160d0d742aa0d692","namespace":"default","creationTimestamp":null},"involvedObject":{"kind":"ServiceInstance","namespace":"default","name":"my-rediscache-instance","uid":"a062633d-576c-48ec-9100-0382eb094a99","apiVersion":"servicecatalog.k8s.io/v1beta1","resourceVersion":"162367"},"reason":"Provisioning","message":"The instance is being provisioned asynchronously","source":{"component":"service-catalog-controller-manager"},"firstTimestamp":"2020-05-08T12:30:36Z","lastTimestamp":"2020-05-08T12:30:36Z","count":1,"type":"Normal","eventTime":null,"reportingComponent":"","reportingInstance":""}
I0508 12:30:36.114298 1 event.go:258] Event(v1.ObjectReference{Kind:"ServiceInstance", Namespace:"default", Name:"my-rediscache-instance", UID:"a062633d-576c-48ec-9100-0382eb094a99", APIVersion:"servicecatalog.k8s.io/v1beta1", ResourceVersion:"162367", FieldPath:""}): type: 'Normal' reason: 'Provisioning' The instance is being provisioned asynchronously
I0508 12:30:36.114460 1 round_trippers.go:419] curl -k -v -XPOST -H "Authorization: Bearer eyJhbGciOiJSUzI1NiIsImtpZCI6IiJ9.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJjYXRhbG9nIiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZWNyZXQubmFtZSI6InNlcnZpY2UtY2F0YWxvZy1jb250cm9sbGVyLW1hbmFnZXItdG9rZW4tZmtjdjYiLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC5uYW1lIjoic2VydmljZS1jYXRhbG9nLWNvbnRyb2xsZXItbWFuYWdlciIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VydmljZS1hY2NvdW50LnVpZCI6IjJlYmUzNmExLTk5MDAtNGIwMy04MTk3LTBhNzU1MmQxYmJiZSIsInN1YiI6InN5c3RlbTpzZXJ2aWNlYWNjb3VudDpjYXRhbG9nOnNlcnZpY2UtY2F0YWxvZy1jb250cm9sbGVyLW1hbmFnZXIifQ.qGgm_2ngAi8-njh3SxC0uHaqqME6Ogmt_EGwIjiECK6PobEeF0edayUIz0JCTLXKjf6IpbaqWi8zG-hbmd60ZMdc2KOyQFOWoH9XA_EaR6tqfG_tls7p29q0YiDXuwrwvUEPosuAt3us5j8ZMlNHYvah0iUuRF3BZLp6adeTfmpPKXXxMOW5XRO6QopnZZ0gufEKBXMlsGjPePEl0nvPK9VM6vxOgwd0zm6lQVnxqSz_wxBAZ53qN8NeTBSGBi679rYMg9DUIEZb1dvKeykuTY5n9njo8G1TEIE22Gu5xOPeh-7-50vv2EvlEt90Do-bYQ1fDHXwVHYAJQgx3-lqmj9MuK2hkTWLGZA8UBWBjaDpJsuIH9z5d6KYL4QH-TdHUFKbKyO3Y2X6JjVtoMDHjIazVcJL0HnanVv3I32DHmVjRNk5R2i3qEuE19_DZ4cLd7SKWZFcfeSAPlsRoANxwIJh0fxViz3GXx_ZUmD1Wfyv_RY5wIHSO9H6wnYjrExN9dM0fO-lKrnSQjHBKuQmChMjEjPgwXeyRFlfSvyYjHl_dcdnRO4obxcV4pyWDmz0XVawcT13eOhBfjXidpenpTpGbSl2gHZKmcGLU7QtQ2sDLgxUjdSp2Qc9TTclwKG4Be_G2bSuF99sXReVH_GWxPHN-PxbGBIEM6pzCBw" -H "Accept: application/json, */*" -H "Content-Type: application/json" -H "User-Agent: service-catalog/v0.3.0 (linux/amd64) kubernetes/f950e68/service-catalog-controller-manager" 'https://10.0.0.1:443/api/v1/namespaces/default/events'
I0508 12:30:36.125733 1 round_trippers.go:438] POST https://10.0.0.1:443/api/v1/namespaces/default/events 201 Created in 11 milliseconds
I0508 12:30:36.125752 1 round_trippers.go:444] Response Headers:
I0508 12:30:36.125756 1 round_trippers.go:447] Date: Fri, 08 May 2020 12:30:36 GMT
I0508 12:30:36.125795 1 round_trippers.go:447] Content-Type: application/json
I0508 12:30:36.125801 1 round_trippers.go:447] Content-Length: 850
I0508 12:30:36.125818 1 request.go:947] Response Body: {"kind":"Event","apiVersion":"v1","metadata":{"name":"my-rediscache-instance.160d0d742aa0d692","namespace":"default","selfLink":"/api/v1/namespaces/default/events/my-rediscache-instance.160d0d742aa0d692","uid":"b33f92b0-d248-46e2-93a2-7baf9c301558","resourceVersion":"162369","creationTimestamp":"2020-05-08T12:30:36Z"},"involvedObject":{"kind":"ServiceInstance","namespace":"default","name":"my-rediscache-instance","uid":"a062633d-576c-48ec-9100-0382eb094a99","apiVersion":"servicecatalog.k8s.io/v1beta1","resourceVersion":"162367"},"reason":"Provisioning","message":"The instance is being provisioned asynchronously","source":{"component":"service-catalog-controller-manager"},"firstTimestamp":"2020-05-08T12:30:36Z","lastTimestamp":"2020-05-08T12:30:36Z","count":1,"type":"Normal","eventTime":null,"reportingComponent":"","reportingInstance":""}
I0508 12:30:37.114430 1 controller_instance.go:1000] ServiceInstance "default/my-rediscache-instance" v162368: Processing poll event
I0508 12:30:37.114470 1 controller_clusterservicebroker.go:130] ClusterServiceBroker "osba" v159649: Updating broker client
I0508 12:30:37.114632 1 controller_clusterservicebroker.go:130] ClusterServiceBroker "osba" v159649: Updating broker client
I0508 12:30:37.114896 1 controller_instance.go:1027] ServiceInstance "default/my-rediscache-instance" v162368: Polling last operation
I0508 12:30:37.114912 1 osbproxy.go:107] OSBClientProxy PollLastOperation()
I0508 12:30:37.116280 1 controller_instance.go:1067] ServiceInstance "default/my-rediscache-instance" v162368: Poll returned "failed" : "(no description provided)"
I0508 12:30:37.116302 1 controller_instance.go:499] ServiceInstance "default/my-rediscache-instance" v162368: BrokerOpRetry: removed a062633d-576c-48ec-9100-0382eb094a99 from instanceOperationRetryQueue
I0508 12:30:37.116326 1 controller_instance.go:1812] ServiceInstance "default/my-rediscache-instance" v162368: Provision call failed: (no description provided)
I0508 12:30:37.116336 1 controller_instance.go:1813] ServiceInstance "default/my-rediscache-instance" v162368: Setting condition "Ready" to False
I0508 12:30:37.116346 1 controller_instance.go:1812] ServiceInstance "default/my-rediscache-instance" v162368: Provision call failed: (no description provided)
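One thing worth noting: "(no description provided)" is just the controller relaying the broker's last_operation response verbatim, so the controller-manager log will never say more than this, and the broker's own log is where the real error lives. To isolate the relevant controller lines first, a minimal sketch over a saved excerpt of the log above (the file path and grep patterns are my own choices; against a live cluster you would pipe `kubectl logs` into the same filter):

```shell
# Save a short excerpt of the controller-manager log (lines copied from above),
# then filter it down to the poll/provision failures for one ServiceInstance.
cat > /tmp/cm.log <<'EOF'
I0508 12:30:37.116280 1 controller_instance.go:1067] ServiceInstance "default/my-rediscache-instance" v162368: Poll returned "failed" : "(no description provided)"
I0508 12:30:37.116302 1 controller_instance.go:499] ServiceInstance "default/my-rediscache-instance" v162368: BrokerOpRetry: removed a062633d-576c-48ec-9100-0382eb094a99 from instanceOperationRetryQueue
I0508 12:30:37.116326 1 controller_instance.go:1812] ServiceInstance "default/my-rediscache-instance" v162368: Provision call failed: (no description provided)
EOF
grep 'my-rediscache-instance' /tmp/cm.log | grep -E 'Poll returned|call failed'
```

If the filtered output is only these generic failure lines, the next stop is the broker log, as suggested below.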
Did you check the OSBA logs? If the error occurred in OSBA while calling the Azure APIs, the OSBA logs would be helpful.
Thanks a lot, zhongyi.
The OSBA log shows the detailed errors. After fixing the AAD app's RBAC assignment on the resource group, the Redis Cache deployed successfully.
kubectl logs osba-open-service-broker-azure-5b548879c-tn585 -n osba
time="2020-05-08T12:25:02Z" level=error msg="error executing job; not submitting any follow-up tasks" error="error executing provisioning step \"deployARMTemplate\" for instance \"2df244cc-5fd5-4262-8cbe-0a4072c03ed1\": error executing provisioning step: error deploying ARM template: error deploying \"ff70dcfc-8867-4187-bbed-5959535bf99d\" in resource group \"AKStest\": error getting deployment: resources.DeploymentsClient#Get: Failure responding to request: StatusCode=403 -- Original Error: autorest/azure: Service returned an error. Status=403 Code=\"AuthorizationFailed\" Message=\"The client '4aae7b71-ff49-4049-b1f7-d3fe4b87abf0' with object id '4aae7b71-ff49-4049-b1f7-d3fe4b87abf0' does not have authorization to perform action 'Microsoft.Resources/deployments/read' over scope '/subscriptions/df5f9973-e18c-468f-99fd-44a55aee2cd2/resourcegroups/AKStest/providers/Microsoft.Resources/deployments/ff70dcfc-8867-4187-bbed-5959535bf99d' or the scope is invalid. If access was recently granted, please refresh your credentials.\"" job=executeProvisioningStep taskID=eb42e4a9-809c-4b02-88e0-6e0ec70d7633
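This 403 is the real cause: the service principal OSBA runs as lacks read permission on the resource group, so it can never fetch the ARM deployment status and reports the operation as failed with no description. A small sketch for pulling the status code and denied action out of a saved OSBA log line (the sample below is a shortened copy of the entry above; file path and patterns are my own choices):

```shell
# Save a shortened copy of the OSBA error line, then extract the HTTP status
# and the Azure action that was denied.
cat > /tmp/osba.log <<'EOF'
time="2020-05-08T12:25:02Z" level=error error="error deploying ARM template: Status=403 Code=\"AuthorizationFailed\" Message=\"The client '4aae7b71-ff49-4049-b1f7-d3fe4b87abf0' does not have authorization to perform action 'Microsoft.Resources/deployments/read' over scope '/subscriptions/df5f9973-e18c-468f-99fd-44a55aee2cd2/resourcegroups/AKStest'\""
EOF
grep -oE "Status=403|perform action '[^']+'" /tmp/osba.log
# The fix is to grant the broker's service principal a role that includes the
# denied action on the resource group, e.g. (assumes Azure CLI login and that
# Contributor is acceptable for your environment):
#   az role assignment create --assignee <clientId> --role Contributor \
#     --scope /subscriptions/<subscriptionId>/resourceGroups/AKStest
```

This also explains why nothing appears in the Azure activity log: the request was rejected at authorization, before any resource operation was attempted.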
Hi Team
I am trying to use OSBA on an Azure China AKS cluster but have run into a problem. Could you please help?
Below is what I did:
wget https://github.com/kubernetes-sigs/service-catalog/archive/v0.3.0-beta.2.tar.gz
tar -zxvf v0.3.0-beta.2.tar.gz
helm install catalog service-catalog-0.3.0-beta.2/charts/catalog/ -n catalog
curl -sLO https://download.svcat.sh/cli/latest/linux/amd64/svcat
chmod +x ./svcat
sudo mv ./svcat /usr/local/bin/
svcat version --client
[Client Version: v0.3.0-beta.2 Server Version: v1.15.10]
sudo yum install git
git clone https://github.com/azure/open-service-broker-azure
kubectl create namespace osba
helm repo add azure https://kubernetescharts.blob.core.windows.net/azure
helm install osba azure/open-service-broker-azure --namespace osba \
  --set azure.subscriptionId=df5f9973-e18c-468f-99fd-44a55aee2cd2 \
  --set azure.tenantId=52f88278-ee26-4dc5-840b-03eb350fc0b6 \
  --set azure.clientId=e4ff5b20-45ce-4558-b5a2-43a63929897b \
  --set azure.clientSecret=hide \
  --set azure.environment=AzureChinaCloud
kubectl get clusterservicebroker -o yaml
kubectl get clusterserviceclasses -o=custom-columns=NAME:.metadata.name,EXTERNAL\ NAME:.spec.externalName
kubectl get clusterserviceplans -o=custom-columns=NAME:.metadata.name,EXTERNAL\ NAME:.spec.externalName,SERVICE\ CLASS:.spec.clusterServiceClassRef.name --sort-by=.spec.clusterServiceClassRef.name
vim open-service-broker-azure/contrib/k8s/examples/redis/rediscache-instance.yaml
[change resource group and location to the China environment]
  parameters:
    location: chinaeast2
    resourceGroup: AKStest
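For context, after that edit the manifest should look roughly like the sketch below. The class and plan names match the controller log earlier in this thread, but the example file in the repo is authoritative and may differ in detail:

```yaml
apiVersion: servicecatalog.k8s.io/v1beta1
kind: ServiceInstance
metadata:
  name: my-rediscache-instance
  namespace: default
spec:
  clusterServiceClassExternalName: azure-rediscache
  clusterServicePlanExternalName: basic
  parameters:
    location: chinaeast2
    resourceGroup: AKStest
```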
kubectl create -f open-service-broker-azure/contrib/k8s/examples/redis/rediscache-instance.yaml
After the create, the ServiceInstance ends up in a failed provisioning state:
kubectl describe ServiceInstance my-rediscache-instance
Events:
  Type     Reason                            Age                   From                                Message
  ----     ------                            ----                  ----                                -------
  Normal   Provisioning                      44m                   service-catalog-controller-manager  The instance is being provisioned asynchronously
  Warning  ProvisionCallFailed               44m (x2 over 44m)     service-catalog-controller-manager  Provision call failed: (no description provided)
  Warning  StartingInstanceOrphanMitigation  44m                   service-catalog-controller-manager  The instance provision call failed with an ambiguous error; attempting to deprovision the instance in order to mitigate an orphaned resource
  Normal   Deprovisioning                    44m                   service-catalog-controller-manager  The instance is being deprovisioned asynchronously
  Warning  DeprovisionCallFailed             44m                   service-catalog-controller-manager  Deprovision call failed: (no description provided)
  Warning  DeprovisionCallFailed             4m3s (x128 over 44m)  service-catalog-controller-manager  Deprovision call failed; received error response from broker: Status: 409; ErrorMessage: <nil>; Description: <nil>; ResponseError: <nil>
Question: I checked the Azure subscription activity log and the resource group activity log, but no error was recorded. I just don't understand why the controller reports 'Provision call failed: (no description provided)'.