Closed: hphsu closed this issue 6 years ago.
I created a cluster using kops and tried installing Argo using the same bucket (applatixtest3). It seems to work just fine... Will dig deeper.
Francis found that detection of the cluster bucket's region differs between his environment and mine: for me, the installer correctly detected the bucket's region as "us-west-2", whereas for him it was detected as "None".
I double-checked that `aws configure` also shows us-west-2 as my default region; I'm not sure where the installation picks up None as the region.
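One possible source of a None region, for context: boto3's `get_bucket_location` returns a `LocationConstraint` of `None` for buckets in us-east-1 (and the legacy value `"EU"` for eu-west-1), so callers have to normalize the value; a failed or forbidden lookup can also leave it unset. A minimal sketch of that normalization (the helper name is mine, not the installer's):

```python
def normalize_bucket_region(location_constraint):
    """Map S3's GetBucketLocation result to a usable region name.

    S3 returns a null LocationConstraint for buckets created in
    us-east-1, and the legacy value "EU" for eu-west-1.
    """
    if location_constraint is None:
        return "us-east-1"
    if location_constraint == "EU":
        return "eu-west-1"
    return location_constraint

# With boto3 this would be called roughly as (requires AWS credentials):
#   resp = boto3.client("s3").get_bucket_location(Bucket="applatixtest3")
#   region = normalize_bucket_region(resp["LocationConstraint"])
```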
```
argo cluster ops> aws configure list
      Name                    Value             Type    Location
      ----                    -----             ----    --------
   profile                <not set>             None    None
access_key     ****************7VJQ shared-credentials-file
secret_key     ****************7c7B shared-credentials-file
    region                us-west-2      config-file    ~/.aws/config
```

```
argo cluster ops> argocluster install-argo-only --cloud-region us-west-2 --cluster-name t2.test.eng.applatix.net --cloud-provider aws --cloud-profile dev --cluster-bucket applatixtest3 --kubeconfig /tmp/ax_kube/config
2017-10-18T22:42:06 INFO ax.cluster_management.argo_cluster_manager MainThread: Installing Argo platform ...
2017-10-18T22:42:06 INFO ax.cluster_management.argo_cluster_manager MainThread: s3 bucket endpoint: None
2017-10-18T22:42:07 INFO ax.cluster_management.app.options.install_options MainThread: Cloud placement not provided, setting it to us-west-2a from currently available zones ['us-west-2a', 'us-west-2b', 'us-west-2c']
2017-10-18T22:42:07 INFO ax.meta.cluster_id MainThread: Instantiating cluster bucket ...
2017-10-18T22:42:15 INFO ax.cloud.aws.aws_s3 MainThread: Using region None for bucket applatixtest3
2017-10-18T22:42:24 INFO ax.cluster_management.app.common MainThread: Cannot find cluster name id: An error occurred (403) when calling the HeadBucket operation: Forbidden. Cluster is not yet created.
2017-10-18T22:42:24 INFO ax.meta.cluster_id MainThread: Cluster id not provided, generate one.
2017-10-18T22:42:24 INFO ax.meta.cluster_id MainThread: Created new name-id t2.test.eng.applatix.net-9c2aa9ec-b455-11e7-af2d-025000000001
2017-10-18T22:42:24 INFO ax.meta.config_s3_path MainThread: Using AX cluster config path applatixtest3
2017-10-18T22:42:27 INFO ax.cloud.aws.aws_s3 MainThread: Using region None for bucket applatixtest3
2017-10-18T22:42:31 INFO ax.cloud.aws.aws_s3 MainThread: Using region None for bucket applatixtest3
2017-10-18T22:42:31 INFO ax.platform.ax_cluster_info MainThread: Downloading cluster current state ...
2017-10-18T22:42:40 ERROR ax.cluster_management.argo_cluster_manager MainThread: An error occurred (403) when calling the HeadBucket operation: Forbidden
Traceback (most recent call last):
  File "/ax/python/ax/cluster_management/argo_cluster_manager.py", line 86, in parse_args_and_run
    getattr(self, cmd)(args)
  File "/ax/python/ax/cluster_management/argo_cluster_manager.py", line 265, in install_argo_only
    PlatformOnlyInstaller(platform_install_config).run()
  File "/ax/python/ax/cluster_management/app/cluster_installer.py", line 514, in __init__
    self._ci_installer = ClusterInstaller(cfg=self._cfg.get_install_config(), kubeconfig=self._cfg.kube_config)
  File "/ax/python/ax/cluster_management/app/cluster_installer.py", line 70, in __init__
    dry_run=self._cfg.dry_run
  File "/ax/python/ax/cluster_management/app/common.py", line 83, in __init__
    self._csm = ClusterStateMachine(cluster_name_id=self._idobj.get_cluster_name_id(), cloud_profile=cloud_profile)
  File "/ax/python/ax/cluster_management/app/state/state.py", line 45, in __init__
    current_state = self._cluster_info.download_cluster_current_state() or ClusterState.UNKNOWN
  File "/ax/python/ax/platform/ax_cluster_info.py", line 260, in download_cluster_current_state
    return self._bucket.get_object(key=self._s3_cluster_current_state)
  File "/ax/python/ax/cloud/aws/aws_s3.py", line 365, in get_object
    if not self.exists():
  File "/ax/python/ax/cloud/aws/aws_s3.py", line 285, in exists
    return self._exists()
  File "/usr/local/lib/python2.7/dist-packages/retrying.py", line 49, in wrapped_f
    return Retrying(*dargs, **dkw).call(f, *args, **kw)
  File "/usr/local/lib/python2.7/dist-packages/retrying.py", line 212, in call
    raise attempt.get()
  File "/usr/local/lib/python2.7/dist-packages/retrying.py", line 247, in get
    six.reraise(self.value[0], self.value[1], self.value[2])
  File "/usr/local/lib/python2.7/dist-packages/retrying.py", line 200, in call
    attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
  File "/ax/python/ax/cloud/aws/aws_s3.py", line 565, in _exists
    raise ce
ClientError: An error occurred (403) when calling the HeadBucket operation: Forbidden
!!! Operation failed due to runtime error: An error occurred (403) when calling the HeadBucket operation: Forbidden
```
I think I know what's wrong here! The installer expects an AWS profile called "default" and expects that profile to have enough privileges to query AWS (S3 specifically). I have the "default" profile, where this works, but your default profile probably doesn't have those privileges.
I’ll add a “--cloud-profile” option to override this…
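For background on why the profile matters: the AWS config file stores the default profile under a `[default]` section and every other profile under `[profile <name>]`, so a tool that only consults `[default]` can come up empty (for example, a `None` region) for users whose settings live under another profile. A small sketch of that lookup against a sample config (the config text and function are illustrative, not the installer's code):

```python
import configparser

# Sample of the AWS-CLI-style ~/.aws/config layout (illustrative values).
SAMPLE_AWS_CONFIG = """\
[default]
region = us-west-2

[profile dev]
region = us-west-2
"""

def region_for_profile(config_text, profile):
    """Resolve the region for a named profile from AWS-CLI-style config.

    Non-default profiles live under "[profile <name>]" sections. Returns
    None when the profile or its region is missing -- one way a tool that
    ignores --cloud-profile could end up with a region of None.
    """
    cp = configparser.ConfigParser()
    cp.read_string(config_text)
    section = "default" if profile == "default" else "profile " + profile
    if not cp.has_section(section):
        return None
    return cp.get(section, "region", fallback=None)

print(region_for_profile(SAMPLE_AWS_CONFIG, "dev"))   # us-west-2
print(region_for_profile(SAMPLE_AWS_CONFIG, "prod"))  # None
```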
This should work now that 64c2698 has been checked in.
I tried to install Argo on an existing AWS Kubernetes cluster, but ran into the following S3-related permission error.

Here is the minion IAM role policy.

Here is the log from when I created the Kubernetes cluster and bucket: