When deploying Ambari, the script got stuck at install-ambari-components.sh.
On logging into the UI and checking the error, the main problem seems to be that the NameNode (on the master instance) is not starting, due to a Java security exception.
I have tried copying the appropriate Oracle JCE policy files (for JDK 7) into the JRE security folders on all the master and worker instances, and have rebooted every instance with 'sudo reboot'. But starting the components for the master instance (in the Ambari UI) still fails with the same security exception.
On further investigation, it turns out the JDK on the instances is OpenJDK, not Oracle's. That makes the exception puzzling, since OpenJDK appears to ship with unlimited-strength policy files by default.
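To narrow this down, something like the following could be run on each instance to confirm which java binary is actually on the PATH and whether the JCE jurisdiction policy jars sit next to it. This is a rough sketch under assumptions: a typical Linux JDK layout with a `lib/security` directory, and that Hadoop uses the same JDK as the shell's PATH (which may not hold on the GCP/bdutil images):

```shell
# Sketch: report the resolved java binary and whether the JCE policy jars
# (local_policy.jar / US_export_policy.jar) exist in its security directory.
# Paths are assumptions based on a standard JDK 7 layout.
if command -v java >/dev/null 2>&1; then
    JAVA_BIN="$(readlink -f "$(command -v java)")"
    SEC_DIR="$(dirname "$JAVA_BIN")/../lib/security"
    if [ -f "$SEC_DIR/local_policy.jar" ] && [ -f "$SEC_DIR/US_export_policy.jar" ]; then
        JCE_STATUS="policy jars present in $SEC_DIR"
    else
        JCE_STATUS="policy jars missing in $SEC_DIR"
    fi
else
    JCE_STATUS="java not found on PATH"
fi
echo "$JCE_STATUS"
```

If the jars are reported missing for the JRE that Hadoop actually uses (the JAVA_HOME set in hadoop-env.sh may point at a different JDK than the one on the PATH), copying them into that specific directory rather than the default one might be what is needed.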
Is there some other configuration or workaround that would help with deploying Ambari/HDP on GCP?
Or is there a better way to deploy HDP on GCP?
(I am on the free tier, deploying Ambari with the options -n 2 -m n1-standard-2 in zone us-central1-a.)
The log with the main exception is below:
18/02/09 02:29:42 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=true
18/02/09 02:29:42 INFO blockmanagement.BlockManager: dfs.block.access.key.update.interval=600 min(s), dfs.block.access.token.lifetime=600 min(s), dfs.encrypt.data.transfer.algorithm=null
18/02/09 02:29:42 ERROR namenode.NameNode: Failed to start namenode.
java.lang.ExceptionInInitializerError
	at javax.crypto.KeyGenerator.nextSpi(KeyGenerator.java:341)
	at javax.crypto.KeyGenerator.<init>(KeyGenerator.java:169)
	at javax.crypto.KeyGenerator.getInstance(KeyGenerator.java:224)
	at org.apache.hadoop.security.token.SecretManager.<init>(SecretManager.java:143)
	at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.<init>(BlockTokenSecretManager.java:120)
	at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.<init>(BlockTokenSecretManager.java:111)
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.createBlockTokenSecretManager(BlockManager.java:437)
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.<init>(BlockManager.java:324)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:744)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:704)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1125)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1571)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1709)
Caused by: java.lang.SecurityException: Can not initialize cryptographic mechanism
	at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:94)
	... 13 more
Caused by: java.lang.SecurityException: Cannot locate policy or framework files!
	at javax.crypto.JceSecurity.setupJurisdictionPolicies(JceSecurity.java:317)
	at javax.crypto.JceSecurity.access$000(JceSecurity.java:50)
	at javax.crypto.JceSecurity$1.run(JceSecurity.java:86)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:83)
	... 13 more
18/02/09 02:29:42 INFO util.ExitUtil: Exiting with status 1
18/02/09 02:29:42 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at hadoop-m.c.hdp-989.internal/10.128.0.3
************************************************************/
yes: standard output: Broken pipe
yes: write error