KofDossou opened this issue 9 years ago
Sounds like you already have maven installed? That seems to be causing the problem. What does the command below give?
yum search maven
One option could be to run the command below and then re-run the service install:
curl -o /etc/yum.repos.d/epel-apache-maven.repo https://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo
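For reference, the full sequence would look something like this (a sketch assuming you are root, yum is the package manager, and the fedorapeople repo above is still reachable):

# see whether yum already knows about a maven package
yum search maven
# add the apache-maven repo referenced above
curl -o /etc/yum.repos.d/epel-apache-maven.repo https://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo
# install it, confirm it is on the PATH, then re-run the Zeppelin service install from Ambari
yum -y install apache-maven
mvn -version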
The error existed even before I installed maven.
Hi @abajwa-hw, I am getting the following error:
2015-09-02 12:10:28,345 - Error while executing command 'install':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/ZEPPELIN/package/scripts/master.py", line 8, in install
import params
File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/ZEPPELIN/package/scripts/params.py", line 58, in
I have set HIVE_CONF_DIR in Advanced zeppelin-env.
Could you please help me resolve this issue as soon as possible?
I had similar issues when trying to install on Ubuntu 14.04. The solution was to just comment out the maven installation command in script.py, restart ambari-server, and try again with the edited version of the service stack.
Guys, on Ubuntu 14 there are two issues that have to be fixed like this:
1) In the script /var/lib/ambari-server/resources/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/setup_snapshot.sh,
add the following line at the beginning:
export PATH=$PATH:/devtools/apache-maven-3.3.3/bin
(change the path to wherever your maven is installed)
2) Fix the .py files in the same folder - add these lines just after the import statements at the top (sys must already be imported there):
reload(sys)
sys.setdefaultencoding('utf8')
Then restart the installation. Everything will work.
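To make fix 1 concrete, here is roughly what the top of the edited setup_snapshot.sh ends up looking like (the shebang line and the maven path are just examples - keep whatever your script and maven install location actually use):

#!/bin/bash
# make the manually installed maven visible to this setup script
export PATH=$PATH:/devtools/apache-maven-3.3.3/bin
# ... rest of the original script unchanged ...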
@uprush kindly sent a PR for the Ubuntu fixes. Please let us know if this resolves the issue so we can close this out.
@abajwa-hw: We are using CentOS as the OS for our cluster. We are following your GitHub instructions for installing Zeppelin on an already configured cluster, and we got to the point where we included it in our Ambari stack. After that, we tried to install it, but we are getting the following stack trace in /var/log/zeppelin/zeppelin-setup.log:
Compiling Zeppelin view...
Initialized empty Git repository in /home/zeppelin/iframe-view/.git/
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Zeppelin View 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-clean-plugin/2.5/maven-clean-plugin-2.5.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.039 s
[INFO] Finished at: 2016-01-28T16:57:49-05:00
[INFO] Final Memory: 9M/102M
[INFO] ------------------------------------------------------------------------
[ERROR] Plugin org.apache.maven.plugins:maven-clean-plugin:2.5 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.apache.maven.plugins:maven-clean-plugin:jar:2.5: Could not transfer artifact org.apache.maven.plugins:maven-clean-plugin:pom:2.5 from/to central (https://repo.maven.apache.org/maven2): java.security.ProviderException: java.security.KeyException -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
And the Ambari stack trace is:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 295, in
0K .......... .......... .......... .......... .......... 11% 324K 1s
50K .......... .......... .......... .......... .......... 22% 651K 1s
100K .......... .......... .......... .......... .......... 34% 1.15M 1s
150K .......... .......... .......... .......... .......... 45% 1.36M 0s
200K .......... .......... .......... .......... .......... 57% 665K 0s
250K .......... .......... .......... .......... .......... 68% 1.17M 0s
300K .......... .......... .......... .......... .......... 79% 1.40M 0s
350K .......... .......... .......... .......... .......... 91% 71.9M 0s
400K .......... .......... .......... ....... 100% 78.8M=0.5s
2016-01-28 16:57:43 (946 KB/s) - ‘notebooks.zip’ saved [448310/448310]
We are working off this repo: https://github.com/hortonworks-gallery/ambari-zeppelin-service
Let us know if we are missing something.
Mangesh
@mangeshkaslikar are you still encountering problems? You can just set zeppelin.setup.view to skip the view install, which is the step where maven gets invoked.
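If you would rather do it from the command line than through the Ambari UI, Ambari's bundled configs.sh script can flip that property. A minimal sketch, with the cluster name as a placeholder and assuming the property lives in the zeppelin-ambari-config type and that false turns the view build off:

# set zeppelin.setup.view=false so the service install skips building the Zeppelin view
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin set localhost MyCluster zeppelin-ambari-config zeppelin.setup.view false
# then retry the Zeppelin service install from Ambari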
I still get a similar problem when using HDP 2.4 on Ubuntu 14 and trying to install it.
Any ideas what the problem is here?
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 36, in install
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/params.py", line 59, in <module>
    spark_jar = format("{spark_jar_dir}/zeppelin-spark-0.5.5-SNAPSHOT.jar")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 90, in format
    return ConfigurationFormatter().format(format_string, args, **result)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 54, in format
    result_protected = self.vformat(format_string, args, all_params)
  File "/usr/lib/python2.7/string.py", line 549, in vformat
    result = self._vformat(format_string, args, kwargs, used_args, 2)
  File "/usr/lib/python2.7/string.py", line 582, in _vformat
    result.append(self.format_field(obj, format_spec))
  File "/usr/lib/python2.7/string.py", line 599, in format_field
    return format(value, format_spec)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 81, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'zeppelin-ambari-config' was not found in configurations dictionary!
Scenario: installing Zeppelin on a node. I configured maven on the node hosting Zeppelin, but the installation fails. My maven configuration is below:
Apache Maven 3.2.2 (45f7c06d68e745d05611f7fd14efb6594181933e; 2014-06-17T09:51:42-04:00)
Maven home: /usr/bin/maven
Java version: 1.7.0_85, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
Error output (see below).

stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 221, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 77, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install apache-maven' returned 1. Error: Nothing to do

stdout:
2015-08-16 06:34:48,266 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-08-16 06:34:48,269 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jce_policy-8.zip')}
2015-08-16 06:34:48,269 - Not downloading the file from http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
2015-08-16 06:34:48,270 - Group['spark'] {'ignore_failures': False}
2015-08-16 06:34:48,271 - Group['zeppelin'] {'ignore_failures': False}
2015-08-16 06:34:48,271 - Group['hadoop'] {'ignore_failures': False}
2015-08-16 06:34:48,272 - Group['users'] {'ignore_failures': False}
2015-08-16 06:34:48,272 - Group['knox'] {'ignore_failures': False}
2015-08-16 06:34:48,273 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,274 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,275 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,276 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,277 - User['atlas'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,278 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,279 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,280 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,281 - User['zeppelin'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,282 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,283 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,284 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,285 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,286 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,287 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,288 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,289 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,290 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,291 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,292 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,293 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,294 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-08-16 06:34:48,296 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-08-16 06:34:48,312 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-08-16 06:34:48,313 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-08-16 06:34:48,318 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-08-16 06:34:48,320 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-08-16 06:34:48,335 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2015-08-16 06:34:48,336 - Group['hdfs'] {'ignore_failures': False}
2015-08-16 06:34:48,337 - User['hdfs'] {'ignore_failures': False, 'groups': ['hadoop', 'hdfs']}
2015-08-16 06:34:48,338 - Directory['/etc/hadoop'] {'mode': 0755}
2015-08-16 06:34:48,364 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-08-16 06:34:48,387 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.0.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-08-16 06:34:48,403 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2015-08-16 06:34:48,405 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-08-16 06:34:48,410 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2015-08-16 06:34:48,411 - Package['unzip'] {}
2015-08-16 06:34:48,599 - Skipping installation of existing package unzip
2015-08-16 06:34:48,599 - Package['curl'] {}
2015-08-16 06:34:48,658 - Skipping installation of existing package curl
2015-08-16 06:34:48,658 - Package['hdp-select'] {}
2015-08-16 06:34:48,716 - Skipping installation of existing package hdp-select
2015-08-16 06:34:48,717 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-08-16 06:34:48,718 - File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] {'content': DownloadSource('http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jdk-8u40-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'}
2015-08-16 06:34:48,733 - Skipping File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] due to not_if
2015-08-16 06:34:48,734 - Directory['/usr/jdk64'] {}
2015-08-16 06:34:48,735 - Execute['('chmod', 'a+x', '/usr/jdk64')'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java', 'sudo': True}
2015-08-16 06:34:48,749 - Skipping Execute['('chmod', 'a+x', '/usr/jdk64')'] due to not_if
2015-08-16 06:34:48,750 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java'}
2015-08-16 06:34:48,764 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] due to not_if
2015-08-16 06:34:48,765 - File['/usr/jdk64/jdk1.8.0_40/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2015-08-16 06:34:48,767 - Execute['('chgrp', '-R', 'hadoop', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-08-16 06:34:48,815 - Execute['('chown', '-R', 'root', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-08-16 06:34:49,216 - Execute['echo User selected spark_version:1.3'] {}
2015-08-16 06:34:49,231 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2015-08-16 06:34:49,250 - Execute['hadoop fs -mkdir -p /user/zeppelin'] {'ignore_failures': True, 'user': 'hdfs'}
2015-08-16 06:34:52,983 - Execute['hadoop fs -chown zeppelin /user/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:34:56,878 - Execute['hadoop fs -chgrp zeppelin /user/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:00,728 - Execute['hadoop fs -mkdir -p hdfs:///apps/zeppelin'] {'ignore_failures': True, 'user': 'hdfs'}
2015-08-16 06:35:04,464 - Execute['hadoop fs -chown zeppelin hdfs:///apps/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:08,144 - Execute['hadoop fs -chgrp zeppelin hdfs:///apps/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:11,941 - Directory['/var/run/zeppelin-notebook'] {'owner': 'zeppelin', 'group': 'zeppelin', 'recursive': True}
2015-08-16 06:35:11,943 - Directory['/var/log/zeppelin'] {'owner': 'zeppelin', 'group': 'zeppelin', 'recursive': True}
2015-08-16 06:35:11,945 - Execute['touch /var/log/zeppelin/zeppelin-setup.log'] {'user': 'zeppelin'}
2015-08-16 06:35:12,031 - Execute['rm -rf /opt/incubator-zeppelin'] {'ignore_failures': True}
2015-08-16 06:35:12,045 - Execute['mkdir /opt/incubator-zeppelin'] {}
2015-08-16 06:35:12,060 - Execute['chown -R zeppelin:zeppelin /opt/incubator-zeppelin'] {}
2015-08-16 06:35:12,074 - Execute['echo Processing with zeppelin tar compiled with spark 1.3'] {}
2015-08-16 06:35:12,086 - Execute['echo Installing packages'] {}
2015-08-16 06:35:12,100 - Package['git'] {}
2015-08-16 06:35:12,284 - Skipping installation of existing package git
2015-08-16 06:35:12,285 - Package['java-1.7.0-openjdk-devel'] {}
2015-08-16 06:35:12,345 - Skipping installation of existing package java-1.7.0-openjdk-devel
2015-08-16 06:35:12,346 - Package['apache-maven'] {}
2015-08-16 06:35:12,405 - Installing package apache-maven ('/usr/bin/yum -d 0 -e 0 -y install apache-maven')
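For what it's worth, yum returning 1 with "Error: Nothing to do" usually means that none of the enabled repos can provide the apache-maven package the script asks for, even though a manually installed maven is sitting under /usr/bin/maven. A quick way to confirm that on the node (and the reason the apache-maven repo from the earlier comment helps):

# is maven actually managed by rpm/yum, or was it installed by hand?
rpm -q apache-maven
# can any enabled repo provide the apache-maven package the install step needs?
yum list available apache-maven
yum repolist enabled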