sequenceiq / docker-ambari

Docker image with Ambari

Hbase Client install failed with OSError: [Errno 20] Not a directory: '/etc/resolv.conf/hadoop' #50

Closed oguennec closed 9 years ago

oguennec commented 9 years ago

I have successfully set up a single-node cluster using sequenceiq/ambari v1.7.0 by following the instructions at https://registry.hub.docker.com/u/sequenceiq/ambari/.

But when adding the HBase service through the Ambari web console, the HBase Client install failed with the message:

OSError: [Errno 20] Not a directory: '/etc/resolv.conf/hadoop'

stderr:

```
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HBASE/package/scripts/hbase_client.py", line 43, in <module>
    HbaseClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HBASE/package/scripts/hbase_client.py", line 30, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HBASE/package/scripts/hbase_client.py", line 36, in configure
    hbase(name='client')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HBASE/package/scripts/hbase.py", line 37, in hbase
    recursive = True
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 148, in action_create
    os.makedirs(path, self.resource.mode or 0755)
  File "/usr/lib64/python2.6/os.py", line 150, in makedirs
    makedirs(head, mode)
  File "/usr/lib64/python2.6/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 20] Not a directory: '/etc/resolv.conf/hadoop'
```

stdout:

```
2015-05-15 11:55:56,285 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://amb0.mycorp.kom:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': …, 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-05-15 11:55:56,303 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://amb0.mycorp.kom:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2015-05-15 11:55:56,303 - Group['hadoop'] {'ignore_failures': False}
2015-05-15 11:55:56,305 - Modifying group hadoop
2015-05-15 11:55:56,775 - Group['users'] {'ignore_failures': False}
2015-05-15 11:55:56,776 - Modifying group users
2015-05-15 11:55:57,209 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-15 11:55:57,210 - Modifying user mapred
2015-05-15 11:55:57,231 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-15 11:55:57,231 - Modifying user hbase
2015-05-15 11:55:57,560 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-15 11:55:57,561 - Modifying user ambari-qa
2015-05-15 11:55:57,581 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-15 11:55:57,582 - Modifying user zookeeper
2015-05-15 11:55:57,602 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-15 11:55:57,603 - Modifying user hdfs
2015-05-15 11:55:57,623 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-15 11:55:57,624 - Modifying user yarn
2015-05-15 11:55:57,645 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-15 11:55:57,648 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-05-15 11:55:57,667 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2015-05-15 11:55:57,668 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-15 11:55:57,670 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/etc/resolv.conf/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2015-05-15 11:55:58,202 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-05-15 11:55:58,204 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-05-15 11:55:58,223 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-05-15 11:55:58,255 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(…), 'owner': 'hdfs'}
2015-05-15 11:55:58,279 - Repository['HDP-2.2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.2.4.2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-05-15 11:55:58,297 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-05-15 11:55:58,300 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-05-15 11:55:58,308 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-05-15 11:55:58,309 - Package['unzip'] {}
2015-05-15 11:55:59,018 - Skipping installing existent package unzip
2015-05-15 11:55:59,018 - Package['curl'] {}
2015-05-15 11:55:59,726 - Skipping installing existent package curl
2015-05-15 11:55:59,726 - Package['hdp-select'] {}
2015-05-15 11:56:00,435 - Skipping installing existent package hdp-select
2015-05-15 11:56:00,438 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://amb0.mycorp.kom:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] {'environment': …, 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-05-15 11:56:00,455 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://amb0.mycorp.kom:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] due to not_if
2015-05-15 11:56:00,457 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-05-15 11:56:00,474 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
2015-05-15 11:56:00,738 - Package['hbase_2_2'] {}
2015-05-15 11:56:01,447 - Skipping installing existent package hbase_2_2
2015-05-15 11:56:01,480 - Directory['/etc/hbase/conf'] {'owner': 'hbase', 'group': 'hadoop', 'recursive': True}
2015-05-15 11:56:01,482 - Changing owner for /etc/hbase/conf from 0 to hbase
2015-05-15 11:56:01,482 - Changing group for /etc/hbase/conf from 0 to hadoop
2015-05-15 11:56:01,483 - Directory['/etc/resolv.conf/hadoop/hbase'] {'owner': 'hbase', 'recursive': True}
2015-05-15 11:56:01,483 - Creating directory Directory['/etc/resolv.conf/hadoop/hbase']
```
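The failure at the end of the log is easy to reproduce outside Ambari: asking `os.makedirs` to use an existing regular file as a path component fails with errno 20 (`ENOTDIR`). A minimal sketch, using a temporary file in place of `/etc/resolv.conf`:

```python
import errno
import os
import tempfile

# Create a regular file standing in for /etc/resolv.conf, which Docker
# bind-mounts into the container as a *file*, not a directory.
fd, fake_resolv_conf = tempfile.mkstemp()
os.close(fd)

try:
    # This mirrors what the Ambari agent attempts for hbase.tmp.dir:
    # os.makedirs('/etc/resolv.conf/hadoop/hbase')
    os.makedirs(os.path.join(fake_resolv_conf, "hadoop", "hbase"))
except OSError as e:
    # errno 20 is ENOTDIR: a path component exists but is not a directory.
    assert e.errno == errno.ENOTDIR
    print("OSError: [Errno %d] %s" % (e.errno, os.strerror(e.errno)))
finally:
    os.remove(fake_resolv_conf)
```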

docker version

```
Client version: 1.5.0
Client API version: 1.17
Go version (client): go1.3.3
Git commit (client): a8a31ef/1.5.0
OS/Arch (client): linux/amd64
Server version: 1.5.0
Server API version: 1.17
Go version (server): go1.3.3
Git commit (server): a8a31ef/1.5.0
```

jake-low commented 9 years ago

This is related to AMBARI-8620. Basically, Ambari looks at all the mounted filesystems to find directories to recommend for various settings. Since Docker bind-mounts special files at /etc/resolv.conf, /etc/hostname, and /etc/hosts (so containers can modify them), Ambari mistakes those mount points for directories where it can put data.
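To illustrate the heuristic (a simplified sketch, not Ambari's actual code): the agent reads the mount table and treats each mount point as a candidate data root. Inside a Docker container, /etc/resolv.conf shows up in /proc/mounts even though it is a regular file, so a naive scan recommends it; filtering candidates with `os.path.isdir` would avoid that. The sample mount table and the `fake_dirs` stub below are made up for the demonstration:

```python
import os

# Abridged, hypothetical /proc/mounts content as seen inside a container:
# note the bind mounts for resolv.conf, hostname, and hosts.
PROC_MOUNTS = """\
rootfs / rootfs rw 0 0
proc /proc proc rw,nosuid,nodev,noexec,relatime 0 0
/dev/sda1 /etc/resolv.conf ext4 rw,relatime 0 0
/dev/sda1 /etc/hostname ext4 rw,relatime 0 0
/dev/sda1 /etc/hosts ext4 rw,relatime 0 0
"""

def candidate_mount_points(mounts_text, isdir=os.path.isdir):
    """Return only the mount points that are actually directories.

    A naive scan (no isdir check) also returns /etc/resolv.conf,
    /etc/hostname, and /etc/hosts, which Docker bind-mounts as files.
    """
    points = [line.split()[1] for line in mounts_text.splitlines() if line.split()]
    return [p for p in points if isdir(p)]

# Stub out isdir so the example is self-contained:
fake_dirs = {"/", "/proc"}
print(candidate_mount_points(PROC_MOUNTS, isdir=fake_dirs.__contains__))
# -> ['/', '/proc']
```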

The setting that's causing you trouble is hbase.tmp.dir, which has defaulted to /etc/resolv.conf/hadoop/hbase. You can fix it in the "Customize Services" setup step, under the "HBase" tab: search for "HBase local directory", and remove the /etc/resolv.conf prefix so that the value is just /hadoop/hbase.

Note that I had a similar problem with zk_local_dir for ZooKeeper and storm.local.dir for Storm; you may run into those too, and possibly others. The fix is the same in each case: find where /etc/resolv.conf is incorrectly being used as a directory prefix in the configuration and remove it.
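That search-and-strip can also be scripted. A small illustrative helper, operating on a plain dict of config values (the property names below are just examples from this thread, not a complete list):

```python
# Paths Docker bind-mounts into containers as regular files.
FILE_MOUNTS = ("/etc/resolv.conf", "/etc/hostname", "/etc/hosts")

def strip_file_mount_prefixes(config):
    """Return a copy of `config` with bogus file-mount prefixes removed.

    e.g. '/etc/resolv.conf/hadoop/hbase' -> '/hadoop/hbase'
    """
    fixed = {}
    for key, value in config.items():
        for prefix in FILE_MOUNTS:
            if isinstance(value, str) and value.startswith(prefix + "/"):
                value = value[len(prefix):]  # drop the bad prefix
                break
        fixed[key] = value
    return fixed

print(strip_file_mount_prefixes({
    "hbase.tmp.dir": "/etc/resolv.conf/hadoop/hbase",
    "storm.local.dir": "/etc/resolv.conf/hadoop/storm",
    "zk_local_dir": "/hadoop/zookeeper",  # already fine, left untouched
}))
```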

oguennec commented 9 years ago

Thanks. From what I can see, this issue will be fixed in Ambari 2.1.0.

I couldn't access that version at http://s3.amazonaws.com/dev.hortonworks.com/ambari/centos6/2.x/latest/2.1.0/, so I'm not sure whether it is public.

But I was able to reproduce the bug with Ambari 2.0.1 on Docker. After correcting all the faulty paths for HDFS, YARN, HBase, and ZooKeeper in the "Customize Services" setup step, I successfully set up HDP 2.2.4.2 with Ambari 2.0.1 on Docker 1.5.0.