Open unixph opened 8 years ago
Do you know which version of Hadoop you are running? Also, can you tell me whether your Linux is 64-bit or 32-bit?
Linux 2.6.32-504.el6.x86_64 (x86_64), and my Hadoop is hadoop-2.6.4.
sysctl.conf # Kernel sysctl configuration file for Red Hat Linux # # For binary values, 0 is disabled, 1 is enabled. See sysctl(8) and # sysctl.conf(5) for more details.
# Controls IP packet forwarding net.ipv4.ip_forward = 0
# Controls source route verification net.ipv4.conf.default.rp_filter = 1
# Do not accept source routing net.ipv4.conf.default.accept_source_route = 0
# Controls the System Request debugging functionality of the kernel kernel.sysrq = 0
# Controls whether core dumps will append the PID to the core filename. # Useful for debugging multi-threaded applications. kernel.core_uses_pid = 1
# Controls the use of TCP syncookies net.ipv4.tcp_syncookies = 1
# Disable netfilter on bridges. net.bridge.bridge-nf-call-ip6tables = 0 net.bridge.bridge-nf-call-iptables = 0 net.bridge.bridge-nf-call-arptables = 0
# Controls the maximum size of a message, in bytes kernel.msgmnb = 65536
# Controls the default maxmimum size of a mesage queue kernel.msgmax = 65536
# Controls the maximum shared segment size, in bytes kernel.shmmax = 68719476736
# Controls the maximum number of shared memory segments, in pages kernel.shmall = 4294967296
#Disable IPV6 net.ipv6.conf.all.disable_ipv6 = 1 net.ipv6.conf.default.disable_ipv6 = 1
#Disable system-wide core dumps fs.suid_dumpable = 0
#Turn off ICMP broadcasts net.ipv4.icmp_echo_ignore_broadcasts = 1
# Disable Secure ICMP Redirect Acceptance net.ipv4.conf.all.accept_redirects = 0 net.ipv4.conf.all.secure_redirects = 0 net.ipv4.conf.default.secure_redirects = 0
# Disable Send Packet Redirects net.ipv4.conf.all.send_redirects = 0 net.ipv4.conf.default.send_redirects = 0
# Disable Source Routed Packet Acceptance net.ipv4.conf.all.accept_source_route = 0 net.ipv4.conf.default.accept_source_route = 0
# Enable Bad Error Message Protection net.ipv4.icmp_ignore_bogus_error_responses = 1
#Turn off ICMP broadcasts net.ipv4.icmp_echo_ignore_broadcasts = 1
#Disable ICMP Redirect Acceptance net.ipv4.conf.all.accept_redirects = 0 # Disable Secure ICMP Redirect Acceptance net.ipv4.conf.all.secure_redirects = 0
# Disable Send Packet Redirects net.ipv4.conf.all.send_redirects = 0 net.ipv4.conf.default.send_redirects = 0
# Disable Source Routed Packet Acceptance net.ipv4.conf.all.accept_source_route = 0 net.ipv4.conf.default.accept_source_route = 0
# Enable Bad Error Message Protection net.ipv4.icmp_ignore_bogus_error_responses = 1 #Disable IPV6 net.ipv6.conf.all.disable_ipv6 = 1 net.ipv6.conf.default.disable_ipv6 = 1
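Side note: the `disable_ipv6 = 1` settings above (which appear twice in this file) are the usual cause of `socket: Address family not supported by protocol`. If that turns out to be the problem here, a minimal sketch of the lines to change (assuming you actually want IPv6 back on) would be:

```
# /etc/sysctl.conf (sketch): re-enable IPv6 by flipping the flags to 0
net.ipv6.conf.all.disable_ipv6 = 0
net.ipv6.conf.default.disable_ipv6 = 0
```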
/etc/hosts:

```
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
10.92.145.78   x01trcamapp1a

# Tripwire
10.91.74.24    w01sasccmapp1a

## DNS
10.80.114.8    s01tccdns1
10.81.112.8    s01accdns1

# CyberArk
10.197.243.113  w01gcybcpm1a
10.196.243.102  w01rcybcpm1a
```
Can you please help?
You want to use this document; I think it should solve the problem, because I think you are missing the proper native source files for Hadoop: http://www.ercoppa.org/Linux-Compile-Hadoop-220-fix-Unable-to-load-native-hadoop-library.htm. The source files you download should match your version of Hadoop, i.e. 2.6.4. Let me know.
Thanks. So it isn't the IPv6 module issue? The error I want to resolve is: `Address family not supported by protocol`.
Did you configure IPv6 in the config file? Check in server.conf; if you configure it there, I think it will resolve this issue.
I posted my sysctl.conf file above. I haven't changed anything in the config file. Should I add something there to resolve the issue? If I change something, though, the machine will need a reboot. Is there any other way to apply it? Many thanks.
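On the reboot question: sysctl values can be changed at runtime, so a reboot is not required. A sketch using the standard procps `sysctl` commands (`-w` and `-p` need root; reading does not):

```shell
# Read a current value without root (the /proc path mirrors the sysctl key):
cat /proc/sys/net/ipv4/ip_forward

# Flip one key immediately (root):
#   sysctl -w net.ipv6.conf.all.disable_ipv6=0
# Re-apply the whole edited file without rebooting (root):
#   sysctl -p /etc/sysctl.conf
```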
I followed these steps: http://tecadmin.net/setup-hadoop-2-4-single-node-cluster-on-linux/# but got stuck starting the Hadoop cluster with start-dfs.sh and hit the error.
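For what it's worth, start-dfs.sh logs into each listed node over ssh, so the tutorial's passwordless-ssh step has to work for localhost first; the `Connection refused` on port 22 in your output also suggests sshd may not be running. A sketch with the usual default paths (the key-generation commands are commented out so they are only run deliberately):

```shell
# Ensure the ssh directory exists with the permissions sshd requires:
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Generate a key once and authorize it for yourself (run manually):
#   ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
#   cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
#   chmod 600 ~/.ssh/authorized_keys
# Then `ssh localhost` should log in without a password prompt.
# If port 22 refuses connections, start the daemon (RHEL 6): service sshd start

ls -ld ~/.ssh | cut -c1-10   # shows drwx------
```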
Hi,
Please help me resolve this error. I am trying to install Hadoop on RHEL 6.2. Thanks.
```
$ start-dfs.sh
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
16/08/11 10:06:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
```
```
$ start-yarn.sh
starting yarn daemons
resourcemanager running as process 20041. Stop it first.
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
```
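One workaround worth trying for the `Address family not supported by protocol` lines: with IPv6 disabled in the kernel, ssh can still attempt an AF_INET6 socket, and the Hadoop start scripts pass `HADOOP_SSH_OPTS` from hadoop-env.sh to their ssh invocations, so you can force IPv4-only ssh. A sketch (the path assumes a standard 2.x layout):

```shell
# $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# Force the ssh invoked by start-dfs.sh / start-yarn.sh to IPv4 only
export HADOOP_SSH_OPTS="-o AddressFamily=inet"
```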