RevolutionAnalytics / RHadoop
https://github.com/RevolutionAnalytics/RHadoop/wiki

Failure to Login #63

Open DataJunkie opened 12 years ago

DataJunkie commented 12 years ago

When trying to install the rhdfs package, I get an IOException from Java:

demos@workhorse:~$ /usr/bin/R CMD INSTALL rhdfs

Have you ever seen this? How can I fix it?

I am using R 2.14 and CDH3.

DataJunkie commented 12 years ago

Is there some way I can get the full stack trace? The exception appears to be thrown from inside the RJavaTools class in the JAR file; it looks like the invokeMethod function is throwing it, but I can't figure out how to resolve it.

RevolutionAnalytics commented 12 years ago

Hi Ryan,

Glad to hear from you again. I've not seen that error before, but I suspect it is a CLASSPATH issue.

Do you have HADOOP_HOME and HADOOP_CONF properly set? Are the requisite Hadoop jar files located under HADOOP_HOME?
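A quick way to sanity-check this from the shell that launches R — a sketch only, with illustrative paths (substitute the actual root of your Hadoop distribution):

```shell
# Illustrative values; point HADOOP_HOME at your real Hadoop root.
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CONF=$HADOOP_HOME/conf

# Report whether each variable is set and what it points to.
for v in HADOOP_HOME HADOOP_CONF; do
  eval "val=\$$v"
  if [ -z "$val" ]; then
    echo "$v is NOT set"
  else
    echo "$v=$val"
  fi
done
```

rhdfs reads these variables at load time, so they must be visible in the environment of the shell (or session) that starts R, not just in an interactive login shell.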

Thanks

David

On Wed, Mar 14, 2012 at 11:14 AM, Ryan Rosario <reply@reply.github.com> wrote:

When trying to install the rhdfs package, I get an IOException from Java:

demos@workhorse:~$ /usr/bin/R CMD INSTALL rhdfs

* installing to library '/home/demos/R/x86_64-pc-linux-gnu-library/2.14'
* installing *source* package 'rhdfs' ...
** R
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices ...
** testing if installed package can be loaded
Error : .onLoad failed in loadNamespace() for 'rhdfs', details:
  call: .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,
  error: java.io.IOException: failure to login
Error: loading failed
Execution halted
ERROR: loading failed
* removing '/home/demos/R/x86_64-pc-linux-gnu-library/2.14/rhdfs'

Have you ever seen this? How can I fix it?

I am using R 2.14 and CDH3.


Reply to this email directly or view it on GitHub: https://github.com/RevolutionAnalytics/RHadoop/issues/63

DataJunkie commented 12 years ago

Hi David :)

Unfortunately, all I've been able to find is that it seems to be a classloader issue in RJavaTools.

I have HADOOP_HOME pointing to the root directory of the Hadoop distribution (where the conf, logs, etc. directories are), as is standard operating procedure. I have HADOOP_CONF set to $HADOOP_HOME/conf. rhdfs seems to see that these are both set.

I have hadoop-core-0.20.2-cdh3u1.jar and several other jars in $HADOOP_HOME at the top level: $HADOOP_HOME/hadoop-core-0.20.2-cdh3u1.jar.

RevolutionAnalytics commented 12 years ago

Hi Ryan,

I'm assuming you already ran "R CMD javareconf -e"? I'm still looking at other causes...

David


RevolutionAnalytics commented 12 years ago

Hi Ryan,

You wouldn't by chance be running Hive on the same machine where you are installing rhdfs? I'm wondering what your classpath looks like and whether the Hive jar files are somehow conflicting with the Hadoop ones.
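One way to look for that kind of conflict from the shell — a sketch, with an illustrative default for HADOOP_HOME:

```shell
# Fall back to an illustrative path if HADOOP_HOME is unset.
export HADOOP_HOME=${HADOOP_HOME:-/usr/lib/hadoop}

echo "Hadoop jars under \$HADOOP_HOME:"
ls "$HADOOP_HOME"/*.jar 2>/dev/null || echo "(none found)"

echo "CLASSPATH entries mentioning hive:"
echo "${CLASSPATH:-}" | tr ':' '\n' | grep -i hive || echo "(none)"
```

If a Hive jar appears ahead of the Hadoop core jar on the classpath, it can shadow classes the Hadoop client expects, which would fit the symptoms described here.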

Thanks

David


DataJunkie commented 12 years ago

Thanks. Yes, I already did that. But this is interesting...

On my work laptop (Ubuntu 11.10) it works fine using the exact same Hadoop (I copy the Hadoop directory from machine to machine so the config, version, etc. is identical).

This is just a stab in the dark, but it might be an incompatibility with Ubuntu 10.04 (about to be phased out), or with the version of Java I have on it. I did not try the command with the -e switch, though.

DataJunkie commented 12 years ago

Not Hive, but HBase. Both machines have HBase though. When I get a chance I will check my version of the JRE on both machines. Very strange issue.

eric-kimbrel commented 12 years ago

Any further resolution on this? I have the same issue on Debian Linux. I do have Hive installed.

Kind of at a loss as to what to try next.

DataJunkie commented 12 years ago

Hi. I thought I responded on this thread with a solution, but I must not have.

IIRC, the issue was that the version of Java I was using was not correct: either it was the wrong architecture (32- vs. 64-bit), or it was OpenJDK when it must be Sun/Oracle.

The only other thing I can think of (if the Java version wasn't the solution) was the version of Hadoop was not compatible with the package.

I hope this helps, R.
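A quick shell sketch for checking the Java vendor and machine architecture mentioned above (purely diagnostic; it reports whatever Java is first on the PATH):

```shell
# rhdfs/rJava reportedly needed the Sun/Oracle JDK, with an
# architecture (32- vs 64-bit) matching the R build.
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1   # vendor/version string
else
  echo "java not found on PATH"
fi
uname -m                           # machine architecture, e.g. x86_64
```

Note that the Java first on the PATH is not necessarily the one R was configured against; R CMD javareconf (discussed below in the thread) shows the latter.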


DataJunkie commented 12 years ago

Try this; it is from the R admin manual. I remember doing something like this, but I don't remember the exact command:

Set JAVA_HOME to the Sun java appropriate for your system.

export JAVA_HOME=

R sets up the appropriate Java for use.

R CMD javareconf
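Put together, the two steps above look roughly like this — a sketch, with an illustrative Sun JDK path taken from later in this thread (use whatever JDK is actually installed on your machine):

```shell
# Point R at the Sun/Oracle JDK rather than OpenJDK.
export JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26

if command -v R >/dev/null 2>&1; then
  # -e prints the detected settings for the current shell;
  # drop it (and run with write access) to update R's config.
  R CMD javareconf -e || echo "javareconf failed; check JAVA_HOME"
else
  echo "R is not on PATH"
fi
echo "JAVA_HOME=$JAVA_HOME"
```

Because javareconf rewrites R's recorded Java settings, running it after changing JAVA_HOME is what gets rJava (and hence rhdfs) to pick up the new JDK.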

On Thu, Jun 28, 2012 at 3:44 PM, eric-kimbrel reply@reply.github.com wrote:

That puts me back on track to some degree... I see that when I run javareconf I have

Java home path   : /usr/lib/jvm/java-6-openjdk/jre

OpenJDK, which is apparently not good.

But I have JAVA_HOME set to /usr/lib/jvm/java-6-sun-1.6.0.26.

How do I get R to look at the Sun version instead of OpenJDK?


Reply to this email directly or view it on GitHub: https://github.com/RevolutionAnalytics/RHadoop/issues/63#issuecomment-6643954

RRR

jamiefolson commented 11 years ago

I think rJava and Hive have different requirements for JAVA_HOME. I ran into similar troubles.