Open prabhunkl opened 10 years ago
Hi.
RHive doesn't use the host argument as a full JDBC URL; it builds the URL itself from the host and other settings (hiveServer2, user, password). As a result, RHive currently does not support Kerberos.
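To illustrate the difference, here is a sketch of the two URL forms (the host, port, and realm below are placeholders, not values from a real cluster): RHive assembles a plain URL from its arguments, while a kerberized HiveServer2 needs the principal appended to the URL.

```shell
# Placeholder values; substitute your own cluster's host, port, and realm.
HOST="hiveserver.example.com"; PORT=10000; DB="default"
PRINCIPAL="hive/${HOST}@EXAMPLE.COM"

# The URL RHive assembles internally from host/port/db.
# With no principal, the SASL negotiation cannot use Kerberos:
PLAIN_URL="jdbc:hive2://${HOST}:${PORT}/${DB}"

# The URL a Kerberos-enabled HiveServer2 expects:
KRB_URL="${PLAIN_URL};principal=${PRINCIPAL}"

echo "${KRB_URL}"
```

Because RHive builds `PLAIN_URL` itself, there is no way to smuggle the `;principal=` part in through the host argument without it being mangled.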
We will fix it in the next version of RHive.
Thanks. If you run into any other problems, feel free to contact us.
On Tue, May 6, 2014 at 11:30 PM, prabhunkl notifications@github.com wrote:
Hi There,
Will RHive work with a HiveServer2 that has Kerberos security enabled?
When I try to connect to Hive, I get the following exception in my RStudio console:
rhive.connect(host="hostname.domain.com/default;principal=hive/hostname.domain.com@RELAM.COM", defaultFS="hdfs://namenode.domain.com:8020/user/me", hiveServer2=TRUE)
14/05/05 15:10:52 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Exception in thread "Thread-35" java.lang.IllegalArgumentException: Kerberos principal should have 3 parts: hive/hostname.domain.com@RELAM.COM:10000/default
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:64)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:198)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:138)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:123)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at com.nexr.rhive.hive.DatabaseConnection.connect(DatabaseConnection.java:51)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:330)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.run(HiveJdbcClient.java:322)
Error: java.lang.IllegalStateException: Not connected to hiveserver
Thanks, Prabhu.
— Reply to this email directly or view it on GitHub: https://github.com/nexr/RHive/issues/59
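The "3 parts" message in the stack trace above refers to the primary/instance@REALM form of a Kerberos service principal. Because rhive.connect() treated the whole string as a host name and appended the port and database to it, the realm part came out mangled. A rough illustration with plain string handling (nothing Hive-specific; the parse_principal helper is made up for this example):

```shell
# A Kerberos service principal has three parts: primary/instance@REALM.
parse_principal() {
  local p="$1"
  local primary="${p%%/*}"      # text before the first '/'
  local rest="${p#*/}"
  local instance="${rest%%@*}"  # text between '/' and '@'
  local realm="${rest#*@}"      # text after '@'
  echo "primary=${primary} instance=${instance} realm=${realm}"
}

# A well-formed principal:
parse_principal "hive/hostname.domain.com@REALM.COM"
# What the stack trace shows RHive produced (port and database appended),
# which leaves a realm that no KDC would accept:
parse_principal "hive/hostname.domain.com@REALM.COM:10000/default"
```

The second call yields `realm=REALM.COM:10000/default`, which is why the JDBC driver rejects the principal.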
Thanks for your response! I was able to connect to my Hadoop cluster, which has Kerberos enabled.
library(RHive)
rhive.init(hiveHome="/usr/lib/hive", hadoopHome="/usr/lib/hadoop", hadoopConf="/etc/hadoop/conf", verbose=TRUE)
rhive.connect()
However, I see that RHive is trying to WRITE something to the root of the Hadoop file system, and it fails because it does not have write permission there.
Any thoughts ?
> rhive.connect()
14/05/09 16:28:04 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/client/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/05/09 16:28:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/09 16:28:05 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
14/05/09 16:28:06 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Error: org.apache.hadoop.security.AccessControlException: Permission denied: user=prabhunkl, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:234)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:214)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:158)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5202)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5184)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5158)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2090)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2043)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNames
Hi.
The default HDFS directory for RHive is "/rhive". If the default directory doesn't exist, RHive creates it and continues to use it afterwards. However, if the "/rhive" directory doesn't have 755 or 777 permissions, an AccessControlException like yours can occur. In that case, please change the directory permission to 755 for single-user or 777 for multi-user access, and try again.
If you don't want to use the default directory, you can change it by setting an R environment variable as shown below. You must set it before loading the RHive library or executing rhive.init():
Sys.setenv(RHIVE_FS_HOME="/rhive")
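Alternatively, if the directory itself is the problem, a cluster admin could pre-create it with the needed permissions from the command line. This is a sketch assuming the standard hdfs CLI is available and the caller has the privileges to write at that path; the ensure_rhive_dir helper name is made up for this example:

```shell
# Pre-create an RHive working directory on HDFS and open it up
# for multi-user access (777), as suggested above.
ensure_rhive_dir() {
  local dir="$1"
  if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p "${dir}" && hdfs dfs -chmod 777 "${dir}"
  else
    # No Hadoop client on this machine; show what would be run.
    echo "hdfs CLI not found; would run: hdfs dfs -chmod 777 ${dir}"
  fi
}

ensure_rhive_dir "/rhive"   # or whatever RHIVE_FS_HOME points at
```

Use 755 instead of 777 if only one user will run RHive jobs.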
Thanks! If you run into any other problems, feel free to contact us.
Thanks ssshow16!
My code now looks as follows; I have updated the folder permission to 755.
library(RHive)
#Using user home directory
Sys.setenv(RHIVE_FS_HOME="/user/prabhunkl/rhive")
rhive.init(hiveHome="/usr/lib/hive", hadoopHome="/usr/lib/hadoop", hadoopConf="/etc/hadoop/conf", verbose=TRUE)
rhive.connect(host="168.69.200.212",port=10000, hiveServer2=TRUE)
And I am getting the following exception. My Hadoop file system is secured with Kerberos. Is it safe to assume the exception occurs because the Kerberos ticket is not being passed, or because RHive uses a PLAIN connection?
> rhive.connect(host="168.69.200.211",port=10000, hiveServer2=TRUE)
14/05/12 15:53:12 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Exception in thread "Thread-7" java.lang.RuntimeException: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.211:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:337)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.run(HiveJdbcClient.java:322)
Caused by: java.sql.SQLException: Could not open connection to jdbc:hive2://167.69.200.211:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:146)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:123)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at com.nexr.rhive.hive.DatabaseConnection.connect(DatabaseConnection.java:51)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:330)
... 1 more
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type PLAIN
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:190)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:144)
... 7 more
Error: java.lang.IllegalStateException: Not connected to hiveserver
>
Hi Prabhu!
I guess this is a Kerberos authentication problem. As you know, when Kerberos authentication is used, the JDBC URL must include a principal= setting.
Can you try it without Kerberos authentication on HDFS/HiveServer2, as a test?
Thanks. If you run into any other problems, feel free to contact us.
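Independently of RHive, it is also worth confirming that a valid Kerberos ticket exists on the client machine before testing any kerberized connection. A sketch, assuming the MIT Kerberos client tools (klist/kinit) are installed; the check_tgt helper and the `<user>@<REALM>` placeholder are made up for this example:

```shell
# Check whether the current user already holds a valid Kerberos TGT.
# `klist -s` is silent and exits 0 only if valid credentials exist.
check_tgt() {
  if command -v klist >/dev/null 2>&1 && klist -s; then
    echo "valid TGT found"
  else
    echo "no valid TGT; obtain one with: kinit <user>@<REALM>"
  fi
}

check_tgt
```

If there is no TGT, even a correctly formed `;principal=` JDBC URL will fail to authenticate.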
On Tue, May 13, 2014 at 12:58 AM, Prabhu notifications@github.com wrote:
Thanks ssshow16!
my code looks as follows now. I have updated the folder permission to 755
library(RHive)
# Using user home directory
Sys.setenv(RHIVE_FS_HOME="/user/prabhunkl/rhive")
rhive.init(hiveHome="/usr/lib/hive", hadoopHome="/usr/lib/hadoop", hadoopConf="/etc/hadoop/conf", verbose=TRUE)
rhive.connect(host="168.69.200.212")
And I am getting the following exception. My Hadoop file system is secured with Kerberos. Is it safe to assume the exception occurs because the Kerberos ticket is not being passed?
rhive.connect(host="168.69.200.212")
14/05/12 11:45:59 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Warning:
+----------------------------------------------------------+
+ hiveServer2 argument has not been provided correctly.     +
+ RHive will use a default value: hiveServer2=TRUE.         +
+----------------------------------------------------------+
Exception in thread "Thread-11" java.lang.RuntimeException: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.212:10000/default: java.net.ConnectException: Connection refused
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:337)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.run(HiveJdbcClient.java:322)
Caused by: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.212:10000/default: java.net.ConnectException: Connection refused
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:146)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:123)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at com.nexr.rhive.hive.DatabaseConnection.connect(DatabaseConnection.java:51)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:330)
... 1 more
Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:185)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:248)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:144)
... 7 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:529)
at org.apache.thrift.transport.TSocket.open(TSocket.java:180)
... 10 more
Error: java.lang.IllegalStateException: Not connected to hiveserver
Thanks, Bruce!
The earlier error was due to a wrong Hive server IP address. After correcting the IP address, I ran the script below:
library(RHive)
Sys.setenv(RHIVE_FS_HOME="/user/prabhunkl/rhive")
rhive.init(hiveHome="/usr/lib/hive", hadoopHome="/usr/lib/hadoop", hadoopConf="/etc/hadoop/conf", verbose=TRUE)
rhive.connect(host="168.69.200.211",port=10000, hiveServer2=TRUE)
And I am getting the following exception in RStudio. I understand that Hive throws this exception because it does not accept a PLAIN connection and requires a Kerberos ticket (TGT):
> rhive.connect(host="168.69.200.211",port=10000, hiveServer2=TRUE)
14/05/12 22:56:35 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/client/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/05/12 22:56:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/12 22:56:36 WARN hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Exception in thread "Thread-5" java.lang.RuntimeException: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.211:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:337)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.run(HiveJdbcClient.java:322)
Caused by: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.211:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:146)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:123)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at com.nexr.rhive.hive.DatabaseConnection.connect(DatabaseConnection.java:51)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:330)
... 1 more
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type PLAIN
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:190)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:144)
... 7 more
Error: java.lang.IllegalStateException: Not connected to hiveserver
>
I don't think I have the option of making a PLAIN connection. What I can do is try to implement Kerberos security in the RHive package. Before I make any actual code changes, I will try to understand the current code. I will get back to you if I have any questions.
Feel free to contact us any time.
Any progress on implementing Kerberos security for RHive?
I have made a Kerberos implementation but have yet to submit the code.
May I know what kind of Kerberos setup you have in your environment? This will help me determine whether my implementation will work for you.
Prabhunkl, can you submit this code when you have the time? I have a need for it as well.
Hi Prabhunkl,
Can you please share the code for RHive using Kerberos? I need it badly. Thanks in advance.
Hi,
Is there a timeline for when Kerberos support for RHive will be available? In which version of RHive should we expect it?
Hi ssshow16: I think I have a problem connecting R and Hive too; hiveserver2 does not start no matter what I try. I've posted my problem on Stack Overflow; please take a look:
http://stackoverflow.com/questions/30995208/rhive-hiveserver2-kerberos-keytab-and-principal
I think both the Kerberos keytab and the principal need to be adjusted.
I tried this on a non-kerberized cluster, and this is how I did it:
library(RHive)
rhive.init(hiveHome="/opt/hive", hiveLib="/opt/hive/lib:/rhive/lib/2.0-0.2", hadoopHome="/opt/hadoop", hadoopConf="/etc/hadoop", hadoopLib="/opt/hadoop/lib", verbose=FALSE)
rhive.connect("hiveserver-ranjana-ci-1940.test.altiscale.com", 10000)
# OR
rhive.connect("hiveserver-ranjana-ci-1940.test.altiscale.com", hiveServer2=TRUE, updateJar=FALSE, defaultFS="hdfs://nn-host-name:/user/root", user=NULL, password=NULL)
rhive.query("show databases")
rhive.query("use default")
rhive.query("show tables")
rhive.hdfs.ls(path="/user/root")
Hello,
Any update on the Kerberos implementation for RHive?
Regards,