nexr / RHive

RHive is an R extension facilitating distributed computing via Apache Hive.
http://nexr.github.io/RHive

rhive.write.table java.sql.SQLException: Error while processing statement: FAILED: #74

Closed: suolemen closed this issue 9 years ago

suolemen commented 9 years ago

When I execute the command:

rhive.write.table(liyg_a, "liyt_w3ww", sep=",", naString=NULL, rowName=FALSE, rowNameColumn="rowname")

I get an error:

java.sql.SQLException: Error while processing statement: FAILED: SemanticException Line 1:17 Invalid path ''/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c'': No files matching path hdfs://maste.com/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c

What is the problem? Can anyone help?

ssshow16 commented 9 years ago

I guess that RHive failed to upload the data to the HDFS path 'hdfs://maste.com/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c'.

Please check whether you can put a file into HDFS, either with the RHive function 'rhive.hdfs.put' or from the command line with 'hadoop dfs -put {file} hdfs://maste.com/rhive/data/root/{file}'.

If you cannot, please check your Hadoop configuration or installation!
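A minimal sketch of this check in R (the local and HDFS paths here are hypothetical placeholders, and it assumes rhive.connect() has already succeeded):

```r
# Hypothetical paths -- substitute your own file and HDFS directory.
rhive.hdfs.put("/tmp/probe.txt", "/rhive/data/root/probe.txt")  # upload a test file
rhive.hdfs.ls("/rhive/data/root")  # the uploaded file should appear in this listing
```

If the put or the listing fails here, the problem is in the Hadoop/HDFS setup rather than in rhive.write.table itself.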


suolemen commented 9 years ago

hadoop dfs -put /data/rdata.txt hdfs://maste.com/rhive/data/root/rdata.txt gives an error:

DEPRECATED: Use of this script to execute hdfs command is deprecated. Instead use the hdfs command for it.

Switching from hadoop to hdfs works:

hdfs dfs -put /data/rdata.txt hdfs://maste.com/rhive/data/root/rdata.txt

rhive.hdfs.put("/data/rdata.txt", "hdfs://maste.com/rhive/data/root/bbbccc2.txt") gives the error: java.lang.IllegalArgumentException: Wrong FS: hdfs://hdmaster.infobird.com/rhive/data/root/bbbccc2.txt, expected: file:///

Is this a version problem?

ssshow16 commented 9 years ago

Please let me know your RHive environment variables. You can check them with the RHive function "rhive.env()".


suolemen commented 9 years ago

rhive.env()
hadoop home: /usr/lib/hadoop
fs: file:///
hive home: /usr/lib/hive
user name: root
user home: /root

ssshow16 commented 9 years ago

Your defaultFS value is missing.

Try again by using the following function:

rhive.connect(defaultFS = "{your defaultFS}")

Please let me know the result.
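For example (the host and URI below are placeholders taken from this thread; the defaultFS value should match the fs.defaultFS property in your core-site.xml):

```r
# defaultFS must match fs.defaultFS from core-site.xml on the cluster.
rhive.connect("10.122.74.236", defaultFS = "hdfs://maste.com/")
rhive.env()  # the 'fs:' line should now show the hdfs:// URI instead of file:///
```

The "Wrong FS: ..., expected: file:///" error above is the symptom of this: with no defaultFS set, the client falls back to the local filesystem and rejects hdfs:// paths.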


suolemen commented 9 years ago

The Hive warehouse directory is /user/hive/warehouse (from hdfs dfs -ls /user/hive/warehouse). Which of these should I use?

rhive.connect("10.122.74.236", defaultFS="hdfs://maste.com/rhive/data/")
rhive.connect("10.122.74.236", defaultFS="hdfs://master.com/user/hive/warehouse")
rhive.connect("10.122.74.236", defaultFS="/rhive/data/")
rhive.connect("10.122.74.236", defaultFS="/user/hive/warehouse")
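For what it's worth, fs.defaultFS names a filesystem (scheme plus authority), not a data or warehouse directory inside it, so only the first form below has the right shape (hostnames as they appear in this thread):

```r
# A filesystem URI -- the right shape for defaultFS:
rhive.connect("10.122.74.236", defaultFS = "hdfs://maste.com/")

# These are directories inside HDFS, not filesystem URIs,
# so they are not valid defaultFS values:
#   defaultFS = "/rhive/data/"
#   defaultFS = "/user/hive/warehouse"
```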

suolemen commented 9 years ago

Hadoop core-site.xml:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://maste.com/</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/data/hadoop/tmp</value>
</property>

rhive.connect("10.122.74.236",defaultFS="hdfs://master.com/")

rhive.env()
hadoop home: /usr/lib/hadoop
fs: hdfs://master.com/
hive home: /usr/lib/hive
user name: root
user home: /root
temp dir: /tmp/root

suolemen commented 9 years ago

rhive.write.table() works now. Thank you!